Tuesday, August 31, 2010

When you feel plainly dumb...

There are times when you work on a problem and nothing seems to help. You have checked everything, yet nothing works. And then you see the obvious answer. I had such a problem today. Until now, Laterna Magica only allowed you to pay costs in advance: first adding all the mana to the pool, then activating an ability or (more likely in LM) casting a spell.

The rules, however, allow you to play mana abilities while casting spells or activating abilities. So I wrote the code: a piece in the cost to ask the player for input, a piece in the player to handle the GUI. And it didn't work. Clicking on a card was simply not recognized.

I first thought it could be that my legality-checking code was not working while a spell is being cast, but I couldn't find any line in the code that even bothered about that. Next, I suspected that the asynchronous communication between game and GUI didn't work, and enabled a few debug statements, which in turn made clear that everything went as desired. So I thought to myself: where could I easily check whether the action was successfully processed? Well, in the cost, where the player was asked in the first place.

But I never got to writing a debug statement; the bug was plain and simple: I retrieved the activate action, but I never executed it. These are the times when you feel plainly dumb...

Saturday, August 28, 2010

New Downloader

Besides working on combat, I played a little more with Jetlang and created a downloader, which works asynchronously and can inform interested listeners via PropertyChangeEvents.
Jetlang's concurrency approach is centered around actors (who performs a task, and where it is performed) and the tasks themselves, instead of around the data. Normal multithreading requires you to synchronize on the data you share, while an actor framework doesn't share data but passes it around, so that only one thread "has" it at a time.
This makes it easier to code, because it is how humans think: You don't worry that two people will write on the same piece of paper at the same time; you wait until the piece of paper is passed to you. That said, it still needs a lot of thinking: Who are the people in your program, what's the piece of paper, and when should you pass it around?
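
Here is a tiny, self-contained sketch of that "piece of paper" idea using Jetlang's channels - the same MemoryChannel, PoolFiberFactory and Callback classes that show up in the code further down this page. It is not Laterna Magica code, and the import paths are written from memory, so treat them as approximate:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.jetlang.channels.Channel;
import org.jetlang.channels.MemoryChannel;
import org.jetlang.core.Callback;
import org.jetlang.fibers.Fiber;
import org.jetlang.fibers.PoolFiberFactory;

public class PaperExample {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService service = Executors.newCachedThreadPool();
        Fiber receiver = new PoolFiberFactory(service).create();
        receiver.start();

        // the "piece of paper": it is handed over, never shared
        Channel<String> paper = new MemoryChannel<String>();

        // the receiver "has" the paper only while onMessage runs on its fiber
        paper.subscribe(receiver, new Callback<String>() {
            public void onMessage(String note) {
                System.out.println("received: " + note);
            }
        });

        // some other thread passes the paper along; no locks on shared data needed
        paper.publish("download finished");

        Thread.sleep(500); // give the callback a moment to run before shutting down
        service.shutdown();
    }
}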

Btw, I made a little comparison between the new downloader and the one used by Forge (in which, admittedly, I also had a great part).

Program           After 2 minutes  For 1500 images
Laterna Magica    1500 images       2:00 minutes
Forge              200 images      10:00 minutes

I looked at the code for Forge and checked whether it is roughly comparable to mine, and it was; otherwise I wouldn't have posted this. I changed two things, however: I increased the buffer size to 8 kB to match Laterna Magica, and I removed a print statement; print statements are pretty slow and shouldn't be there when measuring time.

Laterna Magica's downloader uses 10 threads, which means that 10 pictures are downloaded at the same time. Even though it might seem at first that this shouldn't make much of a difference, because the total bandwidth stays the same, the numbers clearly show that the download speed is five times higher. The reason is reduced overhead. Every picture needs the following steps:
  1. Connect to the server
  2. Request the image
  3. Receive the data, store it into a file
The time is lost between steps 2 and 3. The first two steps have you upload data (the request); the third has you download data. In between, there is only waiting. When using a single thread, this waiting directly reduces the download speed, because nothing else happens during that time. With multiple threads, waiting only blocks one of them, leaving nine downloads running in the meantime.
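
The core of such a downloader fits in a few lines. This is not Laterna Magica's actual downloader, just a minimal sketch of the idea: a fixed pool of ten worker threads, each copying one URL to a file.

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SimpleParallelDownloader {
    // ten downloads run at once, so the waiting between request and response
    // only ever blocks one of the worker threads
    private final ExecutorService pool = Executors.newFixedThreadPool(10);

    public void download(final URL url, final File target) {
        pool.submit(new Runnable() {
            public void run() {
                try {
                    InputStream in = url.openStream();
                    OutputStream out = new FileOutputStream(target);
                    try {
                        byte[] buffer = new byte[8 * 1024]; // 8 kB, as in the comparison above
                        int read;
                        while((read = in.read(buffer)) != -1)
                            out.write(buffer, 0, read);
                    } finally {
                        in.close();
                        out.close();
                    }
                } catch(IOException ex) {
                    ex.printStackTrace();
                }
            }
        });
    }
}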

Besides downloading card pictures, I adapted Laterna Editor to use the new downloader, which also showed its flexibility: Laterna Editor downloads HTML pages from Gatherer, but doesn't save them to files. Instead, it processes them further into compact card descriptions for Laterna Magica. While it took a little work, the downloader is not limited to files; it can download to any destination, such as a string buffer in memory.

The card downloader from Laterna Editor also shows how the task-centric approach helps with programming. This picture pretty much describes how the download process works:


Both Source and Destination are classes implemented by CardDownloader, but they implement interfaces specified by Downloader. The source checks whether the URL is a search or a single card, and also handles the automatic redirect if Gatherer finds only a single card for a search. The destination saves the HTML page to a string and waits for the download to finish by registering a property change listener on the download job. If the download was successful, it sends the page to the search or card processor, depending on the source. If it wasn't, it just discards the downloaded page.
The search processor finds additional result pages (every page holds at most 100 cards in the compact view, which is what the downloader uses) and, of course, the individual cards. Downloading those is again done by the same process: the search processor simply posts a DownloadJob to the downloader, and that's it.
The card processor finds the individual fields like name and mana cost and stores them in a card. Once that's done, it posts the card to a Channel where the GUI can find it and in turn show the editor.
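
To make this a bit more concrete, here is a rough sketch of a destination like the one described above. The names (the "state" property, the FINISHED value, the process() step) are made up for illustration; they are not the actual classes in CardDownloader or Downloader.

import java.beans.PropertyChangeEvent;
import java.beans.PropertyChangeListener;

// Hypothetical sketch: collects the page in memory and reacts once the download
// job reports (via a PropertyChangeEvent) that it has finished.
public class PageDestination implements PropertyChangeListener {
    private final StringBuilder page = new StringBuilder();

    // the download writes its chunks here instead of into a file
    public void append(String chunk) {
        page.append(chunk);
    }

    public void propertyChange(PropertyChangeEvent evt) {
        if(!"state".equals(evt.getPropertyName())) return;

        if("FINISHED".equals(String.valueOf(evt.getNewValue()))) {
            process(page.toString()); // hand the page to the search or card processor
        }
        // any other terminal state: simply discard the page
    }

    private void process(String html) {
        // parse the Gatherer page and post follow-up DownloadJobs or a finished card
    }
}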

Monday, August 23, 2010

Combat and concurrency

It was July 15th when I titled a blog post "Combat in sight?", but now it really is. Today, I committed a version of my project where players can attack and thus deal damage. Blocking is not possible yet, but that's more a matter of work than of thinking. Players still can't lose the game, but that's another story...

Now, for those of you not familiar with "concurrency" in programming, it means that multiple parts of a program run at the same time, "competing for the computer's resources" like CPU or RAM. Concurrency is hard for three reasons:
  • you have to control what happens when, because the classical "everything happens in a fixed sequence" no longer holds
  • speaking of "competing for resources", when two concurrent threads of execution access the same data, they can overwrite each other's changes, messing things up
  • for this reason, locks and synchronization were invented, which grant exclusive access to a resource to a single thread. This also means that threads have to wait for a resource until it isn't locked any more. In the worst case, two threads wait for each other forever - a deadlock; see the sketch right after this list.
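
Here is a minimal, self-contained example of that worst case - two threads that lock each other out forever. It has nothing to do with Laterna Magica; it just shows the trap:

// Each thread takes one lock and then waits forever for the other thread's lock.
public class DeadlockDemo {
    public static void main(String[] args) {
        final Object lockA = new Object();
        final Object lockB = new Object();

        new Thread(new Runnable() {
            public void run() {
                synchronized(lockA) {
                    pause();                // give the other thread time to grab lockB
                    synchronized(lockB) { } // never acquired: the other thread holds it
                }
            }
        }).start();

        new Thread(new Runnable() {
            public void run() {
                synchronized(lockB) {
                    pause();
                    synchronized(lockA) { } // never acquired: the first thread holds it
                }
            }
        }).start();
    }

    private static void pause() {
        try {
            Thread.sleep(100);
        } catch(InterruptedException ex) {
            Thread.currentThread().interrupt();
        }
    }
}
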
Laterna Magica uses an approach with two main threads: one for the game and one for the user interface. While the UI thread is mandated by Java, I could have implemented LM to run everything on that same thread. The main reason I did not was to keep LM independent of the GUI, meaning that the program can use different UIs and even run without a user interface, say in a computer-vs-computer simulation or on a server (the latter needs more preparation, but I believe it will be possible someday).

So there are two threads, which means that every GUI interaction requires synchronization. The first GUI interaction implemented was playing cards and activating abilities. The code to enable that was pretty wordy and hardly extensible:


private boolean    ready;
private PlayAction a;

public GuiActor(Player player) {
    super(player);
} 
 
public synchronized void putAction(PlayAction a) {
    ready = true;
    this.a = a;
    notifyAll();
}

public synchronized PlayAction getAction() {
    ready = false;
    while(!ready)
        try {
            wait();
        } catch(InterruptedException ex) {
            ex.printStackTrace();
        }
    return a;
}

On the surface, it doesn't look that bad, but it has several issues:
  • It is low-level: it handles all the synchronization itself, as seen by the boolean ready, notifyAll(), synchronized, while(!ready)... All of this has to be repeated for every sort of user input: declaring attackers, blockers, ...
  • It moves control away from the actor: putAction takes a PlayAction, which is the result of, e.g., clicking on a card. How that click is associated with the PlayAction is not controlled by the actor, which breaks atomicity
    • Translating a click into a PlayAction is specific to playing spells and not compatible with, e.g., declaring attackers. This means that the translation, which is not atomic, has to be implemented for every type of input
  • Missing encapsulation: interpreting a click as a PlayAction is only valid while the player has priority, and this fact is not represented here: there is nothing that says whether the actor is currently waiting for a PlayAction or for an attacker declaration
In contrast, here is the new variant:

public class GuiMagicActor extends AbstractMagicActor {
    public final GuiChannels    channels = new GuiChannels();
   

    private static <T> T getValue(Fiber f, Channel<T> ch, GuiActor a) {
        a.start();
        log.debug("Waiting for result...");
        T result = Parallel.getValue(f, ch);
        log.debug("received!");
        a.dispose();
        return result;
    }
   
    public PlayAction getAction() {
        return getValue(channels.fiber, channels.actions, new ActionActor(this));
    }

    
    ...
}

This new version is based on the Jetlang concurrency framework, which works with channels and callbacks: messages published to a channel are - asynchronously - forwarded to registered callbacks, which may in turn publish new messages. The Parallel.getValue() method is used to get a value back from a channel synchronously, and the ActionActor encapsulates the state, in this case choosing a PlayAction to perform:

public class GuiChannels {
    /**
     * Channel for receiving {@link PlayAction}s to execute when the player has priority
     */
    public final Channel<PlayAction>  actions      = new MemoryChannel<PlayAction>();
   
    /**
     * Channel for publishing {@link MagicObject}s when the user clicks on them
     */
    public final Channel<MagicObject> objects      = new MemoryChannel<MagicObject>();
   
    /**
     * Channel for publishing {@link Player}s when the user clicks on them
     */
    public final Channel<Player>      players      = new MemoryChannel<Player>();
   
    /**
     * Channel for publishing when the user clicks "pass priority"
     */
    public final Channel<Void>        passPriority = new MemoryChannel<Void>();
   
    public final Fiber                fiber;
   
    public GuiChannels() {
        PoolFiberFactory f = new PoolFiberFactory(Executors.newCachedThreadPool());
        fiber = start(f.create());
    }
   
    private Fiber start(Fiber f) {
        f.start();
        return f;
    }
}


public class ActionActor extends GuiActor {
    public ActionActor(GuiMagicActor actor) {
        super(actor);
    }
   
    @Override
    public void start() {
        disposables.add(actor.channels.objects.subscribe(actor.channels.fiber, new CardCallback()));
        disposables.add(actor.channels.passPriority.subscribe(actor.channels.fiber, new PassPriorityCallback()));
    }
   
    private class CardCallback implements Callback<MagicObject> {
        @Override
        public void onMessage(MagicObject c) {
            log.debug("Received: " + c);
            PlayAction a = GuiUtil.getActionOptional(actor.getPlayer(), c);
            if(a != null) actor.channels.actions.publish(a);
        }
    }
   
    private class PassPriorityCallback implements Callback<Void> {
        @Override
        public void onMessage(Void v) {
            log.debug("Received pass priority");
            actor.channels.actions.publish(null);
        }
    }
}
 

The ActionActor receives messages through the objects and passPriority channels and reacts according to its task.
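
In case you wonder what Parallel.getValue() does: it's a small helper class of mine, and I won't paste the real source here, but the idea could look roughly like this - a hedged sketch, not the actual implementation. It subscribes to the channel, blocks the calling (game) thread until the first message arrives, and then unsubscribes:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

import org.jetlang.channels.Channel;
import org.jetlang.core.Callback;
import org.jetlang.core.Disposable;
import org.jetlang.fibers.Fiber;

public final class Parallel {
    public static <T> T getValue(Fiber fiber, Channel<T> channel) {
        final BlockingQueue<T> result = new ArrayBlockingQueue<T>(1);

        // the callback runs on the GUI-side fiber and hands the message over
        Disposable subscription = channel.subscribe(fiber, new Callback<T>() {
            public void onMessage(T message) {
                result.offer(message); // only the first message is kept
            }
        });

        try {
            return result.take(); // blocks the calling (game) thread
        } catch(InterruptedException ex) {
            throw new RuntimeException(ex);
        } finally {
            subscription.dispose();
        }
    }
}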

Hope you liked it!

    Saturday, August 21, 2010

    Scala: Functional programming & DSL

    I first learned about Scala in forge's comment to this post, and the tutorial he mentioned is really great. Just to experiment, I extended the Calculator example with variables, functions and exponentiation. The result is really extensible and took just 140 lines of code, which really impressed me. So I'm showing you the result:

    AST.scala (Scala doesn't require file names to match class names, and AST (abstract syntax tree) fits the classes in this file)
    package net.slightlymagic.calc {
      abstract class Expr
      case class Variable(name: String) extends Expr {
        override def toString = name
      }
      case class Number(value: Double) extends Expr {
        override def toString = "" + value
      }
      case class UnaryOp(name: String, arg: Expr) extends Expr {
        override def toString = name + arg
      }
      case class BinaryOp(name: String, left: Expr, right: Expr) extends Expr {
        override def toString = "(" + left + name + right + ")"
      }
      case class Function(name: String, args: List[Expr]) extends Expr {
        override def toString = name + args.mkString("(", ", ", ")")
      }
    }


    Calc.scala

    package net.slightlymagic.calc {
      object Calc {
        def simplify(e: Expr): Expr = {
          def simpArgs(e: Expr) = e match {
            case BinaryOp(op, left, right) => BinaryOp(op, simplify(left), simplify(right))
            case UnaryOp(op, operand) => UnaryOp(op, simplify(operand))
            case Function(op, operands) => Function(op, operands.map((x) => simplify(x)))
            case x => x
          }
         
          def simpTop(e: Expr) = e match {
            case UnaryOp("-", UnaryOp("-", x)) => x
            case BinaryOp("-", x, Number(0)) => x
            case BinaryOp("-", Number(0), x) => UnaryOp("-", x)
            case BinaryOp("-", x1, x2) if x1 == x2 => Number(0)
           
            case UnaryOp("+", x) => x
            case BinaryOp("+", x, Number(0)) => x
            case BinaryOp("+", Number(0), x) => x
           
            case BinaryOp("*", x, Number(1)) => x
            case BinaryOp("*", Number(1), x) => x
           
            case BinaryOp("*", x, Number(0)) => Number(0)
            case BinaryOp("*", Number(0), x) => Number(0)
           
            case BinaryOp("/", x, Number(1)) => x
            case BinaryOp("/", x1, x2) if x1 == x2 => Number(1)
           
            case BinaryOp("^", x, Number(1)) => x
            case BinaryOp("^", x, Number(0)) if x != Number(0) => Number(1)
            case BinaryOp("^", x1, UnaryOp("-", x2)) => BinaryOp("/", Number(1), BinaryOp("^", x1, x2))
            case e => e
          }
         
          simpTop(simpArgs(e))
        }
       
        def evaluate(expr: Expr, variables: Map[String, Double], functions: Map[String, (List[Double]) => Double]): Double = {
          expr match {
            case Number(x) => x
            case UnaryOp("-", x) => -evaluate(x, variables, functions)
            case BinaryOp("+", x1, x2) => evaluate(x1, variables, functions) + evaluate(x2, variables, functions)
            case BinaryOp("-", x1, x2) => evaluate(x1, variables, functions) - evaluate(x2, variables, functions)
            case BinaryOp("*", x1, x2) => evaluate(x1, variables, functions) * evaluate(x2, variables, functions)
            case BinaryOp("/", x1, x2) => evaluate(x1, variables, functions) / evaluate(x2, variables, functions)
            case BinaryOp("^", x1, x2) => Math.pow(evaluate(x1, variables, functions), evaluate(x2, variables, functions))
            case Variable(x) => variables(x)
            case Function(x, args) => functions(x)(args.map((x) => evaluate(x, variables, functions)))
          }
        }
       
        def parse(text : String) = CalcParser.parse(text).get
       
        val standardVars =
          Map(
            "E"  -> Math.E,
            "Pi" -> Math.Pi
          )
       
        val standardFuns =
          Map(
            "sin"  -> {x:List[Double] => Math.sin(x(0))},
            "cos"  -> {x:List[Double] => Math.cos(x(0))},
            "tan"  -> {x:List[Double] => Math.tan(x(0))},
            "cot"  -> {x:List[Double] => 1/Math.tan(x(0))},
            "asin" -> {x:List[Double] => Math.asin(x(0))},
            "acos" -> {x:List[Double] => Math.acos(x(0))},
            "atan" -> {x:List[Double] => Math.atan(x(0))},
            "acot" -> {x:List[Double] => Math.atan(1/x(0))},
            "exp"  -> {x:List[Double] => Math.exp(x(0))},
            "log"  -> {x:List[Double] => Math.log(x(0))},
            "min"  -> {x:List[Double] => Math.min(x(0), x(1))},
            "max"  -> {x:List[Double] => Math.max(x(0), x(1))}
          )
       
        def evaluate(expr: String, variables: Map[String, Double], functions: Map[String, (List[Double]) => Double]): Double = {
          evaluate(
            simplify(parse(expr)),
            if(variables == null) standardVars else standardVars ++ variables,
            if(functions == null) standardFuns else standardFuns ++ functions
          )
        }
       
       
        import scala.util.parsing.combinator._
       
        object CalcParser extends JavaTokenParsers {
         
          def expr: Parser[Expr] =
            (term ~ "+" ~ term) ^^ { case lhs~plus~rhs => BinaryOp("+", lhs, rhs) } |
            (term ~ "-" ~ term) ^^ { case lhs~minus~rhs => BinaryOp("-", lhs, rhs) } |
            term
     
          def term: Parser[Expr] =
            (factor ~ "*" ~ factor) ^^ { case lhs~times~rhs => BinaryOp("*", lhs, rhs) } |
            (factor ~ "/" ~ factor) ^^ { case lhs~div~rhs => BinaryOp("/", lhs, rhs) } |
            factor
     
          def factor : Parser[Expr] =
            (exp ~ "^" ~ exp) ^^ { case lhs~exp~rhs => BinaryOp("^", lhs, rhs) } |
            exp
     
          def exp : Parser[Expr] =
            ("+" ~ expr) ^^ { case plus~rhs => UnaryOp("+", rhs) } |
            ("-" ~ expr) ^^ { case minus~rhs => UnaryOp("-", rhs) } |
            "(" ~> expr <~ ")" |
            ident ~ params ^^ {case name~params => Function(name, params)} |
            ident ^^ {case name => Variable(name)} |
            floatingPointNumber ^^ {x => Number(x.toFloat) }
         
          def params : Parser[List[Expr]] =
            "(" ~> repsep(expr, ",") <~ ")"
         
          def parse(text : String) = parseAll(expr, text)
        }
      }
    }


    Wow! I don't want to repeat the tutorial, which is really great, but I want to mention a few features that really improved the way this code is written.

    First of all, Scala is a mixture of functional and object-oriented: Map[String, (List[Double]) => Double] (Scala uses square brackets for generic types) means a map from Strings to functions that take a list of Doubles as a parameter and yield a Double as the result. standardFuns lives in the Calc object... yes, object. This basically means that all of its members are static; conversely, normal classes can't have static members, which doesn't really matter because you can have a class and an object of the same name. To get the same result as standardFuns in Java, you'd have to define a common interface and several anonymous classes, which means a lot of duplication. Using anonymous functions, and the fact that a function can be passed as a parameter - something you don't get from non-functional languages - you avoid most of the redundant code: no interface definition, just a single line per anonymous function.
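
    Just to illustrate that duplication, here is roughly what standardFuns would look like in Java. The DoubleFunction interface is something you'd have to invent yourself; Scala's function type makes it unnecessary:

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class StandardFunctions {
        // the interface Scala gives you for free as (List[Double]) => Double
        public interface DoubleFunction {
            double apply(List<Double> args);
        }

        public static final Map<String, DoubleFunction> STANDARD_FUNS = new HashMap<String, DoubleFunction>();
        static {
            STANDARD_FUNS.put("sin", new DoubleFunction() {
                public double apply(List<Double> args) { return Math.sin(args.get(0)); }
            });
            STANDARD_FUNS.put("max", new DoubleFunction() {
                public double apply(List<Double> args) { return Math.max(args.get(0), args.get(1)); }
            });
            // ...and so on - one anonymous class per entry
        }
    }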

    The second thing is that you can write method calls in an "unusual" way: a.plus(b) is the same as a plus b. Together with the fact that a method can be named, e.g., "+", you basically have operator overloading. This means nothing more than being able to write short, concise code - but that is very important; just look at the preceding paragraph. Scala's parser package uses such "operators" to write short, expressive text-parsing code.

    Lastly, pattern matching. This is a great feature that is really hard to imitate using Java. It uses case classes, as seen in AST.scala, and the match/case construct. For example:

    expr match {
      case BinaryOp("+", Number(x1), Number(x2)) => doSomethingElse()
      case BinaryOp(op, x1, x2) => doSomething()
    }

    doSomething() is only executed if its case matches, and at the same time the three variables op, x1 and x2 are bound. You can build more complex constructs, as the first case shows, including nested patterns and constant expressions to narrow down the matching cases. This works as long as you use case classes.

    And now comes another important part: Scala compiles to Java byte code, which means that you can write just one part of a project in Scala. For example, the simple calculator DSL was way easier to write in Scala than it would have been in Java. Once it's packed as a jar, you hardly notice that the code was not written in traditional Java, and it integrates nicely with the other parts. (Except that you'd have a hard time calling "operator methods" or using an API that takes functions as parameters - keep that in mind for any public API that should be accessible from plain old Java.)
    The only downside is that Eclipse integration is not great so far. Maven does a good job of compiling the Scala code, but the Scala plugin is not very stable, so I use the plain text editor. That works for small things like the calculator, but I miss obvious features like import organizing, content assist, automatic building on every change, syntax & error highlighting, parentheses matching, folding...

    Thursday, August 19, 2010

    Performance of Software

    I was reading a little about Scala these days and think it's a cool language that would probably make my life a lot easier. For example, the CardCompiler is hard to use; you have to jump through a few hoops to implement new effect parsers. I think I am on the right track, but there is much room for improvement in terms of usability.

    But today I don't want to talk about Scala, but about software performance. One point about Scala is that it largely uses immutable objects, favoring copying over mutating existing objects. The first thought when you hear this is probably how much memory that wastes, and that it also needs more processing power because all the data has to be copied.

    However, performance doesn't mean using as few resources as possible. It means making the best possible use of the resources you have. And nowadays, the resource we have but use the least is multiple processors.

    Modern processors work at the edge of what is physically possible. Circuits are only a few atoms thick, and currents are so small that you can almost count the electrons flowing. At this scale, quantum effects matter, and that means randomness. That's roughly why (semiconductor) processors don't get faster any more, and speed increases now come mostly from parallelism.

    Writing software that works in parallel, however, is hard, because multiple threads can access the same resource at the same time and thus wreak havoc. Which leads back to immutable objects: data corruption can only occur when writing. When every access is read-only or creates a copy, nothing bad can happen.

    This leaves one question: how bad is copying really? Well, every copy you no longer need can be garbage collected, so memory should not really be a concern. Memory size is also not as restricted as processor speed, so there should be enough headroom to support memory-heavy rather than processor-heavy software.
    Garbage collection itself, however, uses the processor. It finds objects that are no longer accessed by any thread; that means it is not constrained to the program's normal flow and can work in its own thread. And since the garbage collector is not really interested in the objects themselves, its work can even be spread over multiple threads if necessary.

    Copying itself uses processing power and can take some time for large objects. Usually, the only kind of large object is an array. There are not many possibilities for what a single field can hold: boolean, byte, short, char, int, float, long, double, or an object reference - all of these take at most 8 bytes to store. What remains are arrays, whose lengths can go into the millions. If we stick with immutable objects, this is really the only bottleneck we have to deal with, because you don't need to copy the array's element objects - being immutable, they can't change.
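
    To make that concrete, here is a tiny sketch of the copy-instead-of-mutate style in Java (not a class from Laterna Magica). The only thing ever copied is the array itself; the element objects are shared between the old and the new list, because they are immutable and can't change behind our back:

    import java.util.Arrays;

    // every "modification" returns a new list, so concurrent readers never see a half-updated state
    public final class ImmutableList<T> {
        private final Object[] elements;

        public ImmutableList() {
            this(new Object[0]);
        }

        private ImmutableList(Object[] elements) {
            this.elements = elements;
        }

        @SuppressWarnings("unchecked")
        public T get(int index) {
            return (T) elements[index];
        }

        public int size() {
            return elements.length;
        }

        public ImmutableList<T> append(T value) {
            Object[] copy = Arrays.copyOf(elements, elements.length + 1); // the only copying cost
            copy[elements.length] = value;
            return new ImmutableList<T>(copy);
        }
    }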

    I guess this wasn't all that interesting to read, but I really wanted to say it out loud: parallelism is the future of software, parallelism is hard, so any attempt to make it easier, for example by using immutable objects, is a step in the right direction.

    Tuesday, August 17, 2010

    To URI or not to URI

    I'm not a native English speaker, so don't blame me if this Shakespeare reference is hard to recognize. Maybe "URI or not URI" would fit the pronunciation better...

    Anyway, let me clear things up for anyone who doesn't know URIs: "URI" stands for Uniform Resource Identifier, and the "not to URI" part of the title refers to "URL", Uniform Resource Locator. Seriously, I don't know what the difference between them is supposed to be, just that they behave pretty differently in Java.
    (And for anyone who thinks, "Well, so you do know the difference?" - no, the Java classes are an implementation of a specification that I don't understand and don't even really know. I do know, however, how the implementation behaves. Whether this behavior follows the original specification, which was not made for the Java language but presumably for the internet, I can't say.)

    So, from my point of view, both URIs and URLs are meant to identify some sort of resource, like a network drive or some web site.
    • In Java, you can directly connect to a URL and read from and write to it. You can't do that for a URI, but you can convert between both forms (if the resource is legal in both representations).
    • The equals() and hashCode() methods of URL perform a host name lookup, making them very slow. URLs are therefore a poor choice for map keys, for example for caching downloaded files.
    So it seems like a tied match. The first difference is "obvious": you can see in the API that URI can't directly open connections. equals() and hashCode() being slow is so subtle that there is an entry for it in FindBugs, an Eclipse plugin that helps find common programming mistakes, which is how I learned about it.
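
    In practice, the FindBugs warning boils down to this: if you need a map key, convert the URL to a URI first, so that comparing keys stays a purely textual operation. A small sketch of a download cache (download() is just a placeholder):

    import java.io.File;
    import java.net.URI;
    import java.net.URISyntaxException;
    import java.net.URL;
    import java.util.HashMap;
    import java.util.Map;

    public class DownloadCache {
        // keyed by URI: equals()/hashCode() don't touch the network
        private final Map<URI, File> cache = new HashMap<URI, File>();

        public File get(URL url) throws URISyntaxException {
            URI key = url.toURI();
            File file = cache.get(key);
            if(file == null) {
                file = download(url);
                cache.put(key, file);
            }
            return file;
        }

        private File download(URL url) {
            // ...actually fetch the resource; omitted here
            return new File("downloaded.tmp");
        }
    }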

    The third difference I've experienced so far is very subtle and unexpected, and it is the real reason for this post. Wow, about time to get to the point...

    You can represent a file inside a Jar (or Zip) archive using a URL or a URI; both work. You can also "resolve" a relative URL/URI against another one to get an absolute one. But the combination of the two only works with URLs!

    jar:file:/path/to/archive.jar!/path/inside/archive/
    jar:file:/path/to/archive.jar!/path/inside/archive.txt

    are both legitimate URIs/URLs. Say you want to resolve "hello.txt" against them; you'd expect:

    jar:file:/path/to/archive.jar!/path/inside/archive/hello.txt
    jar:file:/path/to/archive.jar!/path/inside/hello.txt

    However, this only works with URLs. A URI won't resolve at all and will just return "hello.txt".
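
    A small demo of exactly that difference (the paths are made up, of course):

    import java.net.URI;
    import java.net.URL;

    public class JarResolveDemo {
        public static void main(String[] args) throws Exception {
            String base = "jar:file:/path/to/archive.jar!/path/inside/archive/";

            // URL resolution descends into the jar entry path, as expected
            URL viaUrl = new URL(new URL(base), "hello.txt");
            System.out.println(viaUrl); // jar:file:/path/to/archive.jar!/path/inside/archive/hello.txt

            // a jar: URI is "opaque", so URI.resolve() gives up and returns the relative URI
            URI viaUri = new URI(base).resolve("hello.txt");
            System.out.println(viaUri); // hello.txt
        }
    }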

    I said that I recently reimplemented TreeProperties, and I did it based on URIs. As you might guess, resolving paths relative to some context is pretty important for a tree structure. Finding the bug was very annoying, because reading from a Jar file, or even multiple Jar files, is not something I have years of experience with. Fixing the bug was way easier once I sorted out its origin.

    I'd like to end this post with another Shakespeare quote:

    "But none come to mind. I hope I have at least fooled you with the quotation marks. Just be aware or the differences between Uniform Resource Locators and Identifiers"

    Saturday, August 14, 2010

    Migrating to Maven

    This week, I've finally done what I wanted to do for a very long time: I migrated my project to Maven. Warning: Non-programmers will probably be hopelessly bored.

    Maven is a build system with many capabilities that probably go beyond what I can currently appreciate. For me, it essentially means these things:
    • Managing dependencies on external libraries, such as Dom4J, an XML API, and also on the different projects that belong to LaternaMagica, like TreeProperties.
      • For an open source project, this also means that I don't have to worry whether others have all the required libraries, because Maven can find and download them on demand. Having exactly the right configuration in the development environment becomes much less important this way. As a downside, it makes Maven pretty much a requirement for working with my code, but that's just like English being required to read my blog: it's good to know English, no matter what.
    • Automatic testing and generation of source, binary and documentation archives, or "snapshots"
    I said I wanted to do this for a long time. If you're asking why I didn't do it earlier... well, I was dumb. I couldn't get my Eclipse installation working with the Maven plugin. The problem was... that my Eclipse installation was owned by root (the administrator, for all the poor Windows users out there^^), and installing the plugin simply did nothing because I didn't have the rights.

    So finally, everything is alright. I have totally rewritten TreeProperties and Utils, and also greatly reworked LaternaDoc, which finally does what it should. More on those another time ;)

    Sunday, August 8, 2010

    Last Known Information with Undo

    Last Known Information is one of the hardest parts of the Magic rules to implement. It matters for effects that refer to a card that has changed zones, for example Persist ("When this permanent is put into a graveyard from the battlefield, if it had no -1/-1 counters on it, return it to the battlefield under its owner's control with a -1/-1 counter on it.") or Swords to Plowshares ("Exile target creature. Its controller gains life equal to its power."). These effects happen when the card is already in the graveyard or exile, but they need information about the card from when it was still on the battlefield. To us, this seems very natural, but from a programming point of view, we have to save the relevant values long enough for interested effects to look them up.
    The two main questions of this concept are already in the previous sentence:
    • What values are "relevant"?
    • What is "long enough"?
    And this is where Undo steps in! Why store the values separately when we can travel back in time and look at the original ones? With that approach, there is no chance that we need a value that wasn't saved - because we look at the original.
    Well, when we talk about time travel, of course there are problems: We have to get back to the future after looking up the needed information, but this is only possible if we didn't change history: The game will discard any actions that lie in the future if we change the past. Unfortunately, even simply looking up the power of a creature will change the past, because it results in evaluating continuous effects etc.
    Well, I have another science fiction reference up my sleeve to save the day: if we must not change history, we "simply" do it in a parallel universe: create an exact copy of the game up to the point in the past where we need to look up the information, then use that information in our regular space-time continuum.

    So much for the theory of time travel and parallel universes; reality is a bit more complex, though. I think I can eventually work it out so that this becomes possible. This is also the way I expect a potential AI to work: try out multiple variants in a parallel universe and do the best one in the real universe.
    But right now, only regular time travel is possible, with the limits imposed by changing history. The problem is that I can't transfer actions between games yet: if you tap a creature, I can't copy that action and execute it in another game, because the action stores the permanent that was tapped, which belongs exclusively to the original game. The solution probably needs hashing or another type of identifier associated with each part of the game.
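
    Just to sketch what I mean by that: an action could store a plain identifier instead of the permanent itself, and look it up in whichever game it is executed in. None of these classes exist under these names in Laterna Magica; it's only an illustration:

    import java.util.HashMap;
    import java.util.Map;

    // an action that refers to its target by id, so it can be replayed in a copy of the game
    public class TapAction {
        private final int permanentId;

        public TapAction(int permanentId) {
            this.permanentId = permanentId;
        }

        public void execute(Game game) {
            game.getPermanent(permanentId).tap();
        }
    }

    class Game {
        private final Map<Integer, Permanent> permanents = new HashMap<Integer, Permanent>();

        void register(int id, Permanent p) {
            permanents.put(id, p);
        }

        Permanent getPermanent(int id) {
            return permanents.get(id);
        }
    }

    class Permanent {
        private boolean tapped;

        void tap() {
            tapped = true;
        }
    }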

    I hope you enjoyed the short science fiction digression. I definitely did, and I would never have guessed that I'd use time travel and alternate universes as a serious metaphor for a programming problem.

    Tuesday, August 3, 2010

    Simultaneity: Warp World

    The first time I mentioned simultaneity was in connection with combat. That it came up there first was a coincidence, but a fitting one, because simultaneity happens all the time during combat: technically, creatures become attacking creatures simultaneously, then the same happens with blockers, damage, etc.

    Simultaneity means multiple things, and I'm not sure I fully understand all of them. I'll take Warp World, a classic example for discussing timing:

    Each player shuffles all permanents he or she owns into his or her library, then reveals that many cards from the top of his or her library. Each player puts all artifact, creature, and land cards revealed this way onto the battlefield, then does the same for enchantment cards, then puts all cards revealed this way that weren't put onto the battlefield on the bottom of his or her library.

    First, all permanents are shuffled into their owners' libraries, triggering "leaves the battlefield" abilities. All of these abilities see all the cards leaving play; there is no variant where such an ability leaves early and isn't active when the other permanents leave play.

    The "then"s in the card's text neatly separate the parts that happen simultaneously. All "enters the battlefield" abilities of cards trigger for cards that enter the battlefield at the same time as them; it seems like the abilities are active just a little earlier than they are in play... it's a little awkward in my opinion.

    Now hold on - Auras with "Enchant Enchantment" won't make it onto the battlefield, because Enchant is not a triggered ability. The Comprehensive Rules don't specify a wording for Enchant but rather give it extra rules. It feels, however, like a replacement effect, just like "As ... enters the battlefield" abilities (which don't use the stack at all - they are effects generated by static abilities).

    Lands with additional costs, like the painlands, are worded like this because, if it were a triggered ability, you could play the mana ability in response to it. So, just like "Enchant Enchantment" enchantments (^^) don't make it, lands like "As ... comes into play, return a land you control to your hand. If you don't, put ... into your graveyard instead." won't either. Champion, however, is a triggered ability, so your champions can survive. You can even enchant the creatures you wish to exile, because the triggered abilities won't resolve, or even be put onto the stack, until Warp World has finished resolving.


    One last thing: after the spell has finished resolving, all the triggered abilities are put onto the stack, in an order depending on player seating and player choice. You can put abilities triggered by entering the battlefield on the stack earlier than those triggered by leaving it, even though those events happened in the opposite order...