16 Jun 2010

Kinectifail

I feel embarrassed for my Project Natal hype last year in light of Microsoft’s terrible E3 2010 conference. One year ago, Project Natal showed promise and potential; with its incredibly cool technology it could have been the next big thing, and Microsoft had more than enough time to prepare the ultimate killer application for Natal. Turns out these “killer applications” are a bunch of mini-games we’ve already seen four years earlier on the Wii. If the dashboard looks more interesting than the games on your gaming platform, it’s really time to start worrying, methinks.

The new video chat, video player with voice control, dashboard with quick-links to the most useful areas: THAT’s really cool stuff. I wish everything on my TV could be controlled like that instead of through all those remote controls with dozens of buttons (although I’m a bit disappointed about that “mouse-pointer emulation”; I was hoping for some sort of Minority Report “swooosh interface” instead of a point’n’click interface). Probably still not enough to make me buy one, since the game pad is still “good enough” for menu navigation. A whole lot depends on the price. I feel somewhere around 79 Euros would be a good price point – yeah I know, just kidding haha.

But what happened to the cool games which really show the potential of full-body motion control? And what market is MS going after? And who green-lit this terrible name “Kinect”? I don’t know what native English speakers think about the name, but to a German “Kinect” is just like Knnnn….HÄH? And even after carefully decoding the sequence of characters, the spoken name sounds like some industrial “middleware” device Siemens or Bosch would produce for car manufacturers to build into their engines or gearboxes. Makes the name Wii sound ingenious in retrospect. Or Move: easy, simple, to the point, even for non-English speakers.

So who should buy Kinect? The general idea of motion control is old news which was cool around two years ago, and (unfortunately) Kinect does too little to set itself apart with the presented launch titles. Sure, the technology is incredibly cool – to a technology nerd. But who cares about little details like… that maybe you have a little bit more control over your virtual bowling ball with Kinect compared to a Wiimote? Maybe the hardcore guys who know about those technical details (but usually don’t care about virtual bowling). To “normal people” this is just as interesting as the difference between a turbo and a supercharger. People who like to play sports mini-games while getting drunk at parties already got their fix with the Wii, and I can’t imagine that they’re so hot for a marginally better bowling experience that they’ll defect to the 360 en masse.

And here’s another strange thing: if MS is chasing the casual crowd all of a sudden, why is the Xbox redesign so “hardcore eXtreme”? It’s dark and edgy and looks menacing and dangerous. It says “buy me – if you dare”. I don’t think that’s quite the right message to get mom and dad to buy an Xbox and play some bowling with the kids.

I honestly don’t understand why MS didn’t secure a real-time strategy title and create a breathtaking motion control interface to show off at E3. “Real-time strategy on the console done right”. Throw a couple o’ millions at Blizzard to get Starcraft II onto the 360 with really innovative motion controls. Now I’m just being silly, but I’m sure THAT would have turned a few heads, mayhaps even converted some hardcore PCers to the 360.

So yeah… I reeeeally can’t see myself jumping around a lot in front of my TV. The new Xbox looks cool though, but I’ll wait until it is available in white.

-Floh.

6 Jun 2010

Radon Labs R.I.P. 2000-2010

Just a quick update, because I don’t have a lot of time for blogging at the moment. As you may have noticed, Radon Labs is no more. The plug was pulled on the planned financing model for the next Drakensang, and in the end we had to pull the plug on the company. The good news in all of this is that there were a lot of companies interested in helping us keep the ball rolling, and since Tuesday last week we’re a part of Bigpoint, which opens up a lot of new, very interesting directions in which Drakensang and the technology behind it can move in the future (obvious hint: the net may play a very important role). Unfortunately it’s too early for me to answer questions regarding Nebula in any meaningful way. We’re currently busy getting to know our new comrades, plotting strategies and making plans.

24 Apr 2010

Splinter Cell Conviction

As an old-time Splinter Cell fan(atic) I’m happy to report that the new SC kicks ass big time! I was expecting the worst because of its troubled and lengthy production (“back to the drawing board” etc…). After the demo there was much lamenting among Splinter Cell veterans, and it left me a bit worried as well: there didn’t seem to be much left of the original Splinter Cell formula, with too little stealth and too much action. But after playing through the single player campaign twice now (first on Normal difficulty, then on Realistic) I think that the SC team went the right way with most changes.

At least in Realistic mode it is still very important to be stealthy, but (and that’s the important part) if stealth is broken, the game play doesn’t break down with it. In previous SCs (including the nearly perfect Chaos Theory – which by the way still looks phenomenal on the 360) I was often tempted to restart at the last checkpoint when open combat broke loose, because everything became just too messy.

In Conviction, the transition from stealth to combat and back actually works, and it’s really fun to play in this new way. That’s the one big - most important (and most risky) - change the SC team got exactly right.

What Conviction does *mostly* right is that it steers the franchise back onto a clear course, which seemed lost after Chaos Theory. Double Agent added more and more bells and whistles (like all those utterly useless mini games), and before the reboot Conviction looked like it didn’t know where to go either. The rebooted Conviction reduces this mess back to a nice, small set of game play features. Almost a little bit too streamlined for my liking (you can’t drag around bodies anymore, you can’t choose between fatal and non-fatal take-downs, and I actually liked that one lock-picking mini game), but the new agility of Sam Fisher and the Mark-&-Execute feature make up for the losses.

And sometimes there’s a workaround for missing features. For instance, instead of dragging a dead or unconscious body like in the old Splinter Cells, one can choke-hold a guard and, instead of using him as a meat-shield, drag him into a dark corner and take him out there so surveillance cameras and other guards won’t find the body. But finding those new twists is probably harder for gamers who played the old Splinter Cells than for new gamers.

But once the player has learned to use Sam’s new skills without having to think about them, the game play experience is phenomenal. There’s nothing more satisfying than cleaning up half of the guards in a room with the new Mark-&-Execute, vanishing again by dropping a flash-bang, flanking the confused remaining guards and taking them out one by one by sneaking up on them from behind.

I have to confess that in my first play-through I often had to shoot my way out because I didn’t pay enough attention to the environment. There’s almost always a way to solve a situation stealthily, like a water-pipe on the wall or hidden passages to get above or behind the attackers. In the second play-through I already knew the basic layout of the levels, took my time to look around and explore the environment, and was forced to plan my tactics more thoroughly because of the harder difficulty. The result was that I relied much more on stealth, and always had a fallback plan in mind for when the situation got out of control.

It’s also interesting to see how the big three Clancy games (Splinter Cell, Rainbow Six and Ghost Recon) are starting to share features that work well. Splinter Cell now uses the phenomenal cover system of the Rainbow Six Vegas games, and the Mark-&-Execute feature is similar to the Rainbow Six marking of priority targets before room-entry. I hope the next Ghost Recon will do similar things. The other two games could learn a bit from Sam Fisher’s agility, like jump-sliding over and under obstacles.

Story’s a bit… well, there is a story, and at least it doesn’t get in the way of the actual game ;)

So all in all, a really great game, and I haven’t even dived that much into the Co-op and Deniable Ops modes yet…

23 Apr 2010

Build Pipeline

I’m really happy with how far we’ve come with our build pipeline in the past months. We now have a truly multi-project, multi-platform, distributed build pipeline for our daily builds, along with a common programming framework for build tools and a few C# GUI tools which simplify its usage and are generally more pleasing to the eye than a raw DOS shell window.

Let’s start with the multi-project aspect. At Radon Labs there are usually several projects in flight at the same time. All projects are based on Nebula (but not all are running on top of Nebula3; we may decide to start a project on the older Nebula2/Mangalore if it makes sense). We always had a standardized project structure, daily builds, and rules for how a project’s build script looks, but we had to deal with a few detail problems which were often pushed into the future because there were more important problems to fix. One of the fairly critical ones was a proper toolkit version history and a more flexible toolkit update process.

In the past we only had one current toolkit version, which was updated through a patching process. Toolkit updates are very frequent, from about once a week to a few times per day. It may happen that a new toolkit version breaks file format compatibility with older versions. That happens less regularly, maybe once every few months, but it becomes a problem when a project decides to create an engine branch and is thus decoupled from engine development on the main branch. This makes sense if the project is going into beta and stability is more important than new engine features. The problem is that the project may come to a point where its toolkit can no longer be updated with the latest version from the main branch, because the main branch introduced some incompatibility.

What’s needed is a way for the lead programmer to “pin” a specific toolkit version to his project. We solved this problem with a new “Toolkit Manager” tool which tracks a history of previous versions and ensures that the latest, or the “right”, toolkit version is installed:

[Screenshot: toolkit_expanded]

When switching to a new project, the Toolkit Manager automatically installs the right toolkit version (only if necessary), but it’s also possible to manually select and install a specific toolkit version.
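The pin-or-latest resolution described above can be sketched roughly like this (a Python toy for illustration only; the class and method names are made up and are not the real Toolkit Manager's API):

```python
class ToolkitManager:
    """Hypothetical sketch: tracks a toolkit version history and per-project pins."""

    def __init__(self, version_history):
        self.versions = list(version_history)  # oldest to newest
        self.pins = {}                         # project name -> pinned version
        self.installed = None                  # currently installed version

    def pin(self, project, version):
        # the lead programmer pins a project to a known toolkit version
        assert version in self.versions, "can only pin a version from the history"
        self.pins[project] = version

    def resolve(self, project):
        # a pinned project gets its pinned version, everyone else the latest
        return self.pins.get(project, self.versions[-1])

    def switch_project(self, project):
        wanted = self.resolve(project)
        if wanted != self.installed:   # install only if necessary
            self.installed = wanted
        return self.installed


mgr = ToolkitManager(["1.0", "1.1", "2.0"])
mgr.pin("BetaProject", "1.1")           # stability matters: stay on the old toolkit
print(mgr.switch_project("BetaProject"))  # -> 1.1
print(mgr.switch_project("NewProject"))   # -> 2.0 (latest)
```

The point of the sketch is only the resolution rule: a pin overrides "latest", and switching projects re-installs only when the resolved version actually differs from the one on disk.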

The multi-platform aspect of our build pipeline lets us create optimized data builds for the various platforms (currently Win32/D3D9, Xbox360, PS3, Wii and the legacy Nebula2 “software platform”) from the same assets with a single “flip of a switch”. From the outside the build system on a workplace machine is represented by a very simple front-end tool, the new “N3 Batcher”:

[Screenshot: n3batcher]

The UI is hopefully self-explanatory, except maybe for the “Sync” build task. This performs a data-sync with the latest daily build from the project’s build server before exporting locally modified data, which saves quite a bit of time in large projects with many day-to-day changes.

Under the hood the build system looks a bit more complex, but follows a clean layer model:

[Diagram: N3BuildPipelineOverview]

At the top there’s the “N3 Batcher” front-end tool for workplaces, and the “makedistro” MSBuild script for the master build server which provides the daily build.

Those 2 front-ends don’t do much more than call a centralized “buildassets” MSBuild script which takes care of the build steps that are identical for all projects. If project-specific build steps are necessary, they are defined in a projectspecific.msbuild script located in the project directory.

The build scripts split the build process into several build tasks which form a dependency tree. Build tasks mainly call the Nebula3 command line tools, which in turn are often just wrappers for platform-specific build tools provided by the various platform SDKs. For instance, you can simply call N3’s texturebatcher3 tool with the “-platform xbox360” argument to convert textures for the Xbox360 platform, or with “-platform ps3” to convert textures into the PS3 format (provided the Xbox360 and PS3 SDKs are installed on the build machine, of course). Another important task of the N3 command line tools is to distribute the build jobs across multiple cores and multiple build machines (more on that below).
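The "-platform" dispatch can be pictured roughly like this (a Python sketch; the converter functions are stand-ins for the platform SDK tools that the real batchers wrap, and all names are illustrative, not the actual tool's internals):

```python
# Stand-ins for the per-platform SDK converters the real batcher shells out to.
def convert_texture_xbox360(src):
    return f"{src}.xbox360"   # placeholder: would invoke the Xbox360 SDK tool

def convert_texture_ps3(src):
    return f"{src}.ps3"       # placeholder: would invoke the PS3 SDK tool

# One wrapper, many backends: the caller only flips the platform switch.
CONVERTERS = {
    "xbox360": convert_texture_xbox360,
    "ps3": convert_texture_ps3,
}

def texturebatcher(platform, sources):
    """Convert all source textures for the given target platform."""
    convert = CONVERTERS[platform]
    return [convert(s) for s in sources]


print(texturebatcher("ps3", ["ui.tga", "wall.tga"]))
# -> ['ui.tga.ps3', 'wall.tga.ps3']
```

The real tools additionally fan the per-texture jobs out across CPU cores and build machines; the sketch only shows the platform-selection idea.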

The main results of the build process are platform-specific archive files which contain all the build data for one project (the actual daily build process also compiles the executable, creates an installer, and optionally uploads the finished build to the publisher’s FTP server).

All exported data is neatly separated by platform into separate directories to enable incremental builds for different platforms on the same build machine.

Distributed Builds: under the daily build dogma, a complete build must finish during a single night. In Drakensang we hit this 12-hour ceiling several times, until we reached a point where we couldn’t improve build time by throwing faster hardware at the problem. Thus we decided that we needed a distributed build system. Evaluating existing systems wasn’t very fruitful (mind you, this is not about distributing code compilation, but distributing the process of exporting graphics objects, textures and other data), so we considered building our own system. The most important task was to create a good tools framework which makes it easier to create distributed tools in the future. The result of this is the DistributedToolkitApp class, which does all the hard work (distributing build tasks across CPU cores and across several machines). Tools created with this class basically don’t need to care whether they run locally or distributed, where the input data comes from or where the output goes. They only need to worry about the actual conversion job. Of course there’s a lot of necessary standardization underneath, for instance how a “build job” is exactly defined, and some restrictions on input and output data, but defining these standards and rules wasn’t much of a problem.

What surprised me most was how many small problems showed up before the distributed build system was robust enough for a real-world project. I’d been under the impression that a TCP/IP connection inside a LAN is a relatively fool-proof way to communicate. Well, it worked “most of the time”, but we also had a lot of overnight builds break because of mysterious connection issues until we built more fault tolerance into the communication (like automatic re-connection, or putting “vanished” build slaves onto a black-list). Well, it works now, and it’s relatively simple to maintain such a build cluster.
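The retry-and-black-list behavior can be sketched like this (a Python toy, not the real C++ DistributedToolkitApp; the function names, the round-robin scheduling and the retry policy are all made up for illustration):

```python
def run_build(jobs, slaves, send_job, max_retries=1):
    """Distribute jobs over build slaves; re-queue jobs whose slave fails,
    and black-list slaves that keep failing ("vanished" machines)."""
    blacklist = set()
    failures = {}            # slave -> number of connection failures so far
    results = []
    pending = list(jobs)
    turn = 0                 # simple round-robin over the healthy slaves
    while pending:
        job = pending.pop(0)
        alive = [s for s in slaves if s not in blacklist]
        if not alive:
            raise RuntimeError("all build slaves black-listed")
        slave = alive[turn % len(alive)]
        turn += 1
        try:
            results.append(send_job(slave, job))
        except ConnectionError:
            failures[slave] = failures.get(slave, 0) + 1
            if failures[slave] > max_retries:
                blacklist.add(slave)   # slave has vanished: stop using it
            pending.append(job)        # re-queue the job for another slave
    return results
```

A usage example with one simulated dead slave: the build still completes because its jobs are re-queued and eventually land on a healthy machine.

```python
def flaky(slave, job):
    if slave == "bad":
        raise ConnectionError("lost connection to build slave")
    return f"{job}@{slave}"

print(run_build(["a", "b", "c"], ["good", "bad"], flaky))
```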

PS: we really need to update our tools icons though…

27 Mar 2010

It's Not So Grim Up North

I spent a wonderful day at the 2010 Grad Show of LTU Skellefteå in Northern Sweden last week. That's a branch of the LTU (Luleå University of Technology) where students learn the arts of film-making and game-development. The game guys have been using the Nebula "spelmotorn" (what a wonderful word for "game engine") for their projects for quite a while now, and invited me to talk about the engine and the challenges of making Drakensang. It was more of a project management talk and very light on the technical side, but since actual programmers were in the minority I think that was a better decision than going straight into hardcore tech stuff.

I wish I had a little more time to explore the town and its surroundings. Skellefteå (pronounced roughly like Shealleaf-tyo with emphasis on the 2nd syllable, as I learned) is a relatively small town about 25 Swedish miles (1 mile is 10 km in Sweden!) south of the Arctic Circle, and as you can guess, it's still deep winter there. I was lucky to catch a nice sunny day, but the temperature was still only around -10 degrees Celsius, with about 1 meter of snow and a thick ice layer on the river that flows through the town.

The town of Skellefteå is really beautiful, with an orderly rectangular layout (so it's basically impossible to get lost). A surprising number of hotels, stores and restaurants are gathering around the town center, and there are colorful wood-planked town houses in the surroundings.
A small airport about 20 km south of town seems to take the role of the railway station. I was surprised to find myself in a packed-full Boeing 737 from Stockholm to Skellefteå (for some reason I was expecting something like this ;), while the flight from Berlin to Copenhagen (and back) was handled by a mere Canadair CRJ900 (which - unfortunately - was packed with overweight German businessmen).

There's one thing I really like about Swedish people: they don't seem to like pointless small-talk. My seat neighbors on the plane didn't want to know where I come from and where I'm going, what I want there, whether I like cats more than dogs, or how German beer compares to their local beer. The atmosphere on the plane was quiet and relaxed, and I don't think I've ever encountered such a disciplined and friendly deplaning of a full aircraft as on arrival at Skellefteå. Yet if you ask for directions, they're friendly and helpful as if they'd known you for years.

Oh Glorious Sweden :)

30 Jan 2010

The Dark Side

Yeah, so I finally defected to the Dark Side and got myself a 13.3" MacBook Pro. Now that N3 runs on all 3 console platforms I'd like to explore the iPhone/iPod/iPad platform a bit. And iPhone development is only possible from the Mac, so the MacBook actually serves as some sort of hardware dongle.

Overall, the MacBook is a great device. It (still) looks slick, has a great display, and is relatively lightweight. I considered the MacBook Air, but finally went for the Pro because it had twice as much hard disc space and RAM, but was still cheaper than the Air. And since I also want to run Windows on the machine, hard disc space is precious.

The keyboard sucks ass though. I don't like those ultra-cheap ZX Spectrum keyboards which become more and more common on laptops. And the most important key, the Return key, is the smallest and easiest to miss. WTF?

Surprisingly, I'm not a big fan of the user interface. I've only used the machine for a day or so, and while the UI certainly looks slick, I had quite a few WTF moments. The most puzzling thing is that I have no idea where to find an application after it's installed. The XCode installer, for instance, simply finished and I was left wondering how to find and start the damned thing. It's not in the dock, and it's not under Applications. Turns out it's "somewhere" on the hard disc under the /Developer directory. I was looking for something like TortoiseSVN, found a Finder plugin, and pretty much failed to install it. The read-me file told me to find some files and manually drag them somewhere onto the Finder tool bar, and to drag something else somewhere else to start it automatically after login. However, those files were nowhere to be found after the installer was done. Why doesn't the installer take care of all that crap? I'm pretty sure I could have fixed it by investing some more time, but something like this shouldn't happen these days. I'll see how the Subversion integration in XCode works.

I haven't found something like Live Writer to create blog posts (only some commercial apps - come on, paying money for a blog editor? A good blog editor should come free with the OS, like a web browser or a mail program).

I tried Safari and was shocked to see the web pages I usually visit cluttered with blinking ad banners. Not Safari's fault of course, but the next thing I did was install Firefox with AdBlock Plus.

I dabbled around with MacOS from time to time during the 90s, and at that time, usability was so much better compared to Windows. Today, I'm not so sure which one is better. Of course I've learned to live with Windows' quirks over the years, but I was honestly expecting to say something like "yeah, that's how it's done" after playing around with OSX. But I didn't. Things are different, but not better. I'm still extremely excited to dive into Mac and iPhone development though :)