XNA 3.0 - Reading Text Files on the Xbox

We are making strong progress on an XNA Community Game title that we are working on, and I have just spent a good 30 minutes trying to figure this out. Hence, I'm writing this as a future reference for myself, and in the hope that it might help any coders out there trying to achieve the same thing.

The Problem

Our game is relatively simple and we would like to define the stages that the player progresses through in a plain text file. While it's a very simple task to read a text file in C#, I initially assumed that constructing a StreamReader on the Xbox like so - new StreamReader("LevelIndex.txt") - would load up a text file in the same directory as my executable. Not the case - it turns out that, as the Xbox doesn't really follow a directory structure like a PC, this doesn't work.

The Solution

The solution is also relatively simple, but not easy to find - it's one of those things that you either know how to do or you don't. There are 2 important things to note before jumping directly into a code listing.

1. StorageContainer.TitleLocation: In order to construct the file path that you will be feeding into a StreamReader, you must prefix your path with this property. To load up the file 'LevelIndex.txt', which resides in the root directory of your solution, you must construct the path as follows: String fullPath = StorageContainer.TitleLocation + "\\LevelIndex.txt";

2. Build Action (None), Copy to Output Directory (Copy if Newer): As your text file is not going to be processed by the Content Pipeline, you need to tell Visual Studio what should be done with it. For my project, I dragged and dropped the text file into my Solution (but NOT into the Content directory). The text file sits alongside your .cs files (although it could reside in a folder). Then, you must change the properties of the text file so that Build Action is set to 'None' and Copy to Output Directory is set to 'Copy if Newer'. This ensures that Visual Studio doesn't try to compile the file as code or include it in your Content project. It will simply copy the text file to the TitleLocation of your game - just what you want so that you can read it from within your title.

Code Listing

The following code listing shows how we have implemented a simple text file reader for an Xbox XNA project:

String[] readStageTitles( String filePath ) {
  ArrayList stageTitles = new ArrayList();

  // Both Windows and Xbox - Ensures we are looking in the game's folder.
  String stageIndexPath = StorageContainer.TitleLocation + "\\" + filePath;

  try {
    StreamReader streamReader = new StreamReader( stageIndexPath );
    String line;

    while ( (line = streamReader.ReadLine()) != null ) {
      String[] data = line.Split( ';' );

      // Each line should hold exactly two semi-colon separated fields
      // (here we assume the stage title comes first on each line).
      if ( data.Length == 2 ) {
        stageTitles.Add( data[0] );
      }
    }

    streamReader.Close();
  }
  catch ( Exception ex ) {
    // Do things here in case it can't read the file.
  }

  return (String[])stageTitles.ToArray( typeof(String) );
}

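For reference, the stage index file for this reader just needs two fields per line, separated by a semi-colon (to satisfy the data.Length == 2 check). A hypothetical LevelIndex.txt might look something like the following - the exact fields will, of course, depend on what your game stores for each stage:

  Stage01;The Tutorial
  Stage02;The Caves
  Stage03;The Final Boss

Any line that doesn't contain exactly two fields is simply skipped by the reader.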
Hope this helps someone out there.

Originally published at John Wordsworth's Blog. You can comment here or there.


Quick Fix for Really Slow Remote Desktop to Vista (X64)

I have spent a few hours today trying to use a Remote Desktop Connection from my MacBook to my Vista x64 machine on the same local network. After suffering exceptionally slow speeds, I realised that it wasn't network bandwidth or even heavy CPU usage that was slowing down my experience.

After a few searches on Google and many failed attempts, I found that the following command instantly fixed my problem:

netsh interface tcp set global autotuninglevel=disabled

I'm afraid I have no real idea what other impact this might have on your machine / server. I understand that this disables the built-in network 'tuning' that Vista uses to try to improve your bandwidth usage, so that all software on your machine is guaranteed a solid minimum Quality of Service (QoS). I suspect that it'll have no noticeable impact at all - your machine will just perform more like XP than Vista (which was fine).

If it does screw things up, you can reverse this action with:

netsh interface tcp set global autotuninglevel=normal

You'll have to run this from a cmd.exe shell, and will probably need to launch cmd.exe with the 'Run as Administrator…' command. I have UAC disabled, so it runs fine anyway. You should just get a simple 'Ok' response from the software if it has run correctly.

I hope that this helps some people. It’s been running perfectly fine ever since.



Print Screen on a Mac

As many of you are aware, I've recently converted to a Mac through the purchase of a MacBook laptop. I also tried to convert my desktop life, but after some horrible experiences with the screens on the new iMacs, I've decided that will have to come much later down the road in our business, when we can afford a Mac Pro for my desk (where I get personal control over my desktop).

This week, I've mostly been playing with the print screen / screen capture options. Previously, I've only used a couple of the plethora of screen capture options that OSX offers, but I thought it time to get a definitive list of the (hard to remember) key combinations available, mostly for my own reference.

So, putting Windows to shame, you can actually save a screen capture as a JPEG file with a single key combination. This would be a great addition to Windows, and would definitely put a stop to the 5MB emails with a couple of BMP screen captures in them that I've been receiving a lot of recently. An overview of the commands follows:

Saving a Screen Capture Direct to a JPEG File

⌘ ⇧ 3: Capture the entire screen to a JPEG file on the desktop.

⌘ ⇧ 4 (followed by drag): Capture the selected area of the screen to a JPEG file on the desktop.

⌘ ⇧ 4 (then Spacebar, then click on a window): Capture the selected window (and sometimes a border) to a JPEG file on the desktop.

Saving a Screen Capture to the Clipboard

⌘ ⇧ 3 + Ctrl: Capture the entire screen to the clipboard.

⌘ ⇧ 4 + Ctrl (followed by drag): Capture the selected area of the screen to the clipboard.

⌘ ⇧ 4 + Ctrl (then Spacebar, then click on a window): Capture the selected window (and sometimes a border) to the clipboard.

While it’s fantastic having this much freedom over the print screen process, the last one especially reminds me of trying to learn a 10-hit combo in Soul Calibur. It even feels that way when you’re trying to press all of the buttons on the keyboard/mouse during the process!



Mono - Running .NET Applications on OSX/Linux

This may be old news to some people, but I have recently discovered the power of a piece of software called Mono. Now, it actually takes quite a lot of research and a bit of experimentation to realise just how powerful Mono is. Nowhere on their landing page does it tell you that it lets you run .NET managed EXE files compiled for Windows on Linux and even OSX.

I'm not sure how many of you are aware, but Microsoft's .NET languages, such as Visual Basic and C#, don't actually compile into Windows-specific executable code; instead, they get compiled into an 'intermediate' bytecode, similar to Java. This is why you are required to download 80MB of .NET Framework files to run them on XP.

So, what the clever guys on the Mono Project have done is rewrite the .NET Framework so that it runs on other operating systems. That means that you are able to run software that was compiled for Windows as .NET managed code on other operating systems, as if it were a native application. Now, the system is far from perfect, so you're not able to just pick any application and guarantee it will work - but the project is improving all the time and making more and more programs compatible.

Again, something which is not stated is how easy this is. Download the Mono application for your machine, then type 'mono MyApplication.exe' and wait for the application to run. Hassle free (when it works).

While this may not seem significant to many people, I found it amazing that I could run a bunch of .exe files on my Mac directly from my Windows partition with no problems whatsoever. My only issue with this technology at the moment is the inability to have 'browser windows' in the application run on OSX. They run on Linux, apparently, but not on a Mac.

I really hope that .NET developers start making their applications Mono compatible, and I look forward to the browser window component being accessible on a Mac. I’ll definitely be trying my C# programs on Mono and I may even release some of them soon!



Hello Mac, Bye Bye Hash Key?

Just a couple of days ago saw the arrival of my new Intel MacBook. It wasn't many hours into using my new piece of hardware that I set myself back to work - some web design and CSS work was in dire need of completion.

Now, there are a few keys that have been moved around on the Mac keyboard compared to my old laptop, and it's still not unusual for me to have a hunt around the keyboard for a button that I need. The most interesting situation came, however, when it was time to define some CSS elements requiring the hash key.

My usual reflex didn't insert a #, so my eyes headed south. A minute passed and I scanned every key afore me. No sign of a hash - I must have missed it, so I looked again. Nothing! Luckily, a friend of mine has a Mac, so knows that the 'intuitive' shortcut is in fact Alt-3.

However, had I not had that support there, I would've been rather aggravated. I hope this saves someone some hair-pulling at a later date!

Otherwise, my experience with the MacBook has been sterling. It runs very smoothly and, most interestingly, everything integrates exceptionally well - even with windoze. If I click on Network Shares, the drives appear instantly, and it doesn't freeze my entire PC just to browse these shared folders. PDFs no longer freeze my browser, and I can use Firefox just to remind me of my old laptop.

More to come at a later date - but my current mission is to program a Dashboard Widget and some Cocoa demos to see what Mac Development is like.



HTC TyTn / Hermes on 3 / Three Mobile (UK)

I recently bought myself an HTC TyTn which, confusingly, is also known as the HTC Hermes, Orange M3100, O2 XDA Trion, Vodafone v1605, i-mate JasJam and a whole plethora of other names. The eBay auction specifically stated "unlocked, but doesn't work on Three mobile."

This was interesting, as the first thing I did when I got the item was stick my 3 Mobile SIM card in it, to find that it worked fine. I also did a bit of net searching, and found conflicting reports on some fairly serious questions: Will my phone get blocked by 3? Will I be able to access Planet 3? Will I ever be able to make video calls?

Well, to date, I've had no problems whatsoever. No angry text messages from 3 telling me this isn't my regular phone, no problems accessing Planet 3 to see how many minutes I've got left and, I must confess, I've not tried to make a video call yet. So, in order to aid any other people that are having these worries, this is what I've done: I flashed the ROM and installed the 'Black Satin' version of Windows Mobile 6.0. I then downloaded the from XDA Developers, put this on my device using ActiveSync and then ran the .cab file. Now, this certainly isn't necessary, and all of the settings worked without doing this, but I like upgrading things and having more buttons.

Now, some tips for the 3 Mobile UK owner.

3 Internet Connection Settings (Settings / Connections)

Modem: Cellular Line (GPRS, 3G)
Access Point Name (APN):
Username / Password / Domain: Blank

MMS Settings (Messaging / SMS/MMS / New Message / Message Options / Servers)

Port: 8799
Server Address:
Send Limit: 300k
WAP Version: 2.0

Last but not least, if you want to access Planet 3, you can simply fire up Internet Explorer and visit . One final warning - if you want to check whether you're on 3G or not, look at the icon at the top of the screen. I believe that if you try to use GPRS when you're not in a 3G area (your network will be displayed as "3" for GPRS, or "UMTS 3" for full 3G access), then you'll be considered to be roaming, which means you'll be charged money, whereas if you're in a UMTS 3 area, you'll not be charged. However, if your network just comes up as '3', then you can still make calls / send SMS messages on your free credit.



Logical Prediction of Web 3.0

A complete stab in the dark at what could well be some of the defining features of what will one day, undoubtedly, bear the mysterious name 'Web 3.0'. I thought I would extrapolate the lines that connect the 'read-only' Web 1.0 to the features we know relate to Web 2.0.

Constant Streaming Data: In the beginning, you would request a page once, and read it. With Web 2.0, you often request a page for a specific section or use of an application, and use it. Thus, I suggest that with Web 3.0, you will visit the homepage of a site that will provide the majority of the functionality of the site without a page refresh - the site will change to suit your needs, like a real-world application.

Syndication and Aggregation of Sites and Functionality: Where Web 2.0 has seen the syndication of data across multiple websites, I propose that Web 3.0 might present the possibility to syndicate web applications / entire pages or sites or even just specific functions of a website. As bandwidth becomes cheaper, it will be easier to syndicate applications from a single source than to have multiple copies dotted around the web.

Mobile Web to reach 'Web 2.5': At present, the majority of websites aimed at mobile phones aren't quite Web 2.0, due to the limited set of features and memory available to mobile devices. As they improve, however, they will take a very important role in the way the web moves forward. People will expect to access their favourite sites as easily from their mobile phone as they can from their PC. Non-geeks will blog from their phone and update data whilst walking to work.

Complete Microformat Integration with Real Devices: No longer will only some sites support specific microformats that a small selection of specially made tools can make use of; instead, microformatted data will be exchanged directly between websites, keeping all of your often-visited sites synchronised, as well as your mobile phone, and even your microwave and your fridge (with a shopping list and to-do list, etc.).

Further Abstracted Design: CSS will be cleaner and cross-browser incompatibilities will (hopefully) be a thing of the past. New XHTML objects and new layout languages may come into use to allow for advanced features to be built directly into the web (auto-complete combo boxes, hover-over boxes, trees, expanding and movable divs etc). Designers will have to worry even less about how things will work, and programmers will have to worry even more about the complex points of programming.

Programmable Open Source Platform: The web will become a place where you can not only add data to wikis, but also program how they function and add applications to other websites. Simple drag-and-drop programming models will even allow standard web users to add functionality to their own pages/blogs/other sites without danger of breaking the server or other disastrous consequences. People will even log in to their own customised (web-based, or web-attached) applications to access their files online, because they've added functionality to their applications themselves.

Just my logical prediction. Some of this will undoubtedly be wrong, but I look forward to matching it up in a few years' time.



Crazy Laws in Utah

As I'm sure most of you are aware - I'm currently at a Dynamical Systems conference at a resort near Salt Lake City, Utah. Now, just to set the scene for what's to follow, imagine the following:

“I’d like 3 bottles of beer please.”

“I’m afraid I can’t serve that to you sir.”


“It’s Utah State Law that you are only allowed to have 1 drink at any one time.”

So, after realising that Utah State Law prohibits me from having a single friend, I started asking around for more of the crazy laws that the Mormons have enforced upon the fun-hating state of Utah. Here are my top 5, in no particular order.

Birds have the right of way on all Highways: I kid you not.

No dancing in a public place after 2AM: They don't serve alcohol after 1am, but still - you have to go to a private residence if you want to dance after 2am.

It is against the law to fish from Horseback in Utah: Fair enough!

One Drink at a Time: Yup, if you order another drink and it arrives before you've finished your first, you either down it or send the other one back. On a similar note, it's illegal to have a pitcher of drink at a table with just one person sitting at it.

A husband is responsible for every criminal act committed by his wife while she is in his presence: Interesting.



Blu-Ray vs HD-DVD - My Predictions

It's time to evaluate the latest battle, one that can only be matched by the war between VHS and Betamax. I don't want to get into too many technical particulars; I'm going to look more at the social and economic side, as I'm sure this will have a greater influence over which format will dominate. That said, let's start with the obvious comparison:

Technical Comparison: HD-DVD offers 15GB per layer, per side - about 3 DVDs' worth of data. Interestingly, you can also purchase (although they are limited) combo discs that have a DVD film on one side and an HD-DVD film on the other. Blu-Ray, on the other hand, can store 25GB per layer, but lacks the combo discs (although it's theoretically possible in the same way). However, the Blu-Ray standard requires players to support Java, which can be used for the menus of Blu-Ray movies. This could lead to movies coming with basic games that you can play on your TV, alongside highly interactive menus or making-of features. While both discs support essentially the same video formats, Blu-Ray supports a higher audio bit-rate, allowing for better 5.1 Dolby surround audio. All in all, Blu-Ray wins this round, but history shows that technical details rarely decide the winning format.

Film Studio / Corporate Support: One of the two main issues that I would consider the main decider in which format will win is the economic issue of support. To date, Blu-Ray is exclusively supported by Columbia, MGM, Disney, Lionsgate and 20th Century Fox (5 studios). HD-DVD only has 2 exclusive supporters, Universal and Weinstein. Paramount, Dreamworks, Warner Brothers and New Line Cinema currently support both formats. A quick search of Amazon shows an almost equal number of movies available in each format: 420 Blu-Ray titles and 439 HD-DVD titles. HD-DVD got off the ground quicker, so I think the Blu-Ray titles may have started being released a little later, explaining the slight shortfall.

As for support for the actual media: Toshiba, Microsoft, NEC and Intel actively support HD-DVD - Microsoft and Intel probably being the major factors here, as they will be releasing hardware to the general public for their format, and probably much favouring HD-DVD in their updates for Windows (though they will not be able to deny Blu-Ray either). Blu-Ray has garnered support from Apple, Dell, HP, Panasonic and Sony - an equally impressive line-up in the industry.

In my eyes, the two formats are pretty equal on corporate technology support, but Blu-Ray wins outright on film studio support. Sony, Disney and Fox were the three biggest / most profitable film studios last year, and they all exclusively support Blu-Ray, so this round also goes to Blu-Ray.

Consumer Support / The Console War: I'm going to be fairly blunt on this one. I personally believe that the Xbox 360 and the Sony PS3 are going to be the biggest factor in deciding the winner of this battle. The Xbox 360 has an add-on that allows you to play HD-DVD discs, whereas the PS3 supports Blu-Ray out of the box. Currently, the stand-alone players for these 2 formats are very expensive, and purchasers will be a fairly limited group of people until their prices crash. However, the console market is much wider, and will really help to get these formats into society, and out to the people that really matter - the young adults that love their hordes of films and massive flat-panel TVs.

To a point, this is part of a bigger discussion - which console will dominate in a year's time. If the PS3 dominates, then the answer to this round is easy - Blu-Ray will almost definitely win outright. As PS3 owners get hold of high-definition TVs, they'll buy high-definition movies, which will be what their console supports - Blu-Ray. However, if the Xbox 360 wins the war, it won't be so clear cut, as you won't just be able to go out and spend £15 on a film to see what it looks like on your console - you'll have to purchase an add-on drive first. So, even if the consoles near-tie, then I suspect that Blu-Ray will still come out on top. However, if the Xbox 360 wins, it could go either way.

Current Conclusion: I'd always thought that Blu-Ray had a slightly cooler name, but never really had an opinion over which media would likely still be available in 2-3 years' time. I'm now convinced that the wire-wool-resistant Blu-Ray discs are going to be the dominant media. Either way, there's no way I would splash out on a £500 player for either at the current time, unless I was going to get a shiny new games console. And even in that case, my decision would be biased towards the PS3, but solely because of the nature of the console (you can install Linux on it, it supports SETI@Home out of the box and it's got some amazing titles lined up for it). All in all, I think that Blu-Ray deserves to win, and with the massive amount of support that it has from film studios and the PS3, I think it most likely will.



LJ Cross Poster & Site Updates

WordPress integration with my LiveJournal has been achieved. Using the incredibly useful Live Journal Cross Poster (LJXP), all entries posted to my WordPress blog are now automatically also posted to my LiveJournal. Apparently, the plugin will also make sure that they are kept up to date, so that if I make any changes, they will also be mirrored on my LiveJournal.

In other news, I have now also updated the front page of my website so that it includes the latest posts from my blog and my articles database in aggregated form. I guess the next quick job is to turn that into an RSS feed, so that people can keep up to date with my site if required.

My next task is to complete the articles section of my site - and to get some content on there too. I plan to write a few software reviews for less common software so that they might hit a target audience. Along the same lines, I will hopefully start work on some tutorials soon too - perhaps some Photoshop tutorials for comic-style colouring, or maybe some programming-oriented tutorials.
