
Duke Nukem Forever, Taking Forever

If you've been into computer games for any length of time, you've probably heard of the Duke Nukem franchise. I remember playing the 2D Duke Nukem side-scroller on my old 386 box and reading about the newer Duke Nukem FPS titles through popular PC magazines. So if it was a popular franchise, comparable to Mario, Zelda, Final Fantasy, and other games of that era, what the heck happened to it?

Image courtesy of Joystiq
Much like a lot of software development, it deteriorated because of ambition and the ever-changing technologies involved. Development suffered a fatal blow when the project shifted from one game engine to another. It's tempting to adopt new technology halfway through development with the mindset of "hey, the new engine would probably cut the development time to a quarter, so I don't mind sacrificing the time already spent on the old engine." But then again, the implication is that your developers now have to learn the new engine, probably leaving them flying blind. Not to mention that the workforce was slashed in 2009. Ouch.

Yes, a newer engine would probably yield better results, much like comparing the old Quake engine with the Counter-Strike: Source engine, or what have you. The difference in gaming experience can mean the difference between high sales and a flop. The real problem with Duke Nukem Forever was probably planning. If development had been planned meticulously enough that, despite improvements in game engines and despite delays in deliverables, work on the previous engine would have continued, then most probably they could have delivered, even if late.

It was announced at PAX that Duke Nukem Forever will be coming to the PC, PS3, and Xbox 360 in 2011, thanks to the efforts of Gearbox.

Well, let's just wait and see.

More reading about this topic:
Joystiq - Duke Nukem Forever coming '2011' on XBOX 360, PS3 & PC, courtesy of Gearbox
Wikipedia - Duke Nukem Forever
The Duke Nukem Forever List
