Gaming History 101

Know Your Roots

Archive for the ‘Lessons’ Category

Know This Developer: WayForward

leave a comment »


Yesterday I saw a tweet from WayForward, a games developer that specializes in a retro feel and hand-drawn animation, announcing that it was celebrating 25 years.  That’s impressive, especially when you consider that 1990 predates the Super NES, and that the studio’s signature 16-bit style has now been around for two and a half decades.  If you fancy yourself a fan of that era and long for the days of gorgeous hand-drawn animation, large sprite-based characters on screen, and a 2D plane, then WayForward is just the developer for you.  Oh yeah, and its strongest titles are typically tough as nails, so just like back in the 90s you’re going to have to die a lot and restart before you ever think about beating one.  It should also be noted that WayForward is one of the few studios that can really get a licensed game right; the amount of care and detail afforded to its many licensed outings is akin to the Capcom Disney games.  All of these reasons, plus the fantastic original series Shantae, make WayForward a developer that retro enthusiasts should definitely know.

Voldi Way, founder and current self-proclaimed Tyrannical Overlord, started the company in 1990 as an independent developer out of Valencia, California.  He had an interesting childhood that included acting – his most notable film being The Changeling in 1980 – and he founded a software company for sheet metal fabrication at the age of only 14.  At 20, he broke off from his partners to form WayForward for gaming software design and development; at that time his original company was netting more than $5 million annually.  Way named his company WayForward Technologies as a reference to the Douglas Adams book Dirk Gently’s Holistic Detective Agency, in which the character Gordon Way founds a company named WayForward Technologies.  Logically, the focus at the time was the current 8-bit and 16-bit consoles/handhelds: Sega Genesis/Mega Drive, Super NES, Master System, Gameboy, and Game Gear.

The company’s first release was Mickey’s Ultimate Challenge, a 1994 puzzle platformer published by Disney Interactive for the aforementioned systems (the Master System port didn’t release until 1998 for some reason and was the final released title for that console) that focused heavily on art and animation while featuring basic puzzles for younger players.  From the beginning WayForward had a tendency to make games look great with large sprites and fluid animation rather than focus heavily on gameplay mechanics, a tactic that lends itself quite well to games targeted at younger audiences.  Mickey’s Ultimate Challenge is exactly that: a mix of basic activities, like a memory matching game or jumping on books with letters in alphabetical order, all while enjoying Disney friends Mickey Mouse, Donald Duck, Goofy, and others in a medieval setting and costumes.  Games press at the time consisted mostly of magazines like Electronic Gaming Monthly (EGM) whose audience didn’t reflect younger gamers or their parents, so they provided lower-to-mid-level review scores (like 5s and 6s out of 10) with comments that it was a weak title for puzzle fans but that younger audiences should love it.  Nintendo Power was a little nicer in its review, coming right out and stating that the SNES game was for younger audiences and that they would find it fun.  Not the greatest start for an opening work, but the quality of art and animation cannot be denied, and it gave way to the next big project for WayForward: edutainment.

In 1994, the same year Mickey’s Ultimate Challenge released, WayForward entered into a partnership with American Education Publishing to generate a series of educational entertainment (edutainment) titles.  This deal was a success and allowed the company to get stronger with its animation and sprite-based work, as well as garner some funds and attention with awards for innovation at the Consumer Electronics Show (CES) in 1995.  In 1997 the company switched gears with the help of CEO John Beck and focused on small teams and licensed products in a work-for-hire model that builds the entire team into the budget for the game.  According to creative director Matt Bozon, it created a surprisingly stable structure compared to more traditional game-oriented development and allowed the team to stay together for so long.  Beck elaborated in an article with Gamasutra on how they handled a limited team: “We utilize external teams for specific modular content work. For example if we need character modeling done, it’s a very well-defined, modular task that can be easily shopped out to an external company, and we’ll take advantage of that. For the most part, we don’t. We prefer to use internal team members to do work. But we will staff up with freelance help as project needs dictate.”  In addition, this smaller team size and focused project scope meant that a majority of games released early on under WayForward’s new model were portable licensed titles, not unlike a portion of the structure today, but there was usually a decent twist to the actual gameplay that kept the games interesting.  The result is a set of games that most of us probably avoided unless we were fortunate enough to be of portable licensed game playing age in the early 2000s.

Wrestling Gameboy Color title WWF Betrayal is one of those titles that transcends whatever the media said about it – I didn’t even bother to look it up – because WayForward’s take on the WWF/WWE license created an addictive game that players’ anecdotal remarks are universally positive about.  The game only fetches about $10 or less on eBay and might be worth picking up if you have interest.  I’ve also heard that Godzilla: Domination for the Gameboy Advance, WayForward’s portable take on the very well received Pipeworks Software title Godzilla: Destroy All Monsters Melee, was worth a look, but on closer analysis of overall review scores, GameRankings, and personal experience, most of these positive reports must be remembering the PS2/Xbox/Gamecube original, because much was lost in translation.  One positive note is that all reviewers agreed the visuals were particularly compelling for the Gameboy Advance, but the gameplay itself couldn’t make the jump to handheld, which was common for the time.  For good or for bad, WayForward continued to create visually compelling works that garnered enough attention to keep work coming as it led up to the first original intellectual property (IP), one that still remains among the best games for the Gameboy Color: Shantae.


Of all the personalities at WayForward, the individual you are most likely familiar with is Matt Bozon, who is now creative director at WayForward and creator of the famed Gameboy Color platformer Shantae.  Fun fact: he’s also the brother of former IGN journalist Mark Bozon, which is why the last name may ring a bell if you followed IGN from 2005-2010.  Oddly enough, Mark worked on the Nintendo team and frequently heard his brother’s work come up on the Nintendo Voice Chat podcast or among the other IGN reviewers, but to my knowledge he never reviewed his brother’s games (and never should have).  Back to Shantae: it was the first original title from WayForward, a platformer featuring a young half-genie (genie mom and human dad) who is protecting a small fishing village when pirate Risky Boots and her band of thieves steal a steam engine.  The character Shantae is actually the brainchild of animator Erin Bell, who met Matt Bozon at the California Institute of the Arts (CalArts, as it’s often called) and worked as a freelance animator with WayForward from time to time.  Spoiler alert: Erin married Matt, is now known as Erin Bell Bozon, and the two came up with the defining characteristics of the character together.  Erin based Shantae on one of her campers from her camp counselor days and pictured her dancing and summoning animals as special powers, but it was Matt who came up with the signature hair whip move, based on Erin’s long hair constantly whipping into his face when she would quickly turn around near him.  Matt Bozon went to work on the project as an internal labor of love to flesh out the character’s origins and create a game in the team’s spare time.  Development began in 1996, and the game was originally intended for PC or Playstation/Saturn until owner Way scaled back the project and moved it internally to the Gameboy Color.
Working from the engine he had created earlier for Xtreme Sports, Jimmy Huey built an art capture tool that allowed for a quick and easy transition from canvas to game engine.  While not talked about much, Huey’s skill at bringing art into a game programmatically is quite impressive, and his work at WayForward over more than a decade reflected that.  After about four months and a few changes to the mechanics and aesthetics (Shantae was originally a brunette and had dancing as specific moves instead of animal transformations), the game was wrapped up and ready to start looking for a publisher in 2001.  It should be noted that the long development cycle is due to the fact that the game went through many iterations, changed platforms, and was clearly something done on the side when current projects weren’t in the way, so it is logical that this would stifle completion.  Finding a publisher also proved to be a challenge: aside from the relatively low publishing budgets for original IP on portables, the Gameboy Advance was releasing and overshadowing Shantae’s home platform, the Gameboy Color, and the game required a special cartridge to produce its impressive visuals, making it more expensive to manufacture and thus reducing profit margins.  Eventually Capcom did agree to publish the game, but held it into 2002 to allow the Gameboy Advance launch craze to taper off, and unfortunately the game did not perform as hoped.  As a silver lining, Shantae received high critical praise – many scores above 9/10 or 90/100 – and it is still considered one of the best Gameboy Color games ever released.  For a long time players wanted to get their hands on the title, but its low sales made it a rare and expensive find online until its re-release on the 3DS Virtual Console (worldwide) in 2013.  Now anyone who wants to check it out can do so on a 3DS for the worthwhile price of $6.

WayForward continued on despite the lack of commercial success with Shantae, continuing to garner work with licensed properties on portable consoles and eventually transitioning to the Nintendo Wii.  Among the more notable of these projects is the fantastic Contra 4 on the DS, which perfectly captures the feel of old school 2D Contra and picks up right after the events of Contra III: The Alien Wars on the SNES, abandoning all that had released since.  Despite critical praise it didn’t sell well, and lately there seems to be some backlash against the game for being “too hard”, a complaint that remains consistent across a majority of WayForward’s games.  Thankfully there are still enough copies around that it only fetches about $15-$20 online.  Another great revival from the past is 2009’s A Boy and His Blob for the Nintendo Wii, which features some of the best visuals that console has to offer, much improved mechanics and a better campaign than the original, and is just about the cutest title core gamers would be interested in playing.  I’ve never gotten around to playing the game, but it is on the shelf, and it will probably have to become a game club title sometime this year to finally push me into giving it a try.  What makes this title so intriguing to me is that I loved the concept originally introduced in A Boy and His Blob: Trouble on Blobolonia by David Crane (Pitfall) on the NES, but that game has so many fail states that even if you know exactly what to do there’s a high chance you will not complete it.  The Wii title was also praised critically, especially for WayForward’s incredible ability to update a game “the right way”, keeping it faithful to the original while tweaking what didn’t work.  At the same time, the old complaint that the game was too easy for adult core gamers who might remember the original, along with twitchy controls, prevented many gamers from picking it up, although this could also be due to the Wii just not being a popular platform for that crowd.
Another decent DS title that won’t cost you much is the licensed game Aliens Infestation, which might be one of the best, if not the best, Aliens licensed games to ever come out.  It’s a MetroidVania set on the USS Sulaco (the ship from Aliens) after it is intercepted in open space following the events at the beginning of Alien 3.  You play as one of four marines in a team sent to investigate the abandoned ship and series planet LV-426 to uncover the activities of the Weyland-Yutani Corporation and the bio-weapon project involving the aliens.  When you die the adventure continues with one of the other marines, and only when you lose all of them is it game over.  I’ve not gotten a chance to try this gem – like most of WayForward’s library, I picked it up with the intention of playing it and haven’t gotten to it yet – but everything I’ve heard from those who played it was unanimously positive.  Also, if you are a fan of the original, WayForward did manage to get a sequel to Shantae, Risky’s Revenge, out on the DSiWare store; it can now be downloaded on either a DSi or 3DS, with recent releases on PC and the iOS store as well (although I have no idea how a platformer takes to touch controls).


It was at this time, around 2011, that WayForward made the next jump into the HD consoles – Xbox 360, PS3, and PC – with several re-hashes of previous properties.  I was reviewing games at this time, so I had a chance to try most of these outings, such as Double Dragon Neon, Bloodrayne: Betrayal, and DuckTales: Remastered.  Most of these games are consistent with previous re-hashes in that the graphics are gorgeous (and unique for an era where the hand-drawn sprite was almost completely replaced by the 3D rendered model) and that they are too difficult.  It was at this point that I realized too many individuals had an idea in their head of what these retro games were, but few of the people playing them – including reviewers – had actually gone back recently to see these beloved franchises for their flaws.  Double Dragon is the easiest game to poke at, given that in hindsight both the arcade game and the longer, clunkier, more positively regarded NES version leave a lot to be desired.  I bring this up because Double Dragon Neon is a stronger and, oddly enough, fairer game than the originals it stems from, and the biggest gripes modern reviewers poked at were gameplay staples from the originals that WayForward transplanted in.  It just goes to show that most of the audience for remakes is living in nostalgia world and doesn’t really want games the way they used to be.  This goes double for Bloodrayne: Betrayal, which I found to be a fantastic new 2D platformer/brawler take on the 3D original, offering every bit as much care, content, and challenge as any old school 16-bit title, but which most reviewers completely dismissed for being too hard – some even bragged that they refused to complete it before writing a review.  Well, I played it, I did complete it, and I reviewed it positively.
While I will admit that no reviewer’s opinion is wrong, especially if properly backed up, I do take large issue with anyone who reviews a game and does not either complete it or see it completed by someone else in person.  Finally there is DuckTales: Remastered, WayForward’s tweaking of the original NES title DuckTales, which I just didn’t agree with.  Many liked it, and even more of the audience had a much easier time with it than I did, but it all boils down to the simple fact that I did not care for the inevitable tweaks of this remake.  Regardless of how you felt about these games, there’s no doubt that they can be polarizing, which isn’t good from a sales perspective.  All are available digitally on the 360/PS3/PC, and a couple have been given out through the Playstation Plus program, so check if you have added them in the past.


Recently WayForward has continued doing what it does best: licensed games on portables that look amazing and add something new to a genre that is almost universally made up of terrible games.  Perhaps there are more re-hashes in the future and possibly even some new properties, but regardless, WayForward should be commended for twenty-five years of fantastic titles, tech, and never forgetting its roots.  Someday I hope to try out that new TMNT title Danger of the Ooze, see what all the fuss is about regarding these Adventure Time games, and finally get to Shantae and the Pirate’s Curse, the recently released third Shantae installment, before the release of the successfully Kickstarted Shantae: Half-Genie Hero later this year.

Wayforward to the Handheld Future – Gamasutra
Godzilla: Domination Review – IGN
Shantae Wiki – Wikia
Wayforward Jumps on the Wii Bandwagon – IGN

Written by Fred Rojas

March 2, 2015 at 1:13 pm

Sound’s Good: Your Video Game Audio Buying Guide

leave a comment »


This week I decided to take on another technical escapade and look into the sound options for video games.  This requires you to know quite a bit about analog vs. digital sound, compressed vs. uncompressed audio, stereo vs. surround, and all the wonderful tidbits mixed in between.  Just to make things more complicated, Internet forums are chock full of people who have no idea what they are talking about and will pollute decent message boards with misinformation, only to be ignored by the knowledgeable elite on that board, so anyone who does a search ends up on a page where the misinformation is the only answer in town.  Additionally, companies like Dolby, DTS, and a whole group of fun little logos that appear as stickers on your receiver’s box, case, or display fill you with the joy and satisfaction that what you see is what you are hearing and that it’s better.  Well guess what: it’s not.  In fact, probably the best surround sound you can possibly get is LPCM (Linear PCM), uncompressed audio that has been around since before CDs and still stands as the best surround sound format – albeit at the cost of tons of storage space that most consumer products refuse to utilize (remember that TitanFall’s uncompressed audio weighed in at around 40 GB).  With all the mess and bull that exists, I figured why not enlighten my fine readers with a lesson and best practices so that you can easily determine the sound options for your consoles and get them up and running and sounding great.
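To see why uncompressed audio eats storage the way the paragraph above describes, the math is just channels × sample rate × bit depth × duration. Here’s a quick back-of-the-envelope sketch; the sample rate, bit depth, and hours of assets are illustrative assumptions, not figures from any specific game.

```python
# Rough size of uncompressed LPCM audio.
# All parameters below are illustrative assumptions.

def lpcm_size_bytes(channels, sample_rate_hz, bit_depth, seconds):
    """Raw PCM size: channels * rate * (bits / 8) * duration."""
    return channels * sample_rate_hz * (bit_depth // 8) * seconds

# e.g. 5.1 (6 channels), 48 kHz, 16-bit, ten hours of voice/music/effects
size = lpcm_size_bytes(6, 48_000, 16, 10 * 3600)
print(f"{size / 1e9:.1f} GB")  # roughly 20.7 GB
```

Double the bit depth or the hours of recorded dialogue and you land in the tens of gigabytes fast, which is why shipping uncompressed audio was such an unusual choice.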

Please Note: As previously mentioned, there’s tons of misinformation on the web about sound profiles.  For that reason I may be more restrictive about comments that I know are incorrect, and whether you choose to disregard this post for that reason is up to you.  Additionally, sound, like visuals, is a subjective medium, and therefore it won’t be the same for everyone.  Some swear 1080i looks better than 720p and vice versa; the same can be said for compressed DTS 5.1 and uncompressed DTS-HD Master Audio.  Despite the research and blatant facts suggesting otherwise, pick what helps you sleep at night – this is merely a guide of options.

The Set Up

The first thing you need to decide is how you want to set all your game systems up and what kind of sound setup you want.  If you are going to use TV speakers (which I don’t recommend on non-tube TVs), a soundbar, or any stereo (2.0 or 2.1) receiver, then most of the work is done for you, because almost every digital sound format supports uncompressed stereo.  In the rare event that it doesn’t, you’ll just have to roll with it.  Also keep in mind that many video game consoles output analog audio only, so you’re stuck with stereo at best.  If you have a 5.1/7.1 receiver then you can adjust surround options, but I’ll get to that in a sec.  Here’s a quick breakdown on sound setups and definitions.

2.0/2.1/5.1/7.1: These refer to the specific number of speakers you have.  Anytime you see these numbers in relation to stereo/surround, that’s what they mean.  A “.1” at the end means you have a subwoofer added for bass.  For example, a 2.0 setup is a two-speaker stereo setup, whereas a 2.1 is two speakers plus a subwoofer.  Subwoofers can be powered (i.e., a separate plug for power) or unpowered (the receiver sends the power), and I personally prefer powered so that I know it gets enough power for the sound output I want.
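The “speakers.subwoofers” notation above is easy to read mechanically, which can be a handy sanity check when comparing setups. A tiny sketch (the function name is mine, purely for illustration):

```python
# Interpret "mains.subwoofers" notation such as 2.1 or 7.1.
def parse_layout(layout: str):
    """Return (main_speakers, subwoofers) from notation like '5.1'."""
    mains, subs = layout.split(".")
    return int(mains), int(subs)

for layout in ("2.0", "2.1", "5.1", "7.1"):
    mains, subs = parse_layout(layout)
    print(f"{layout}: {mains} speakers + {subs} sub(s) = {mains + subs} boxes")
```

So a 7.1 setup means eight boxes total: seven full-range speakers plus one subwoofer.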

If you are going to use a receiver, always send the sound to the receiver separately or first, before going to the television.  Almost all televisions will downgrade or strip sound over modern connections, so it’s best to get used to hooking up your components to audio first.  Many retro consoles have cords that carry video and audio together, forcing you to hook up to those and then out to the receiver.  People grew up thinking the television was the sole and initial source for everything; it isn’t, it’s the destination of the finished product.

Retro Consoles

Every console from the Pong clones to the Sega Dreamcast (and including the Gamecube) uses analog audio for sound.  This most typically means RCA plug-type composite cables (the yellow/red/white connection), but some consoles also integrate S-Video alongside the yellow video connection, and of course most consoles before the NES – and some even today – have coaxial (screw connection) Radio Frequency (RF) adaptors that output sound and video as one.  This makes the audio part easy: it’s the white (left stereo) and red (right stereo) connections.  This is either mono or stereo analog audio that will automatically push through the connection whenever the console is powered on.  If you have a coaxial RF connection, I recommend upgrading the output cable or, in the rare event of the Atari VCS/2600 era and the Turbografx-16, finding a way to adapt to RCA (either by console modification or by simply hooking it up to a VCR with RCA composite a/v out).  Then all you do is hook the video up to your source, and often in the menu of the game you’re playing you can set either mono or stereo (some more modern games will also have “surround”, but more on that later).  That’s it, that’s all you need to do.  Most modern receivers will take in your composite, extract the sound, and then send the video out to the TV over the same plug, or on high-end models upscale/upgrade the signal and output at a better resolution (most recently over HDMI).  If you have a lot of consoles you will be swapping out and do not want to keep accessing the back of the receiver, simply get a composite A/B switch, hook the receiver into the “output” side, and leave that out so that you can plug any of your many a/v cables into it.  Like it or not, almost no consoles share the same a/v out plug (save for SNES/N64/Gamecube, and only for composite video).

Now we will move on to surround sound, the next step in audio and the first speed bump.

Early Surround Sound and the 5.1 Compressed Audio Format

Starting with the Playstation 2, and with almost every game on the original Xbox, there was support for 5.1 digital audio.  Earlier consoles like the N64 and especially the Gamecube liked to tout surround sound, especially with the “Dolby Pro Logic” logo, but that was merely a decoder for analog stereo sound that would emulate 5.1 on those setups.  Today every receiver will have similar modes, and there are even competitors: my receiver features Dolby Pro-Logic IIz, DTS Neo:6, and its own proprietary format.  These formats turn a stereo source into surround, and they are quite good at faking it.  For the most part, an analog stereo source will merely feed the stereo to the left and right speakers according to the setup and then use both channels in the center.  For example, Eternal Darkness, when sent to my receiver from the Gamecube analog a/v cable, will send all the left stereo to the two or three speakers on my left side, the same on the right, and the center channel that has two speakers will act like a typical stereo speaker.  That’s all.


Now if you have a digital fiber optic cable (also known as TOSlink, optical, or S/PDIF) or a digital coaxial cable, you can get 5.1/7.1 compressed audio or 2.0/2.1 uncompressed audio.  This was first used by high-end CD players, then laser disc, then DVD, and of course it was integrated into the PS2 and Xbox.  Digital coaxial was mostly used in CD players to send a digital uncompressed stereo signal (known as Pulse Code Modulation, or PCM) that was said to sound richer.  Laser disc and DVD players would often have digital coaxial and/or optical so that you could hook your component up to whatever port your receiver had and get compressed surround sound (known as bitstream).  The benefit was that you could get 5.1 (and later 7.1) surround sound that was dynamic and came out of all the speakers, but at a price: the bitstream format had to be compressed and lost quality.  However, it was the only way to fit these massive sound files onto a single laser disc or DVD.  Two major encoders emerged that were already doing similar formats in theaters to help get the best and most dynamic versions of compressed audio: Dolby (Dolby Digital) and Digital Theater Systems (DTS).
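The reason these cables had to compress surround comes down to bandwidth: an S/PDIF link was built around two-channel PCM frames, so a six-channel uncompressed stream simply doesn’t fit. A rough sketch of the math – the ~3 Mbit/s ceiling here is my approximation of the consumer S/PDIF payload for 48 kHz stereo frames, not a spec quote:

```python
# Why optical/coax (S/PDIF) carries stereo PCM but must compress 5.1.
SPDIF_LIMIT_BPS = 3_072_000  # assumed ceiling: 2 ch * 48 kHz * 32-bit frames

def pcm_bitrate(channels, sample_rate_hz, bit_depth):
    """Bitrate of an uncompressed PCM stream in bits per second."""
    return channels * sample_rate_hz * bit_depth

stereo = pcm_bitrate(2, 48_000, 16)    # 1,536,000 bps -- fits
surround = pcm_bitrate(6, 48_000, 16)  # 4,608,000 bps -- does not fit
print(stereo <= SPDIF_LIMIT_BPS, surround <= SPDIF_LIMIT_BPS)
```

Dolby Digital and DTS exist precisely to squeeze that ~4.6 Mbit/s of 5.1 PCM down under the link’s ceiling.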

Digital Coaxial Cable


Back then the only way to get access to these compressed sound files was to have a device that could extract and send out the audio signal and a receiver that could decode it.  Think of it as a conversation – you can only speak English if you know the language, and you can only use it with someone else who knows it too.  This created the big compatibility wars of the 90s and 2000s, which people were so affected by that they can’t seem to let go of them today (and frankly that’s why so few people even understand what’s going on).  You had all kinds of outlandish issues, and every failure resulted in the same effect: the digital cable would output an uncompressed stereo PCM signal and you would basically lose surround sound.  This would happen if all your devices and media didn’t match – so to watch Day of the Dead in DTS you would need a version of the movie with DTS on it (look for the logo), a player that could read and output DTS (look for the logo), and a receiver that could accept and decode DTS (again, look for the logo).  This became such a headache that logo hunting is probably the biggest marketing ploy in home audio today.

Dolby Digital (DD) was the first widespread home compressed 5.1 format, so most laser disc players, DVD players, and receivers could receive and decode it, dropping to uncompressed stereo (PCM) if for any reason the receiver couldn’t detect or read a DD 5.1 signal.  DTS came on the scene later and was basically the Pepsi to Dolby’s Coke, but many swore it was a richer, better-sounding format.  In truth I think it was just that DD was the go-to, so you had more people using it and thus a larger range of quality, whereas DTS was a very specific format often only utilized when extra cost and quality were at stake.  If you prefer one over the other, it probably has to do with the goals of each format, because they did have different ones.  DD was set on creating a discrete experience where the audio seemed all around you but not from the perspective of a single channel or direction – it wanted full immersion of the whole room, which is why hints of each sound were typically in all speakers.  DTS wanted direction-based sound output, so that when the Predator ran around the room the “whoosh!” of his run would jump from speaker to speaker in a circle, which any audiophile looking to show off his new equipment was probably in love with (myself included).  Back then, however, it was seen as a feature jump to include DTS because you typically had to buy more expensive equipment to support it and a more expensive version of the movie (DTS cuts were all the rage once DVD studios learned they could re-sell the same movie with a new feature and customers would buy it up).  In all the fun and trickery of creating a 5-7 speaker experience to replace 2-speaker stereo, something was lost: overall sound quality was poorer.  Think of compressed audio as an MP3 with a low bitrate – it just doesn’t sound as good as the original – whereas uncompressed is the full rich sound but only in two speakers.  
It was a trade-off you had to make, and everyone from Dolby, to DTS, to receiver manufacturers had an opinion.  Bose touted for years that its systems, taking uncompressed PCM audio and re-encoding it for a 5.1 or 7.1 speaker setup, were far superior to the compressed audio of DD and DTS, which given the current state of things may have been true.

Uh, Video Games?

Right, of course, so how does this relate to video games?  Well, the Playstation 2 featured a fiber optic connection on the back of each iteration of the console, and with the “HD AV Pack” the Xbox could also output fiber optic audio.  In many cases that only meant the game delivered uncompressed stereo or, even worse, compressed stereo.  On Playstation 2, provided you had a receiver that supported it, a limited set of games like The Bouncer and Metal Gear Solid 2 supported Dolby Digital 5.1 in the cutscenes (you could watch the logo appear and disappear on your receiver if you had it set up properly), and the infamous Grand Theft Auto: Vice City did have true 5.1 DTS (I remember hearing the world of random sounds all around me when I set this up).  There were other games, though, like SSX Tricky, that would output DTS but only in 2.1 – it was weird.  This was because the Playstation 2 didn’t do any encoding or decoding in the system; just like a DVD player, it would simply strip the audio and send it out to the receiver in whatever form it was on the disc.  This meant that if your receiver could read DTS and DTS was in the game, you would get DTS.

Xbox HD AV Pack


Xbox was different thanks to the specific Nvidia chipset that made up its hardware configuration.  The Xbox would take the audio and actually construct (or reconstruct) it into Dolby Digital format.  This meant that no matter how the audio came (stereo, Dolby Digital, DTS, uncompressed), the Xbox would re-create a Dolby Digital 5.1 signal from it and send that to the receiver.  Honestly, that’s how most consoles work from this point on, but I’m getting ahead of myself.  The benefit was that no matter what type of sound the developer provided, the Xbox would convert it to compressed 5.1 Dolby Digital that would work 100 percent of the time, provided your receiver could decode Dolby Digital.  It got rid of the guesswork and also assured that every game you put into the system would have sound coming out of every speaker every time.  While it may have been a great deal of smoke and mirrors, the Xbox did it very well, and those who had a 5.1 system and wanted to play a video game with surround sound usually opted for the always-5.1 Xbox version.
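The contrast between the two consoles boils down to passthrough versus re-encode. Here’s a sketch of that decision logic as I read it from the descriptions above – this is a toy model of the behavior, not actual console firmware:

```python
# Toy model: PS2 passes the disc's audio track through untouched,
# while the Xbox re-encodes everything to Dolby Digital 5.1.

def ps2_output(disc_track: str) -> str:
    """Passthrough: the receiver gets whatever format is on the disc."""
    return disc_track

def xbox_output(disc_track: str) -> str:
    """Realtime re-encode: every input becomes a DD 5.1 bitstream."""
    return "Dolby Digital 5.1"

for track in ("stereo PCM", "DTS 5.1", "Dolby Digital 5.1"):
    print(f"{track:>18} -> PS2: {ps2_output(track)} | Xbox: {xbox_output(track)}")
```

Hence the trade-off: the PS2 could deliver a richer format like DTS when the game and receiver both supported it, while the Xbox guaranteed the same DD 5.1 result every time.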

This all changed with the High Definition Multimedia Interface (HDMI), which eliminated the need for compressed audio, bitstream, and optical cables but didn’t really bother to tell the average consumer.  And thus began the war that wages on even today: trying to understand what the hell your device is doing.

Uncompressed Linear PCM

Remember way back at the beginning of this article when I mentioned that the earliest form of digital audio was uncompressed stereo (aka PCM)?  Well, the home audio world quietly moved away from the compressed, “faking it” 5.1/7.1 of Dolby Digital and DTS and migrated back to uncompressed now that Blu-ray discs, digital video formats, and the HDMI plug were around.  You see, fiber optic and digital coax cables only compressed the 5.1 because they had to; they couldn’t carry an uncompressed signal with more than two channels (stereo).  HDMI is different: it can carry fully uncompressed 5.1/7.1 sound.  Couple that with Blu-ray, with its 50 GB capacity, and you could finally fit the full high definition video and uncompressed audio experience.  Unfortunately, it basically meant that all of us “suckers” who had signed on for the old formats were left in the dust with nothing to show for it.  Instead of openly copping to it, hardware manufacturers of all kinds opted to sweep it under the rug rather than push it forward.  In truth we should be thankful; it’s quite a decent accommodation to make on the audio side, whereas no one takes pity on the video side.  Heck, the Playstation 4 forces you to use HDMI and almost expects you to have a 720p/1080p television, but it will still send compressed bitstream audio via optical cable out to your receiver without even warning you of the compromise.  Couple that with the stubbornness of technophiles – of which I will openly cop to being one – and you get a bunch of old guys looking for a DTS logo who, regardless of what’s actually coming out of the speakers, stare at that state-of-mind logo and say, “damn straight!”
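The payload gap between the two links makes the point concrete. The HDMI audio ceiling below is an assumed round figure (eight channels at 192 kHz/24-bit, roughly what early HDMI audio allowed), not an official spec quote, and the S/PDIF figure is the same stereo-frame approximation as before:

```python
# Rough payload comparison: HDMI vs optical for uncompressed surround.
HDMI_AUDIO_BPS = 8 * 192_000 * 24   # ~36.9 Mbit/s assumed ceiling
SPDIF_AUDIO_BPS = 2 * 48_000 * 32   # ~3.1 Mbit/s assumed ceiling

def fits(channels, rate_hz, bit_depth, link_bps):
    """Does an uncompressed PCM stream fit on the given link?"""
    return channels * rate_hz * bit_depth <= link_bps

# 7.1 LPCM at 48 kHz / 24-bit: fine over HDMI, impossible over optical
print(fits(8, 48_000, 24, HDMI_AUDIO_BPS))
print(fits(8, 48_000, 24, SPDIF_AUDIO_BPS))
```

That order-of-magnitude difference is the whole reason uncompressed 7.1 only became a consumer reality once HDMI arrived.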


The Xbox 360 arrived too early for the format – HDMI wasn't even widespread in 2005 – so it flat out didn't support the HDMI standard for video or audio (it was later added in hardware revisions).  In addition, the 360 used the DVD format for games, so naturally Microsoft re-enlisted the lovely Dolby Digital encoder it already had and boom, 360 games are all in Dolby Digital 5.1 (ever notice that you only get DTS when it's passed from a DVD?).  The unknowing consumer puts the game in, hears 5 speakers, sees the logo, damn straight.  On the other hand, the one year delay and hefty price tag of the PS3 justified it pushing the standards much higher.  The PlayStation 3 almost completely ditched the compressed audio format in terms of how it wanted to operate, future proofing the system with an HDMI port from the get-go that supported up to 7.1 channels of uncompressed PCM, while remaining humble enough to accommodate the many early adopters who would not have the tech yet.  Furthermore video games, even on the Blu-ray disc format, were mostly multi-platform, which usually resulted in a smaller 5-10 GB game with compressed 5.1 audio, regardless of Blu-ray's 50 GB capacity.  In addition the console would pass compressed audio formats out via the optical cable like the PS2 did, so games encoded in Dolby would send that signal, games encoded in DTS would send that signal, and the audiophile saw the logo on the receiver and thought, "damn straight."  Much of the confusion came from Microsoft hiding the truth by upscaling all 360 games to 1080p (there were maybe 5 total games in native 1080p) and encoding all games in compressed 5.1 Dolby Digital.
On the PS3, the truth was forced on all games out of the box: whatever surround format was in the game (mostly compressed DD, thanks to the 360) would output via optical, and the game's native resolution, rounded to the closest standard resolution (usually 720p), would output on the screen.  Gamers and tech guys didn't like that, which fed the perception that the PS3 wasn't as good as the 360 from a video standpoint because all games were 1080p/5.1 on 360 and dropped to 720p/5.1 on PS3.  The truth was that the games looked the same, but techno people want more Ps, damn straight.


There was also the little case of logo fever that erupted far beyond DD and DTS.  New compression formats for 7.1 like Dolby Digital Plus and DTS-HD High Resolution Audio hit the market, just an upgrade from 5.1 to 7.1 and still compressed, but they gave you that breathe-easy logo on your receiver.  Not only that, but if the lossless Dolby TrueHD or DTS-HD Master Audio (new codecs that compress audio without quality loss, matching uncompressed) was playing over a bitstream connection, the receiver would still display the logo for you, damn straight.  Therefore when these logo junkies (again, myself included) and PCM haters from the old guard started getting new HDMI receivers and saw no logo, just "PCM", they lost it.  They didn't want it.  Even though it was better and it was in 5.1/7.1, the brain could not accept that PCM > bitstream, and since all this hardware still supported optical bitstream with its comforting logo, that's the option they went with.  I myself did this for the last five years, until tons of research and a little patience finally saw me upgrading earlier this year.

Let's go back to the PS3 for a second and discuss the biggest reason gamers thought they weren't getting true 5.1 surround.  The PS3 is kind enough to decode all major forms of compressed audio and export them via HDMI as linear PCM, the truest of formats.  Whether it's old school DD/DTS 5.1, Dolby TrueHD, DTS-HD Master Audio, or anything around them, the PS3 just decodes it to LPCM 5.1/7.1 and sends it to your receiver in perfect harmony.  This means you don't need any special type of HDMI cable and it's supported by basically every receiver that takes HDMI, no codecs on the receiver side required.  It's very similar to the way the original Xbox made everything DD 5.1 in 2001.  Unfortunately, no logo leaves many people discouraged.
If you want proof, just hook your PS3 up to a receiver with HDMI, have it auto-detect the sound via HDMI in your PS3 settings, and start any movie with a compressed format.  Your receiver will simply claim PCM, but you will notice that the surround is dynamic, and if you press the "display" button (either on controller or remote) you'll see the audio format, like DTS-HD MA or Dolby TrueHD, in the upper left.  See, you're actually getting the big lossless 5.1 wonderful sound you wanted; the logo has just moved to a new place.

Fiber Optic (optical) cable


This is a good thing, but many cannot let go of the old logo wars and the dreaded "PCM means you're getting non-surround stereo" assumption from the earlier days, so even though they have newer receivers they actually FORCE them back into bitstream via optical cable.  Ugh.

Moment of Clarity

For me the moment I began to notice the difference was when I built a gaming PC.  My new Nvidia GTX 760 had an HDMI out and the setup software even said it supported 5.1/7.1 via linear PCM or Dolby Digital Live (I think it's called), and yet I was getting no 5.1.  A year earlier I had purchased a device that would take the HDMI A/V source and strip the audio out of it so that I could send the audio via optical to my receiver – this worked fine with my 360, PS3, and HDMI DVD player – so what gave?  Well, it was because the GTX series can only output PCM audio, and as we've already discussed, optical can only handle 2-channel PCM.  I did a little research and all signs pointed to me needing an HDMI receiver, which I shrugged off, accepting that my PC would always be in stereo; heck, most PC games aren't in 5.1 anyway because they don't have the built-in encoders of the 360/PS3.

The next hint was the Wii U.  The Wii just had stereo analog sound, so I figured if the Wii U had any form of surround it would simply be DD or DTS 5.1.  Wrong: the Wii U only outputs surround as 5.1 LPCM; otherwise you are getting stereo PCM.  Back to the audio stripper that hadn't worked with my GTX for 5.1, and again I didn't get 5.1 out of the Wii U once the audio was stripped to the optical cable.  Even worse, the Wii U is so user-unfriendly that I stupidly selected "Surround" in the settings, so only channels 1 and 2 of the 5.1 were outputting to my speakers, which drops the center channel that typically carries voice in most 5.1 setups.  That basically meant I heard little or no dialogue, because all my speakers picked up were the subtle left and right front traces of the voice, which are much quieter than the center channel.  I quickly went back to "Stereo" and again blamed my Wii U for not being forward thinking.  Then I turned on my PS3 and watched a Netflix movie in Dolby Digital…with a logo…damn straight.

It all came to a head with the PS4 and Xbox One.  Both consoles allowed you to export the sound via optical, choose "bitstream" in your audio settings, and you even got to pick Dolby Digital or DTS!  Wow!  How did they do that?  You mean I get to pick which logo I see?  This was amazing for the first two weeks, until I noticed it didn't make sense that no matter what was on screen I got the same logo, regardless of the logos on the media.  Then I started looking into this linear PCM thing and realized that both consoles have encoders/decoders that can take any audio and export it as either compressed DD/DTS or uncompressed linear PCM 5.1/7.1 via HDMI (you know, the truest form of audio).  That did it: this logo hunting was B.S., and I needed to move on.


After the setup, I was hesitant at first because, again, no logos.  No matter what Blu-ray movie or game I put in, logoless.  However, I did get 5.1, it sounded awesome, and things like my PC and Wii U worked (even with "Surround" on!).  I had finally gotten over it, and now I am able to enjoy linear PCM 5.1 dynamic audio, without compression, despite having no logos to show for it.  But after this long, convoluted 4,300-word history, you can see why I was so reluctant to ditch the logo.  If you are looking to upgrade your audio system and are tired of jacking around with optical cables, heed this notice and upgrade, because you're going to have to soon anyway.

As always, if you have questions or discussion, please post in the comments below.  As I warned earlier, comments that spread misinformation may not be posted, but that is limited to this post only.  Hopefully this will be nothing more than helpful.


Written by Fred Rojas

December 2, 2014 at 4:13 pm

Know This Developer: Ubisoft Montreal

leave a comment »


As I was looking into doing a history on this fantastic studio I came upon an excellent reference that was so good there's no point in me writing one.  While it's easy to rag on big media conglomerates, IGN's Mitch Dyer wrote a fantastic story on the origins of Ubisoft Montreal that includes stories of Splinter Cell's origin, the reinvention of Prince of Persia, and the visual treat that is Far Cry.  It's a fascinating story that documents the major franchises you can thank that studio for and a must-read for gaming history buffs like ourselves.  Head on over and check out House of Dreams: The Ubisoft Montreal Story when you can.


Written by Fred Rojas

February 27, 2014 at 8:44 am

Hardware Profile: Commodore 64

leave a comment »

Release Date: August 1982
Manufacturer: Commodore
Retail Price: $595.00 (approx $1400 with inflation rates today)
Units Sold: Over 12 million (conflicting reports of 12-17 million)

Not So Humble Beginnings

Before personal computers were called "PCs" (or Macs for all you Apple people), they were better known as "microcomputers".  The name derives from the relatively small size and price of a computer with a microprocessor as the CPU and the same basic input/output structure for data.  Much like PCs of today, this allowed software and game programmers to design a title around one basic data flow and configuration and then optimize each specific microcomputer release for that computer's specifications.  American consumers, even today used to much lower prices than other countries, were slow to embrace the cost and concept of a microcomputer.  That is, until the Commodore 64.  At the time of its release the only major competitors in the US were the Apple II and Atari 800, boasting hefty price tags of $1,200 and $900 respectively.  With most game consoles priced around $200 at the time, and some, like the ColecoVision, having computer add-ons for $400, the cost of a microcomputer restricted it to households of higher income (and that doesn't even include the cost of a monitor and a desk to put it all on).  Commodore had a different plan: thanks to vertical manufacturing and two strong chips to handle graphics and audio, the company set about making a microcomputer that could compete with the Apple II at less than half the price.

In 1981 Commodore was celebrating the wide success of its business computer, the VIC-20, a Personal Electronic Transactor (PET) descendant integrated into a keyboard, whereas previous models had been an integrated monitor/computer/keyboard that was much larger and more expensive.  While the VIC-20 had worked great for at-home accountants and small businesses because it could hook up to a television, Commodore had shown more interest in the gaming software that had proven quite popular on it.  Unfortunately the VIC-20 only had 5 KB of RAM, small even by BASIC (the programming language) standards, though it allowed for many educational programs and text adventures.  Instead of bridging the gap between the two worlds, a new graphics processor overseen by Robert Russell and the SID sound chip by Robert Yannes (both men preferred "Bob", so I like to refer to them as "the Bobs") were to be combined in a gaming console codenamed the Commodore MAX, or UltiMax, which was eventually scrapped.  Instead the Bobs went to CEO Jack Tramiel and proposed making an all-in-one successor to the VIC-20, codenamed the VIC-40, for the consumer market at an inexpensive price.  Tramiel answered with a tall order: 64 KB of RAM (the VIC-20's RAM expansion carts were its most popular and requested accessory), low cost (exact targets unknown; rumors put it at a $600-$700 price tag), and a prototype ready in roughly three months (it was proposed in November 1981) for the Winter Consumer Electronics Show (CES).  At the show even Atari employees were blown away that Commodore could produce a machine that beat the specs of the Atari 800 and approached those of the Apple II for under $600.  As complicated as it sounded, the answer was simple vertical integration: Commodore owned the semiconductor plant (MOS Technology) that supplied its parts, making the cost of building a C64 a mere $135.
After an arduous holiday season with no days off to make CES, a rename to the C64 to match the naming conventions of Commodore's other business computers, and certifications for mass production, the computer proudly entered the market in late summer 1982.

Taking Over

Like so many other success stories in history, timing played the most critical role in securing the C64's success.  Not only was it on par with the highest-end competing computers of the time (the updated Apple IIe only barely beat its specs, a margin that seemed almost intentional, and remained twice the price), it also had the versatility of being compatible with televisions and any S-Video-capable monitor.  Thanks to that output decision the C64 was priced and positioned to take on the game consoles of the time, which were drastically gaining ground, and it appeared to be the all-in-one solution for anyone seeking both a computer and a console.  Commodore even had the foresight to add an expansion slot, allowing it to support not only C64 carts but also tape and later disk drives.  You could even use Atari VCS/2600 controllers thanks to the port being identical (the 2600's controller port was built solely from off-the-shelf parts to lower cost, a choice that also meant Atari had no exclusive rights to the hardware).  The result in America was huge, moving millions of units; at its peak, manufacturing exceeded 400,000 units a month.  In Europe it was a bit tougher, with the ZX Spectrum boasting a price tag less than half that of the C64, but import costs made the two machines much more comparable outside the Spectrum's European home base.  Even then, the worldwide success of the C64 and its software ended with the Commodore 64 leading the Spectrum in sales later in the 80s.

Aside from price and hardware, the programming language of BASIC and the simple requirement that all games be 64 KB or less (the size of the RAM they would be loaded into, which lost its contents when power was cut) made software extremely popular both to produce and to purchase.  Gaming magazines would include instructions for programming games into your C64 one line at a time – anyone who has done this knows it's two hours of typing for ten minutes of demo – but it stands as one of the earliest demo/shareware distribution models.  Later in its life the C64 even had alternative operating system (OS) options, a mouse for a graphical user interface (GUI), and even a modem for limited online use.

For most of us who grew up quite young in those days, myself definitely included, the C64 was first and foremost a gaming platform.  It offered experiences that seemed more in-depth than the average Atari title, and with a functioning keyboard it could also be more story-driven than an NES title.  It really was the convergence of two worlds, and not until the late 90s, and again in contemporary times, would we see this kind of hybrid carrying the full package of personal, business, educational, and gaming software.  That's why, although it is hardly the most impressive microcomputer in history, it definitely stands as one of the most popular.

Written by Fred Rojas

October 1, 2013 at 2:16 pm

Posted in Hardware Profile, Lessons


Rise of the Triad Historical Context

leave a comment »

Rise of the Triad is more significant than it initially seems in the annals of first-person shooter (or "Doom clone") history.  In fact, had it remained under its original title, Rise of the Triad: Wolfenstein 3D Part II, it would probably enjoy more awareness and fall under the pantheon of id titles still garnering praise on Steam and Good Old Games.  Due to several disputes – arguably the direct result of John Carmack, co-founder of developer id Software and technical lead on milestone shooters Wolfenstein 3D, Doom, and Quake – the project was terminated in 1993 to avoid clashing with the upcoming Doom.  The fallout strained the relationship between id Software, the developer of Doom, and Apogee Software, the planned publisher of Doom and publisher of several of id's earlier titles.

In the beginning there were two companies: developer id Software and publisher Apogee Software.  Apogee is better known today by its later developer name, 3D Realms, the team responsible for Duke Nukem 3D and originally Prey.  Before all that happened, Apogee made its money publishing id Software's earliest successes, including Commander Keen and Wolfenstein 3D.  Apogee marketed games as "shareware", a method of giving people approximately 25-33 percent of a game to try out with the option to purchase the full game if interested.  John Romero, then lead designer on Doom at id Software, canceled Rise of the Triad, and John Carmack decided to have id self-publish, so Apogee ended up not publishing Doom.  id Software co-founder Tom Hall (Carmack and Romero were the other founders) left id to join Apogee.  Apparently Hall had concerns over the amount of violence and gore in Doom, a project he assisted greatly in creating.  Ironically, a year later when he completed work as lead designer on Rise of the Triad for Apogee, it would have even more blood and gore than Doom, including a random occurrence where an enemy would explode into gory giblets and "Ludicrous Gibs!" would appear on the screen.


After the split, id Software would celebrate success with Doom and its next franchise, Quake, as a combined developer and publisher.  id would continue to utilize the shareware marketing strategy begun by Apogee and even coin the term "gibs" in Quake, meaning literally giblets of human gore and flesh.  While the concept of gibs in games started in either Wolfenstein 3D or Doom, both created by id, I don't recall seeing the word "gibs" until Rise of the Triad, and it was definitely popularized by Quake.  Apogee would release Rise of the Triad on its own as both publisher and developer, the project led by Tom Hall and a team he dubbed the "Developers of Incredible Power".  Aside from some preliminary work in the early-to-mid 90s on Prey, which would eventually be re-developed by Human Head Studios and published by 3D Realms (Apogee) 12 years later, the team's only title was Rise of the Triad.  When Apogee renamed itself 3D Realms in 1994 (although it kept Apogee as its traded company name), Hall would assist on the Duke Nukem series, including the very popular Duke Nukem 3D, before leaving to work with John Romero at Ion Storm and produce Anachronox.

Rise of the Triad is not only significant for being a game where the point is to navigate a predominantly linear level killing everything in your path (Call of Duty says "hi"), but also as one of the first total conversions.  It started as an expansion pack and became a heavily modified version of the Wolfenstein 3D engine, but little hints like the Nazi-esque uniforms of enemies give away what it originally started life as.  Furthermore, the engine was a technical marvel containing features like panoramic skies, simulated dynamic lighting, fog, bullet holes, breakable glass walls, and even multi-level environments – although Doom faked it well, its world was a flat plane in the eyes of its engine.  Rise of the Triad was going to be even more dynamic, with pre-loaded enemy packs that would randomly generate plus extra levels and challenge runs, but these were scrapped due to time constraints and technical limitations.  It is also one of the few games of the time with environmental hazards so drastic they were usually one-hit kills.


Although mostly forgotten in time, Rise of the Triad is significant for helping move the first-person shooter genre toward the complex world it is today.  As a transitional title, it has a hard time holding up against the more beloved and popular shooters of its era (as our review clearly demonstrates).  Still, it was in the nucleus of shooter innovation, and many of the craziest and best features of contemporary FPS games started almost 20 years ago with the only shooter ever to come out of Apogee Software and the Developers of Incredible Power.

Written by Fred Rojas

August 3, 2013 at 11:00 am

Hardware Profile: Game Cartridges

leave a comment »


It’s hard to believe, but the typical cartridge game began to phase out of gaming in 1995, when the new wave of consoles and the subsequent movement to disc-based media began.  I’m sure plenty will be quick to point out that the N64 was a cartridge-based console, but I truly believe that decision was the result of Nintendo not wanting to give up control over manufacturing, along with its sordid history trying to make a machine that read discs.  This change happened 18 years ago, which means there is a significant number of gamers now in their early-to-mid 20s who have never played games on a cart.  This is truly a shame, because cartridges are far more versatile than most people realize; their crutch will always be that they offer little storage for massive prices.  In today’s lesson we will discuss what makes up a cartridge, its benefits and setbacks, and how the cartridge was used to literally upgrade consoles for more than two decades.

Anatomy of a Cartridge

For the most part a cartridge is a plastic-encased board containing "read-only memory", or ROM, chips with exposed pin connectors (male) that fit into a slot on the reading device (female).  Since it's a complete circuit board, multiple ROMs can be used to store data, which we will get to in a second, and the amount of memory can expand based purely on the printed circuit board (PCB) and the device reading it.  One of the most popular uses for ROM carts was software for computers, narrowing almost solely to gaming once game consoles began to release.  It was the dominant format for many microcomputers, mostly used outside the United States, and for almost all game consoles across two decades (mid 1970s to mid 1990s).  Many believe that cartridges were used because no other decent format was available, but this is simply untrue.  By the time carts were in use, programs and games could be loaded via floppy disk (5.25"), compact cassette tape (not unlike audio cassettes), and removable circuit boards (think arcade machines).  The decision to use cartridges came down to the fact that the memory was permanently stored in the chip and memory-mapped into the system's address space (i.e. it was basically integrated into the machine).  As a result data access was almost instant (no load times), the cart could be used for hardware expansion in addition to loading code, and thanks to the streamlined process it was very difficult to extract the code, making it the safest option for mass distribution of software.  Unfortunately it was also quite expensive, so storage space was expensive, not to mention that the code could never be altered, which made saving and loading difficult.
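To picture what "memory mapped into the system's address space" means, here's a toy sketch (the addresses are illustrative, loosely modeled on the NES, which exposes cartridge program ROM in the upper half of the CPU's address space; the `Console` class and its methods are hypothetical, not real hardware behavior):

```python
# Toy model of memory-mapped cartridge ROM.  Addresses are illustrative,
# loosely NES-like: cart ROM occupies $8000-$FFFF of the CPU address space.
ROM_BASE = 0x8000

class Console:
    def __init__(self, internal_ram_size=0x0800):
        self.ram = bytearray(internal_ram_size)
        self.cart = None  # no cartridge inserted yet

    def insert_cart(self, rom_bytes):
        # Inserting the cart is the whole "load": the ROM simply appears
        # in the address space.  Nothing is copied anywhere.
        self.cart = rom_bytes

    def read(self, addr):
        # Reads below ROM_BASE hit internal RAM; at or above it, the cart's
        # ROM chip answers directly on the bus -- hence no load times.
        if addr >= ROM_BASE:
            return self.cart[addr - ROM_BASE]
        return self.ram[addr % len(self.ram)]

console = Console()
console.insert_cart(bytes([0xA9, 0x01, 0x8D]))  # first bytes of a program
print(hex(console.read(0x8000)))  # -> 0xa9, read straight off the "cart"
```

Contrast this with tape or floppy, where the whole program has to be copied into RAM before the CPU can touch a single byte of it.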

The first console to use cartridges was the Fairchild Channel F, a game console that predates the Atari VCS/2600 by a year and featured many of the same aspects of that pop culture sensation.  It is not as widely known because its library consisted mostly of archaic sports titles and educational material.  Naturally, once Atari introduced the VCS in 1977 with arcade ports and diverse, addictive third-party titles like Pitfall, the format became the standard.  Because the cartridge is an integrated part of the machine, Nintendo made heavy use of it to keep both the Famicom and the NES up with changing times for quite a while.  Not only that, but some carts, especially in Japan where custom chips were allowed, were capable of acting as a game, a hardware upgrade, a save state, and an anti-piracy device all at once.  This practice was pretty standard for most cart-based consoles until the aforementioned 32-bit era, when expansions moved to actual hardware expansion ports; even the N64, which could have leaned on carts, used a dedicated port instead to expand the onboard RAM.

ROM types and Special Chips



The oldest ROM type is mask ROM, which refers to information stored permanently in an integrated circuit when the chip is manufactured.  The term "mask" refers to the masking pattern applied to the integrated circuit when it is created.  This is the oldest form of ROM and definitely what went into Nintendo carts, which were manufactured by Nintendo and worked as a supplemental business on top of the license fees and the cut of each unit sold on the NES.  This is the cheapest way to get the most memory; however, unless you have a mass production line the masking of the integrated circuits can be a costly endeavor, and without vast quality controls like Nintendo's, one poorly made program or coding error can ruin an entire production run.  You can understand why Nintendo was so strict back in those days, especially because masked integrated circuits cannot, by their very nature, be re-written or reused.  The upside is that once successfully produced there is little chance the chip will suffer decay, failure, bit rot, or the various other issues that can plague other ROM types, which is why most classic carts last nearly forever (note that battery-backed save memory is a different story).  I know this was the most common type in all Atari consoles, the NES, the TurboGrafx-16, and the Sega Master System.  Beyond that, it is entirely possible that the SNES, Genesis, 32X, and portable consoles used other formats like erasable programmable ROMs (EPROM), which can be reprogrammed after exposure to ultraviolet light, or electrically erasable PROMs (EEPROM), which can be erased and re-written electronically.  There are also generic PROMs that can be written once with a ROM burner, removing the need for a mask, but they are still one-time use and were more common in arcade and pinball repair, which may mean they can be found in Neo Geo carts.
As for the Jaguar and N64, I'm guessing EEPROMs, but there's still a striking possibility that these companies, known for mass production of carts since the 80s, still made traditional mask ROM carts, especially given the lowering price of PROM and the relatively high emulation/piracy of the late 90s.  It has been said that PROM, EPROM, and EEPROM may have a higher chance of failure, but most carts don't seem to have a problem no matter what is used, and plenty of repaired arcade boards have had no trouble whatsoever (especially because they can be wiped and reprogrammed).  ROM chips typically varied in size from 2 kilobits (not to be mistaken for the larger 2 kilobytes), which held roughly 256 bytes, all the way up to the expensive 32-megabit chip that held 4 megabytes.  This is why you saw statements on Street Fighter II carts like "32 MBit Chip!": it was touting massive storage space.  Some games are rumored to have had even larger ROM chips that compressed data and justified hefty price tags, like Final Fantasy III launching at $80 in the US, or Street Fighter Alpha 2 having load times on the SNES while data decompressed.  It was all par for the course when trying to get as much data on a cart as possible.  I do believe that as more RAM was integrated into consoles, as we saw on the N64, compression and temporary storage allowed more data to be stored for that console's larger 3D games.
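Since the bit/byte mixup causes most of the confusion here, the arithmetic is worth spelling out (a trivial sketch; chip makers counted capacity in bits, and marketing quoted whichever number sounded bigger):

```python
# Cartridge chip sizes were advertised in BITS; divide by 8 for bytes.
def bits_to_bytes(bits):
    return bits // 8

kbit = 1024          # one kilobit
mbit = 1024 * 1024   # one megabit

# The small end: a 2 kbit chip really does hold only 256 bytes.
print(bits_to_bytes(2 * kbit))                    # -> 256

# The big end: the "32 MBit Chip!" boast on Street Fighter II works out
# to 4 megabytes of actual data.
print(bits_to_bytes(32 * mbit) // (1024 * 1024))  # -> 4
```

So a 90s box quoting "32 MEG" was really talking about 4 MB, smaller than a single photo off a modern phone.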

In addition, extra chips could be placed in carts to add all kinds of functionality, which basically means the carts acted as hardware upgrades.  This makes sense when you think about the massive leaps between launch games and later releases.  2600 carts were pretty straightforward, but NES carts had a few extras, like the anti-piracy 10NES lockout chip that was required on the PCB for the NES to play; if the console doesn't detect it, due to a loose connection for example, you get the popular blinking console effect, which is the lockout chip in the console rejecting the cart.  Saving became a large feature as well, and almost all cart-based saves were stored on static random access memory (SRAM), which keeps save data after power is cut (uncommon for RAM) provided a small current still passes through it.  This is why a lithium button cell CR2032 battery is used in most cases; once that battery dies (typically around 20 years, though it can go much longer) the SRAM no longer holds data.  To fix this, simply de-solder the dead battery from the leads and solder in a fresh one.  In Sonic 3, as well as a few others, Sega decided to use much more expensive non-volatile RAM (NVRAM), an early relative of the flash memory we have today that retains information after power is cut, which is why Sonic 3 carts should retain save data indefinitely.

As for expanding the functionality of a console, special chips could literally upgrade it to do things it was never intended to do.  In Japan the Famicom was allowed to have special chips in its carts, so companies could go crazy on internal development; with no video game crash there, Nintendo did not force Japanese development studios to manufacture carts through it like in the US.  This explains the VRC6 chip in Akumajou Densetsu (Castlevania III), which added extra sound channels through the Famicom cartridge's expansion audio pins.  In America Nintendo began releasing special Memory Management Controller (MMC) chips that allowed some of the Japanese innovation to happen on the NES, albeit in stripped form due to that console's different hardware profile.  Here are some of the popular chips:

  • UNROM: Split the program data from a 32 kbit chip into two 16 kbit chips, one that stored the data and one that transferred data to RAM chip for faster loading and effects.  This was seen early with impressive titles like Ikari Warriors and Mega Man and assisted in side scrolling of dynamic characters and certain effects.
  • MMC1: Allowed for save games.  In addition to having 16 or 32 kbit ROM programs, 4 and 8 kbit SRAM was integrated and powered with a button cell lithium battery.  This was essential to getting Famicom Disk System titles that had save data to run on NES carts such as Legend of Zelda, Metroid, and Dragon Warrior.  Although Metroid didn’t support saved checkpoints like the FDS version did, massive passwords allowed pre-stored save data.
  • MMC2: Basically split a 32 kbit chip into a 24 kbit chip with two sets of 4kb banks for pre-loaded graphical data.  It allowed more graphics to display on screen at once due to the additional banks being only referenced without assets in the main code.  Only used for one game, Mike Tyson’s Punch-Out!! to load the massive sprite opponents.
  • MMC3: This massively split the memory allocation and integrated a scanline IRQ counter to allow for certain assets to scroll while others remained stagnant.  Necessary when keeping dynamic items like consistent scores, life bars, and more fixed on the screen and moving while a second “window” was performing more dynamic gameplay.  Nintendo’s most popular chip and key in many large titles such as Mega Man 3 and Super Mario Bros. 3.
  • MMC4: Utilized only in Japan for the Fire Emblem titles to create effects similar to the MMC2.
  • MMC5: The biggest, most versatile, and most expensive chip Nintendo offered.  Developers avoided it like the plague due to the high cost and reduced profit margin.  It offered several different memory allocations, a scanline IRQ counter, very dynamic split-screen effects, extra sound channels, and 1 KB of additional on-chip RAM.  Since Koei was almost the sole user and none of its MMC5 titles came out in America, the only title here to really use it was Castlevania III, which created a similar, but still inferior, version of the Famicom original.  The MMC5 was so complex that almost all clone consoles do not support it, and emulators took a long time to decipher it.  For this reason alone Castlevania III was long one of the few games you had to have original hardware to run.  Emulation is no problem today, but clone systems that run actual carts still do not support the title.
  • MMC6: This final mapper chip extended the MMC3 design with 1 KB of battery-backed RAM, which allowed Startropics and its sequel to save games.
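The common trick underneath all of these mappers is bank switching: the CPU can only see a small window of cartridge memory at once, so the mapper chip swaps which slice of a much larger ROM is visible through that window.  As a rough sketch of the idea – modeled on UNROM’s layout of one switchable 16 KB bank and one fixed bank, with bus-conflict details and exact register behavior simplified away – an emulator-style read/write handler might look like this in C:

```c
#include <stdint.h>
#include <stddef.h>

#define BANK_SIZE 0x4000          /* 16 KB PRG banks */

typedef struct {
    const uint8_t *prg_rom;       /* contiguous PRG ROM image */
    size_t num_banks;             /* total 16 KB banks in the image */
    uint8_t bank_select;          /* latch written by the game */
} Unrom;

/* CPU reads in $8000-$FFFF: $8000-$BFFF is the switchable bank,
   $C000-$FFFF is permanently fixed to the last bank. */
uint8_t unrom_cpu_read(const Unrom *m, uint16_t addr)
{
    size_t bank = (addr < 0xC000)
        ? (m->bank_select % m->num_banks)  /* switchable window */
        : (m->num_banks - 1);              /* fixed window */
    return m->prg_rom[bank * BANK_SIZE + (addr & 0x3FFF)];
}

/* A CPU write to $8000-$FFFF lands on the mapper latch, not the
   ROM, and selects which bank appears in the $8000 window. */
void unrom_cpu_write(Unrom *m, uint16_t addr, uint8_t value)
{
    (void)addr;
    m->bank_select = value;
}
```

Because the fixed bank always holds the CPU’s reset and interrupt vectors, the game can swap the other window freely without ever pulling the rug out from under itself – which is exactly why this scheme scaled so gracefully from UNROM up through the MMC series.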

There were more custom chips that eventually surfaced in America, but these were the most common.  Nintendo would loosen its policy and produce several custom chips for the SNES as well, allowing for all kinds of impressive hardware tricks.  Some of those are as follows:

  • DSP: The digital signal processor chip, in several iterations, handled 2D and 3D calculations – coordinate conversion for older graphics techniques, rotation effects, raster effects, and general fast math – independently on the cart instead of taxing the SNES.  Popular games that used it for rotation include Super Mario Kart and Pilotwings.
  • Super FX: This was a supplemental CPU for performing graphics calculations that the SNES simply could not do.  Think of it as an external graphics card for the 16-bit console, as it was a separate processor integrated into the cart.  With simpler duties than the SNES, the Super FX chip’s iterations ran at 10.5 MHz and eventually 21 MHz, blowing past the roughly 3.58 MHz processor of the SNES and allowing for the 3D rendering of titles like Star Fox.  Later revisions added support for larger ROM sizes (for a long time the game program had to be less than 8 Mbit, or 1 MB, of data).
  • Cx4: This chip was Capcom’s way of showing off rotational, polygonal, and wire-frame effects in the Mega Man X series.  While the first title used a traditional cart, Mega Man X2 and X3 had special wire-frame effects and elaborate title screens that to this day cannot work on cloned consoles, flash carts, or even later hardware re-releases (like the Mega Man X Collection on PS2/GameCube).  The software emulation community, of course, has successfully gotten it working.
  • SA1: The Super Accelerator chip worked with the SNES as a co-processor, delivering a faster 10 MHz clock over the roughly 3.58 MHz of the original hardware, faster RAM functionality, memory mapping capabilities (see the NES section above), data storage and compression options, and new region and piracy lockout protection.  This chip was essential to impressive titles like Super Mario RPG and Kirby’s Dream Land 3, which cannot currently be replicated on flash carts.  Software emulation, on both the open-source PC scene and official Nintendo Virtual Console releases, does support the chip.
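The rotation and scaling work the DSP offloaded boils down to fixed-point affine math: for each screen pixel, multiply its offset from a pivot point by a 2×2 matrix to find which source pixel to sample.  Here is a hedged sketch of that core calculation in C – the 8.8 fixed-point format, struct layout, and names are illustrative choices for this example, not the chip’s actual interface:

```c
#include <stdint.h>

/* 2x2 rotation/scale matrix in 8.8 fixed point (256 == 1.0),
   the kind of precision-constrained math offloaded to the cart. */
typedef struct { int32_t a, b, c, d; } AffineMatrix;

/* Map a screen pixel to a source-texture pixel around a pivot:
   the heart of a Mode 7 style rotation/scaling effect. */
void affine_transform(const AffineMatrix *m,
                      int32_t sx, int32_t sy,   /* screen coords */
                      int32_t cx, int32_t cy,   /* pivot point */
                      int32_t *tx, int32_t *ty) /* texture coords out */
{
    int32_t dx = sx - cx, dy = sy - cy;
    *tx = cx + ((m->a * dx + m->b * dy) >> 8);  /* drop the 8.8 fraction */
    *ty = cy + ((m->c * dx + m->d * dy) >> 8);
}
```

Doing four multiplies per pixel per frame is exactly the workload a 3.58 MHz 65816 struggles with, which is why pushing it onto a dedicated cart chip made effects like Super Mario Kart’s rotating track practical.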

There were several other chips built for specific functions, and the Genesis had one too: the Sega Virtua Processor (SVP) chip in Virtua Racing, arguably the most technically impressive 16-bit game ever created.  Unfortunately it also cost $100 at launch, wasn’t impressive from a gameplay standpoint, and doesn’t work with any clone or flash cart out there.  Emulation is possible, but with varied results.

Well there you have it.  A brief breakdown of the technical marvel that was the cartridge and the hardware benefits it provided.  It’s almost difficult to imagine a world without load times, where data access is instantaneous (something even flash carts can’t currently match).  While it wouldn’t be possible with the massive memory required today and the equally massive cost of manufacturing, there was a time when a few bits in a plastic case meant the world to each and every gamer.

Written by Fred Rojas

July 30, 2013 at 8:35 pm

Interview: Super Icon

leave a comment »

Claire Hill-Whittall

Richard Hill-Whittall


Super Icon is the independent developer responsible for the impressive retro flashback title Life of Pixel.  Aside from developing this and a few other titles on the PlayStation Mobile platform, this husband and wife duo has big plans afoot, not least a new Kickstarter project to bring Life of Pixel to a wider audience on PC, Mac, iOS, and Android (with additional content).  Recently Creative Director Richard Hill-Whittall and Claire Hill-Whittall in Business Development were kind enough to answer questions from Gaming History 101 and you, our readers, about their clear appreciation for early consoles and microcomputers.

GH101: Where did the idea to make Life of Pixel come from?

Richard: Life of Pixel is essentially my pet project; an exploration of
8-bit computers and consoles, from the graphical and audio limitations
through to references to many of the classic games I grew up with.  I always
wanted to do 8-bit graphics, but missed it the first time round as I was too young.

GH101: Did the game always start off as an homage to 1980s gaming or did you
originally focus on one console?

Claire: It started off as a mixture of 8 and 16 bit machines, but we decided to
focus on 8-bit as there were so many machines to explore with such a wide
range of different graphical limitations and styles.

GH101: Why has your company chosen the PlayStation Mobile platform?  Any plans to port this or other titles to additional platforms such as PSN mini,  Steam, or XNA? [This question predates the recent Kickstarter reveal – ed]

Claire: We had a good relationship with Sony, and were really keen to work on PSM –
the experience was good, and Sony were very supportive.  The only
disappointment is the limited sales on the format.

We have just launched a Kickstarter for Pixel, to take it over onto PC, Mac,
iOS and Android.  We hope to get the game out to as wide an audience as
possible and to keep evolving and expanding the game.

GH101: We noticed you had specific artists for each console’s music and that you
gave them proper credit during the game instead of in the final credits.  How did you go about finding these artists?  Was it a conscious decision to put their names in before the final credits?

Claire: We posted on a couple of forums looking for chip musicians where we explained
about the game and the response was incredible. So many great chip musicians
came forward.

We definitely wanted to display their names in lights as it were, as the
audio is such a major aspect of the game, and these guys did some stellar
tunes for a very limited budget. We are very grateful to them.

GH101: Your site has released level maps for those complaining about the difficulty.
As someone who beat the game 100 percent without the maps – and I noticed you
make no apologies for the difficulty on your site – did you ever make tweaks
to the difficulty?  I felt the levels were tough but fair, and if you explored
enough there were always safe paths through a level.  Did you make specific
level design changes based on pre-release feedback?

Richard: Well we could complete all the levels easily enough – but you get so close
to a project, you know every part of each level. Some users however were put
off by the difficulty, and in fairness there were some “cheap” deaths.

I went through every level and tweaked them to remove unfair “luck” based
potential death (i.e. when you do a blind drop and land on a monster!).  So
while I didn’t try to make the levels particularly easier, they are now a
fair test of skill, not chance. We are a lot happier with all of the levels
now too.


GH101: I loved the bonus levels that unlock at the end and how you chose to handle them.  Was it always planned to be in the game or added later?

Richard: Nope [it wasn’t].  I was working on the update, loving revisiting the game, and
thought I’d like to add a new machine, especially for those players who
played through the entire game.  As a thank you, really.

***WARNING: Mild spoilers in the following statement, skip to the next question if you don’t want to be spoiled.***
I tweeted for suggestions on the machine and the Master System got the most
votes.  It was cool too as it had extra colors to play with over NES, which
was nice.

GH101: Do you have any plans on a sequel?  Perhaps a 1990s/16-bit era title?

Richard: We do plan on a sequel – the 16-bit systems. We’re keen to explore more
8-bits first though as there are some good ones we missed.

GH101: What is your favourite gaming console or microcomputer and why?

Richard: I think the C64 (Commodore 64).  The music swings it; SID music (and sound fx actually) is
just the best! (smiles)

GH101: What project(s) are you currently working on?  When and where can we expect
Mega Blast?

Claire: We have a second PSM title in the works, called MegaBlast, which is an
intense old school score-attack arcade shooter influenced by classic space
shooters such as Galaga, Galaxian, and Space Invaders.  [It] should be out this

We have a couple of Vita titles in development.  They are slow going but we
aim to finish them this year.

Also we are doing a Unity title – sort of a cross between Battlezone and
Doom – really enjoying that one.  Also Rich has done a solo project, Super
Golf, which is a 2D golfing platformer through popular culture coming very
soon for PC, Mac, iOS and Android.

Written by Fred Rojas

May 10, 2013 at 11:00 am

Genre Study: Roguelikes and MetroidVania Games

with one comment

Lately many games are making a comeback by embracing former genres that had fallen by the wayside.  As a result, games press and developer media contacts like to coin phrases based on gameplay styles not many are familiar with.  When someone tells you that Tokyo Jungle is a “roguelike” or that Guacamelee is a “MetroidVania” title, it’s entirely possible you have no idea what that means.  After this article, you will no longer have that problem.


Original Rogue on IBM compatible

You may or may not know that the roots of the roguelike come from a 1980 computer game called Rogue, which established the dungeon crawler.  This game was considered genre-changing when compared to slower paced text adventures such as Zork and Dungeons & Dragons-inspired RPGs like Wizardry and Ultima.  Developers Michael Toy, Glenn Wichman, Ken Arnold, and Jon Lane cite a hybrid of both types, with influences from D&D as well as the text adventure Colossal Cave Adventure, which featured a description of a Kentucky cave system so precise it was reportedly used by a tourist to navigate parts of the actual caves it was based on.  The result was a game where an adventurer explored a multi-floored dungeon, collecting items and facing enemies, in search of a final artifact (in this case the “Amulet of Yendor”).  Each floor was more difficult than the last, you could not backtrack to a previous floor, and if you died you got a game over, simple as that.  Additionally the layout of the dungeon, items, and enemies were all randomly generated, which meant you would ideally never play the same game twice.  Despite having to start over, the experience of each playthrough taught you to handle enemies, utilize items, and prepare for future encounters, such that you could eventually beat the game.  Needless to say the game had a tough barrier to entry; it found its popularity mostly on Unix systems in colleges across the country, while the general public found it too complex and difficult.

Since then the concept started in Rogue was expanded to integrate classic RPG leveling systems, bosses, save states (for longer quests), and even a way to retrieve your items from the body of your previous adventurer.  These concepts were applied mostly to 16-bit era Super Famicom (SNES) titles in Japan known as the Mystery Dungeon series, notably Shiren the Wanderer.  Granted, Shiren would get sequels on the Wii that did come stateside, which might explain your familiarity with the series now, but if you get a chance look up the fan translation patches online and check out the originals.  Later on this concept would come to light in a stronger way with Diablo, although certain characteristics of that game – like the ability to revert back to a save, and the entire concept of a save mechanic – are inconsistent with a roguelike.

Mystery Dungeon II: Shiren the Wanderer on Super Famicom (SNES)

In modern terms, what basically defines a “roguelike” is a game with randomly generated levels/layouts, random items/enemies, and permadeath.  Permadeath means that when you die, all your levels, items, experience, gold, and even your save game are completely lost and you are forced to start over.  In some cases finding your body will grant your items back, but overall there need to be significant consequences for your actions.  The best-known examples are Binding of Isaac and FTL on Steam, Tokyo Jungle on PS3/PSN, Spelunky on 360/Steam, and of course Shiren the Wanderer and Pokemon Mystery Dungeon on Wii.  These are true roguelikes and there are plenty more, but I wanted to highlight the ones you have probably heard of.  Other titles skate the line and are (mis)labeled roguelikes, like Diablo III and Dark Souls: Diablo III utilizes save mechanics and no true permadeath despite having all other aspects of a roguelike, while Dark Souls suffers just the opposite, with a very clear system of permadeath but no randomization in game design.  So there you have it: when you hear someone refer to a game as a “roguelike” you will now know what they’re talking about – assuming, of course, that they aren’t mislabeling a title, which happens more often than not.
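The “randomly generated levels” pillar is simpler than it sounds: seed a random number generator, then carve a floor procedurally, so the same seed always reproduces the same dungeon while a fresh seed gives a fresh run.  As a minimal illustrative sketch (a drunkard’s-walk carver in C – the dimensions, step count, and tile characters are arbitrary choices for this example, not how Rogue itself did it):

```c
#include <stdint.h>
#include <stdlib.h>

#define FLOOR_W 16
#define FLOOR_H 16

/* Carve one dungeon floor from a seed: start as solid rock, then
   stagger a digger around, turning tiles into walkable floor.
   Same seed -> same floor; a new run gets a new seed, so no two
   games need ever play alike. */
void generate_floor(uint32_t seed, char tiles[FLOOR_H][FLOOR_W])
{
    srand(seed);
    for (int y = 0; y < FLOOR_H; y++)
        for (int x = 0; x < FLOOR_W; x++)
            tiles[y][x] = '#';            /* solid rock everywhere */

    int x = FLOOR_W / 2, y = FLOOR_H / 2;
    for (int step = 0; step < 400; step++) {
        tiles[y][x] = '.';                /* carve a walkable tile */
        switch (rand() % 4) {             /* wander, keeping the border */
        case 0: if (x > 1) x--; break;
        case 1: if (x < FLOOR_W - 2) x++; break;
        case 2: if (y > 1) y--; break;
        case 3: if (y < FLOOR_H - 2) y++; break;
        }
    }
}
```

Permadeath is then just policy on top of this: on death, the game deletes the save and calls the generator with a new seed, which is why the knowledge in your head is the only progress that survives a run.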

Fun Fact: Did you know that before first-person shooters was a genre, FPS titles were known as “Doom clones”?


This is a much simpler concept to grasp, as it doesn’t have rules as strict as the roguelike’s.  In truth the term “MetroidVania” is a bit of a misnomer because the genre began with Metroid, but because Konami copied the format with Castlevania: Symphony of the Night, and Metroid titles were few and far between at the time, the term “MetroidVania” stuck.  Much like the roguelike, modern games have built the concept into a hybrid of exploration and overcoming obstacles.  Additionally, most titles in this vein use 2D sprites or polygonal renders on a 2D plane, which is ideal for fans of the sub-genre.  Basically, the format of Metroid, Super Metroid, Symphony of the Night, and any Game Boy Advance iteration of either franchise is a MetroidVania.

The map of Super Metroid

The basic building block of a MetroidVania is a large map that is available from the very start of the game.  There are no discrete levels or setting changes (i.e. planets, countries, etc); everything takes place in one pre-defined area that can, in principle, be fully explored from the first moment the game starts.  From there, items, doors, and enemies are scattered throughout the area to keep the character limited in his or her actions until certain points – it’s a creative way to offer some semblance of linearity.  Obstacles such as locked doors, high ledges, long jumps, and physical hazards tell the player that they will need to come back to a location once they’ve obtained the appropriate item/ability/weapon.  For this reason the MetroidVania sub-genre is extremely focused on exploring every nook and cranny the map has to offer without forcing the player to do so.  In Symphony of the Night it’s possible to explore 200.6% of the map (but I won’t spoil how), and in almost every MetroidVania title in the Castlevania franchise you will only get the true ending after fully exploring the map and collecting everything.  This type of game has come back into style, but remains somewhat niche given its old school mentality and the demanding planning involved in development.  Probably the most popular recent MetroidVania titles are Shadow Complex on XBLA, Dust: An Elysian Tail on XBLA/Steam, and, just two weeks ago, Guacamelee on PS3/Vita/PSN.
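Under the hood, that gating scheme is often little more than set membership: the player accumulates ability flags, and each obstacle stores the set of abilities it demands.  A minimal sketch in C – the ability names here are invented for illustration, not taken from any particular game:

```c
#include <stdint.h>
#include <stdbool.h>

/* Ability flags the player collects over the course of the game
   (hypothetical names, one bit each). */
enum {
    ABILITY_DOUBLE_JUMP = 1 << 0,
    ABILITY_MORPH       = 1 << 1,
    ABILITY_HEAT_SUIT   = 1 << 2,
};

/* An obstacle is just a required set of abilities.  The whole map
   exists from the start of the game; only passage is gated. */
bool can_pass(uint32_t player_abilities, uint32_t required)
{
    return (player_abilities & required) == required;
}
```

Because the check is a pure function of the player’s current flags, every gate in the world “unlocks” simultaneously the instant the matching item is picked up – which is exactly what produces the genre’s signature moment of mentally replaying the map and backtracking to every ledge you couldn’t reach before.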

Fun Fact: Both the Metroid (Other M) and Castlevania (Lament of Innocence and Curse of Darkness) franchises migrated to a fully rendered 3D world, and fans almost unanimously hated the results.

So there you have it.  Now you no longer have to wonder what the hell we are talking about when we discuss Roguelikes or MetroidVania titles ever again, and knowing is half the battle.

Written by Fred Rojas

May 6, 2013 at 7:29 pm

For the Love of the Light Gun

leave a comment »

I can’t explain my love for the light gun.  It’s one of the oldest forms of interactive entertainment, dating back to the carnival days when you would fire air rifles at a metal bullseye to make an old man’s hat pop up or a dog bark.  Once the gun made the transition to video games, it honestly became one of the most lifelike and violent gaming tropes in history.  Not to get too deep, but you are pointing a gun at a target, usually alive, and shooting it.  There is no other gesture like it: you are aiming a realistic device to kill something, virtual or not.  At the same time it also doubles as the simplest form of proficiency.  I don’t think anyone will claim that being good at Duck Hunt or Lethal Enforcers makes you a good shot at a shooting range, but it has a much higher chance of relevance than being able to get a headshot in Call of Duty.  Whereas the FPS emulates the concept of aiming and firing a gun with 1:1 responses from a controller, a light gun truly simulates the experience.

Light gun games have always been a niche genre, but that hasn’t prevented them from withstanding the test of time, appearing on most home consoles and remaining among the most popular attractions in arcades even today.  I guess it’s because, despite the maturity implied by firing a gun, it’s one of the easiest concepts to pick up.  I’ve been on many adventures thanks to light gun games – whether cleaning up the future in T2: The Arcade Game, battling zombies in a haunted house in House of the Dead, or enjoying some of the worst acting of all time in Mad Dog McCree.

It’s also significant because the light gun genre is nearly impossible to emulate and doesn’t translate well to today’s technology.  While there are exceptions, you will have a hard time playing Crypt Killer properly on a PC running MAME, and most HDTVs don’t support the light guns of the past, which depend on the precise timing of a CRT’s scanning beam.  Authenticity is as important as the genre itself.  I’ve decided to dedicate this month to a timeless style of video game that I always make first priority when buying a new (or old) system: the light gun shooter.  Join me to learn about some of the best, worst, funniest, and definitely weirdest titles to ever grace the hobby.  Thanks to my huge CRT television and original hardware, I can even show you videos.
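For the curious, the classic trick these guns rely on is simple: when the trigger is pulled, the game blacks out the screen for a frame and paints a bright box over each target; the photodiode in the barrel then reports whether the pixel it is aimed at lit up.  A heavily simplified simulation of that final check in C (frame size, threshold, and names are illustrative – the real hardware also relies on the CRT beam’s timing, which is exactly the part an HDTV breaks):

```c
#include <stdint.h>
#include <stdbool.h>

#define FRAME_W 256
#define FRAME_H 240

/* After the blackout frame, only target boxes are bright.  The gun's
   photodiode effectively samples one spot of the picture: if that
   spot is bright, the shot landed on a target. */
bool lightgun_hit(uint8_t luma[FRAME_H][FRAME_W], int aim_x, int aim_y)
{
    if (aim_x < 0 || aim_x >= FRAME_W || aim_y < 0 || aim_y >= FRAME_H)
        return false;                    /* pointed off screen */
    return luma[aim_y][aim_x] > 200;     /* bright = target flash seen */
}
```

Real consoles went further and timed when the diode fired against the scanline being drawn to tell multiple targets apart – which is why an LCD’s buffering and processing delay, not just its panel, makes these guns fall silent on modern sets.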

Written by Fred Rojas

April 1, 2013 at 8:55 pm

Buying Guide: 3DO

leave a comment »

We all love our retro consoles, but in many cases we buy them because they have finally become cheap enough, or because we now have the money to purchase what we never could in our youth.  Unfortunately the business of selling used retro items to the masses can at times be a money-grubbing market where consumers are deceived by people they will never meet in real life.  As someone who has spent the last decade scouring the local area, conventions, eBay, and the internet as a whole, I have learned many valuable lessons.  For that reason I present my buying guide series, a handy quick reference for knowing what to purchase and what will cost an arm and a leg to replace.

Historically the 3DO – most commonly associated with Panasonic’s license because it had the largest manufacturing numbers and advertising campaign – is the most expensive video game console of all time.  Trip Hawkins, founder of Electronic Arts (EA), formed the 3DO Company for software development and drafted a hardware spec that could be licensed to companies for manufacturing, much as companies have done with VCRs and DVD players.  Unfortunately, the profit for manufacturers had to come from the sale of the hardware itself – other consoles were sold at a loss, with software sales closing the gap – so the 3DO sold for the staggering price of $700.  As a result, few consoles were actually sold, and three companies (Panasonic, Sanyo, and Goldstar) had already manufactured units that weren’t selling.  That balance of supply and demand makes the 3DO a much more reasonable $100-$150 on the used console market these days, but few know what actually came in the box.  Here’s what you need to get it working:

  • AC cord: Since it was manufactured by multiple companies and doesn’t require an AC adaptor, a simple AC cord with a two-pronged circle end (it looks like a figure 8) can be used.  Replacement cables can be found at Walgreens or RadioShack for roughly $3-$5.
  • A/V composite cables: These cables are just your standard yellow/white/red composite cables that plug directly into the ports on the back of the console.  Again, due to the multiple manufacturers there is no console specific A/V cable.  Replacements can be found everywhere for $2-$10 and I grabbed mine from unused composite cables in DVD players and other HD compatible devices.
  • Controller: Even though the plug port looks like an Atari 2600 or Genesis port, you can only plug actual 3DO controllers into it.  Almost all of the controllers look like a 3-button Genesis controller and each controller has a controller port on it for daisy chaining additional controllers (the 3DO only has one controller port).  Controllers are pretty easy to find still and you can pick one up for about $10-$20 online.  There is a 6-button controller out there, which like so many other consoles was only released because of a Street Fighter II port and isn’t worth the high price due to rarity.

Optionally, the only accessory you may want is the light gun peripheral, the “gamegun”, for a handful of FMV shooters (Mad Dog McCree, Who Shot Johnny Rock?), and a mouse for certain PC ports like Myst.  Be warned: rarity makes these peripherals an expensive endeavor that may cost close to, or more than, the console itself.  Most of the 3DO’s poor light gun games can also be found on the Sega CD, where they suffer only slight quality loss and are much less expensive.  The one thing you may want to pay attention to is the console type you buy.  Like other CD systems, there are both top loading and slide tray versions of the 3DO, and you may want to consider dropping the extra scratch on a top loader for reliability.  Here are the different manufactured types:

  • Panasonic made two models in America, the FZ-1 and the FZ-10, and the FZ-10 is definitely the recommended model with a lighter and slimmer design and a top loading CD tray.  There was also a model, the ROBO, which was only sold in Japan and had a slide loading 5-disc CD tray.
  • Goldstar only had one model in America, the GDO-101M, and it looked almost identical to the FZ-1 and featured a slide loading tray just like that model.  Although reliable, the Goldstar is nearly impossible to find parts for so a broken belt on the tray means a required replacement console.
  • Sanyo only made one model, it was only sold in Japan, and it has a similar design to the FZ-1 and the GDO-101M.  It is probably the most rare of the consoles.

Since the 3DO is region free and will play any 3DO disc, you can really pick up any version you want, but the price increase can get out of control.  I don’t trust drive trays of the 90s, so I’ve only ever owned an FZ-10 and I’ve never had one stop functioning.  A complete console should run you $70-$100 online, and hopefully even less if you can find a used console in a store or at a convention.  [Video coming soon – ed.]

Written by Fred Rojas

December 24, 2012 at 12:39 pm

