Archive for the ‘Lessons’ Category
As I was looking into doing a history of this fantastic studio, I came upon a reference so good there’s no point in me writing one. While it’s easy to rag on big media conglomerates, IGN’s Mitch Dyer did a fantastic story on the origins of Ubisoft Montreal that includes the origin of Splinter Cell, the reinvention of Prince of Persia, and the visual treat that is Far Cry. It’s a fascinating piece that documents the major franchises you can thank that studio for, and a must-read for gaming history buffs like us. Head on over and check out House of Dreams: The Ubisoft Montreal Story when you can.
Not So Humble Beginnings
Before personal computers were called “PCs” (or Macs, for all you Apple people), they were better known as “microcomputers”. The name derives from the relatively small size and price of a computer built around a microprocessor CPU and a basic shared input/output structure. Much like PCs today, this allowed software and game programmers to design a title around one basic data flow and configuration and then optimize each specific microcomputer release for that machine’s specifications. American consumers, used even today to much lower prices than other countries, were slow to embrace the cost and concept of a microcomputer. That is, until the Commodore 64. At the time of its release the only major competitors in the US were the Apple II and the Atari 800, boasting hefty price tags of $1,200 and $900 respectively. With most game consoles priced around $200 at the time, and some, like the ColecoVision, offering computer add-ons for $400, the price of a microcomputer restricted it to higher-income households (and that doesn’t even include the cost of a monitor and a desk to put it all on). Commodore had a different plan: thanks to vertical manufacturing and two strong chips to handle graphics and audio, the company set about making a microcomputer that could compete with the Apple II at less than half the price.
In 1981 Commodore was celebrating the wide success of its home computer the VIC-20, a Personal Electronic Transactor (PET) computer integrated into a keyboard (a design that, in hindsight, faintly anticipated the laptop), whereas previous models had been much larger and more expensive integrated monitor/computer/keyboard units. While the VIC-20 worked great for at-home accountants and small businesses because it could hook up to a television, Commodore had shown more interest in the gaming software on the side that had proven quite popular. Unfortunately the VIC-20 had only 5 KB of RAM, small even by BASIC (the programming language) standards, but it still allowed for many educational programs and text adventures. Instead of bridging the gap between the two worlds, the VIC-20 graphics processor by Robert Russell and the SID sound chip by Robert Yannes – both of whom preferred “Bob” and whom I like to refer to as “the Bobs” – were to be combined in a gaming console codenamed the Commodore Max or UltiMax, which was eventually scrapped. Instead the Bobs went to CEO Jack Tramiel and proposed an all-in-one successor to the VIC-20, codenamed the VIC-40, for the consumer market at an inexpensive price. Tramiel responded with a tall order: 64 KB of RAM (the RAM expansion cart was the VIC-20’s most popular and requested add-on), low cost (exact numbers unknown, but rumors put the target at a $600–$700 price tag), and a prototype ready in three months (it was proposed in November 1981) for the Winter Consumer Electronics Show (CES). At the show even Atari employees were blown away that Commodore could produce such an impressive machine – beating the specs of the Atari 800 and approaching those of the Apple II – for under $600. As complicated as it sounded, the answer was simple vertical integration: Commodore owned the semiconductor plant that supplied its parts, bringing the cost to build a C64 down to a mere $135.
After an arduous holiday season with no days off to make CES, a rename to the C64 to match the naming conventions of Commodore’s other business computers, and certification for mass production, the machine proudly entered the market in late summer 1982.
Like so many other success stories in history, timing played the most critical role in securing the C64’s place. Not only was it on par with the best competing computers of the time (the updated Apple IIe only barely beat its specs – a decision that seemed almost deliberate – but remained twice the price), it had the versatility of being compatible with televisions and any S-Video-capable monitor. Thanks to this output decision, the C64 was also priced and positioned to take on the game consoles of the time, which were rapidly gaining ground, and it appeared to be the all-in-one solution for anyone seeking both a computer and a console. Commodore even had the foresight to add an expansion slot, allowing it to support not only C64 carts but also tape and later disk drives. You could even use Atari VCS/2600 controllers thanks to the identical port (the 2600 was built solely from off-the-shelf parts to lower cost, a choice that also meant Atari had no exclusive rights to the hardware). The result in America was huge: millions of units moved, and at its peak manufacturing exceeded 400,000 units a month. Europe was a bit tougher, with the ZX Spectrum boasting a price tag less than half that of the C64, but import costs brought the two much closer in price outside the Spectrum’s European home turf. Even then, the worldwide success of the C64 and its software saw it overtake the Spectrum in sales later in the 80s.
Price and hardware aside, the BASIC programming language and the simple requirement that all games be 64 KB or less (the size of the RAM they would be loaded into, which lost its contents whenever power was cut) made software extremely popular both to produce and to purchase. Gaming magazines would include instructions for programming games into your C64 one line at a time – and anyone who has done this knows it’s two hours of typing for ten minutes of demo – but it stands as one of the earliest demo/shareware distribution models. Later the C64 even gained alternative operating system (OS) options, a mouse for a graphical user interface (GUI), and even a modem for limited online use.
For those of us who grew up quite young in those days, myself definitely included, the C64 was first and foremost a gaming platform. It offered experiences that seemed deeper than the average Atari title and, with a functioning keyboard, could be more story-driven than an NES title. It really was the convergence of two worlds, and not until the late 90s – and again in contemporary times – would we see this kind of hybrid: a computer offering the full package of personal, business, educational, and gaming software. That’s why, although it is hardly the most impressive microcomputer in history, it definitely stands as one of the most popular.
Rise of the Triad is more significant than it initially seems in the annals of first-person shooter (or “Doom clone”) history. In fact, had it kept its original title, Rise of the Triad: Wolfenstein 3D Part II, it would probably enjoy more awareness and sit in the pantheon of id titles still garnering praise on Steam and Good Old Games. Due to several disputes – arguably the direct result of John Carmack, co-founder of developer id Software and technical lead on the milestone shooters Wolfenstein 3D, Doom, and Quake – the project was terminated in 1993 to avoid clashing with the upcoming Doom. This led to a falling out between id Software and Apogee Software, the planned publisher of Doom and publisher of several of id’s previous titles.
In the beginning there were two companies: developer id Software and publisher Apogee Software. Apogee is better known today by its later developer name, 3D Realms, the team responsible for Duke Nukem 3D and, originally, Prey. Before all that happened, Apogee made its money publishing id Software’s earliest successes, including Commander Keen and Wolfenstein 3D. Apogee marketed games as “shareware”, a method of giving people roughly a quarter to a third of a game to try out, with the option to purchase the full game if interested. John Romero, then lead designer on Doom at id Software, canceled Rise of the Triad, and John Carmack decided to have id self-publish, so Apogee ended up not publishing Doom. id co-founder Tom Hall (Carmack and Romero were the other founders) left id to join Apogee. Apparently Hall had concerns over the amount of violence and gore in Doom, a project he had assisted greatly in creating. Ironically, a year later, when he completed work as lead designer on Rise of the Triad for Apogee, it would have even more blood and gore than Doom, including a random occurrence where an enemy would explode into gory giblets while “Ludicrous Gibs!” flashed on the screen.
After the split, id Software would celebrate success with Doom and its next franchise, Quake, as a combined developer and publisher. id continued to use the shareware marketing strategy begun by Apogee and even coined the term “gibs” in Quake – literally, giblets of human gore and flesh. While the concept of gibs in games started in either Wolfenstein 3D or Doom, both created by id, I don’t recall seeing the word “gibs” until Rise of the Triad, and it was definitely popularized by Quake. Apogee released Rise of the Triad on its own as both publisher and developer, the project led by Tom Hall and a team he dubbed the “Developers of Incredible Power”. Aside from some preliminary work in the early-to-mid 90s on Prey – which would eventually be re-developed by Human Head Studios and published by 3D Realms (Apogee) 12 years later – the team’s only title was Rise of the Triad. After Apogee renamed itself 3D Realms in 1994 (though it kept Apogee as its traded company name), Hall would assist on the Duke Nukem series, including the very popular Duke Nukem 3D, before leaving to work with John Romero at Ion Storm, where he led Anachronox.
Rise of the Triad is significant not only for being a game where the point is to navigate a predominantly linear level killing everything in your path (Call of Duty says “hi”), but also as one of the first titles to amount to a total conversion. It started as an expansion pack and ended up a heavily modified Wolfenstein 3D engine, though little hints like the Nazi-esque uniforms of enemies give away what it originally started life as. The engine was a technical marvel, with features like panoramic skies, simulated dynamic lighting, fog, bullet holes, breakable glass walls, and even multi-level environments – although it faked that last one well; like Doom, the world was a flat plane in the eyes of the engine. Rise of the Triad was going to be even more dynamic, with pre-loaded enemy packs that would generate randomly, plus extra levels and challenge runs, but all were scrapped due to time constraints and technical limitations. It is also one of the few games of its era whose environmental hazards were so drastic that they were usually one-hit kills.
Although mostly forgotten, Rise of the Triad is significant for helping move the first-person shooter toward the complex genre it is today. As a transitional title, it has a hard time holding up against the more beloved and popular shooters of its time (as our review clearly demonstrates). Still, it was in the nucleus of shooter innovation, and many of the craziest and best features of the contemporary FPS started almost 20 years ago with the only shooter ever to come out of Apogee Software and the Developers of Incredible Power.
It’s hard to believe, but the cartridge began to be phased out of gaming in 1995, when the new wave of consoles and the subsequent move to disc-based media began. I’m sure plenty will be quick to point out that the N64 was a cartridge-based console, but I truly believe that decision was the result of Nintendo not wanting to give up control over manufacturing, plus its sordid history of trying to make a machine that read discs. That change happened 18 years ago, which means a significant number of gamers now in their early-to-mid 20s have never played games on a cart. This is truly a shame, because cartridges are far more versatile than most people realize; the crutch will always be that carts offer little storage at massive prices. In today’s lesson we will discuss what makes up a cartridge, its benefits and setbacks, and how the cartridge was used to literally upgrade consoles for more than two decades.
Anatomy of a Cartridge
For the most part a cartridge is a plastic-encased circuit board containing “read-only memory” (ROM) chips, with exposed pin connectors (male) that fit into a slot on the reading device (female). Since it’s a complete circuit board, multiple ROM chips can be used to store data – more on that in a second – allowing an expanse of memory limited only by the printed circuit board (PCB) and the device reading it. One of the most popular uses for ROM carts was software for computers, a market that shifted almost entirely to gaming once game consoles arrived. It was the dominant format for many microcomputers (mostly outside the United States) and for almost all game consoles across two decades (mid 1970s to mid 1990s). Many believe cartridges were used because no other decent format was available, but this is simply untrue. By the time carts were in use, programs and games could be loaded via floppy disk (5.25″), compact cassette tape (not unlike audio cassettes), and removable circuit boards (think arcade machines). The decision to use cartridges came down to the fact that the memory was permanently stored in the chip and memory-mapped into the system’s address space (i.e., it was effectively integrated into the machine). As a result, data access was nearly instant (no load times), the cart could be used for hardware expansion in addition to delivering code, and the streamlined process made the code very difficult to extract, making it the safest option for mass distribution of software. Unfortunately ROM chips were also quite expensive, so storage space was expensive too, and the code could never be altered or accessed at runtime, which made saving and loading difficult.
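Memory mapping is the key idea here: the cart’s ROM chip is simply exposed at a fixed range of the system’s address space, so the CPU reads it directly with no copying step. Here is a toy sketch of that arrangement; the addresses, sizes, and class names are invented for illustration and don’t model any real console’s memory map.

```python
# Toy sketch of memory-mapped cartridge ROM. Addresses and sizes are
# illustrative, not tied to any real console.
class AddressSpace:
    def __init__(self, ram_size=0x8000):
        self.ram = bytearray(ram_size)   # read/write system RAM
        self.rom = None                  # cartridge ROM, mapped when inserted
        self.rom_base = 0x8000           # where cart ROM appears to the CPU

    def insert_cart(self, rom_bytes):
        # "Mapping" is just exposing the chip at a fixed address range --
        # no copying, which is why access is instant with no load times.
        self.rom = bytes(rom_bytes)

    def read(self, addr):
        if self.rom is not None and addr >= self.rom_base:
            return self.rom[addr - self.rom_base]  # read straight off the chip
        return self.ram[addr]

    def write(self, addr, value):
        if addr >= self.rom_base:
            raise ValueError("ROM is read-only")   # code can never be altered
        self.ram[addr] = value

bus = AddressSpace()
bus.insert_cart([0xA9, 0x01, 0x8D])  # a few bytes of pretend program code
assert bus.read(0x8000) == 0xA9      # first ROM byte, available instantly
```

The read-only check in `write` is the flip side the paragraph mentions: the same property that made carts fast and piracy-resistant also made in-place saving impossible without extra chips.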
The first console to use cartridges was the Fairchild Channel F, a game console that predates the Atari VCS/2600 by a year and featured many of the same aspects as that pop culture sensation. It is not as widely known because its library consisted mostly of archaic sports titles and educational material. When Atari introduced the VCS in 1977, arcade ports and diverse, addictive third-party titles like Pitfall! cemented the format. Because the cartridge is effectively an integrated part of the machine, Nintendo made heavy use of it to keep both the Famicom and the NES capable of keeping up with changing times for quite a while. Not only that, but some carts – especially in Japan, where custom chips were allowed – could act as a game, a hardware upgrade, a save device, and an anti-piracy measure all at once. This practice was standard for most cart-based consoles until the 32-bit era, when expansions moved to dedicated hardware expansion ports; even the N64, which could have leaned on carts, used a port to expand the on-board RAM.
ROM Types and Special Chips
The oldest ROM type is mask ROM, which stores information permanently in the integrated circuit when the chip is manufactured; the term “mask” refers to the masking pattern applied to the integrated circuit during fabrication. This is definitely what went into Nintendo carts, which Nintendo manufactured itself as a supplemental business on top of license fees and its cut of each unit sold on the NES. It’s the cheapest way to get the most memory; however, unless you have a mass production line, masking integrated circuits is a costly endeavor, and without vast quality controls like Nintendo’s, one poorly made program or coding error could ruin an entire production run. You can understand why Nintendo was so strict back in those days, especially because masked integrated circuits cannot, by their very nature, be re-written or reused. The upside is that once successfully produced, the chip has little chance of suffering decay, failure, bit rot, or the various other issues that plague other ROM types, which is why most classic carts last nearly forever (note that battery-backed save memory is a different story). I know this was the most common type in all Atari consoles, the NES, the TurboGrafx-16, and the Sega Master System. Beyond that, it is entirely possible that the SNES, Genesis, 32X, and portable consoles used other formats like Erasable Programmable ROMs (EPROMs), which can be reprogrammed with ultraviolet light, or Electrically Erasable PROMs (EEPROMs), which can be erased and re-written electronically. There are also generic PROMs that can be written with a ROM burner, removing the need for mask production, but they are still one-time use and were more common in arcade and pinball repair, which may mean they can be found in Neo Geo carts.
As for the Jaguar and N64, I’m guessing EEPROMs, but there’s still a strong possibility that these companies, mass-producing carts since the 80s, stuck with traditional mask ROM, especially given the falling price of PROM and the relatively high emulation/piracy of the late 90s. It has been said that PROM, EPROM, and EEPROM may have a higher chance of failure, but most carts don’t seem to have a problem no matter what was used, and plenty of repaired arcade boards have had no problem whatsoever (especially because they can be wiped and reprogrammed). ROM chips typically varied in size from 2 kbit (not to be mistaken for the larger 2 kbyte), which held roughly 256 bytes, all the way up to the expensive 32 megabit chip, which held 4 megabytes. This is why you saw statements on Street Fighter II boxes like “32 MBit Chip!” – it was touting massive storage space. Some games are rumored to have had even larger ROM chips that stored compressed data, justifying hefty price tags like Final Fantasy III launching at $80 in the US, or Street Fighter Alpha 2 having load times on the SNES while data decompressed. It was all par for the course when trying to get as much data on a cart as possible. I do believe that as more RAM was integrated into consoles, as we saw on the N64, compression and temporary storage allowed more data to be stored for that console’s larger 3D games.
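The bit-versus-byte marketing gets confusing fast, so here is the arithmetic behind those figures spelled out as two tiny helpers (the function names are mine, just for illustration):

```python
# Chip capacities were advertised in bits; files and saves are measured
# in bytes. 8 bits = 1 byte, and 1 kbit = 1024 bits.
def kbit_to_bytes(kbits):
    return kbits * 1024 // 8

def mbit_to_megabytes(mbits):
    return mbits / 8

assert kbit_to_bytes(2) == 256        # a 2 kbit chip holds roughly 256 bytes
assert mbit_to_megabytes(32) == 4.0   # "32 MBit Chip!" = 4 MB of data
```

This is why the boast on the Street Fighter II box sounds bigger than it is: divide any advertised bit count by eight to get the actual storage.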
In addition, extra chips could be placed in the carts to enable all kinds of extra functionality, which effectively meant the carts acted as hardware upgrades. This makes sense when you think about the massive leaps between launch games and later releases. 2600 carts were pretty straightforward, but NES carts had a few extras, like the anti-piracy NES10 chip that had to be on the PCB for the NES to play; if the console doesn’t detect it, due to a loose connection for example, you get the famous blinking-console effect, which is the NES10 chip in the console rejecting the cart. Saving became a big feature as well, with almost all cart-based saves stored in Static Random Access Memory (SRAM), which keeps save data after power is cut (uncommon for RAM) provided a small current still passes through it. This is why a lithium CR2032 button cell battery is used in most cases, and once that battery dies (typically around 20 years, though it can go much longer) the SRAM no longer holds data. To fix this, simply de-solder the dead battery from the SRAM leads and solder in a fresh one. In Sonic 3, as well as a few other games, Sega decided to use much more expensive non-volatile RAM (NVRAM), an early relative of today’s flash memory that retains information after power is cut, which is why Sonic 3 carts should retain save data indefinitely.
As for expanding the functionality of a console, special chips could literally upgrade it to do things it was never intended to do. In Japan the Famicom was allowed to have special chips in its carts, so companies could go crazy with internal development – having had no video game crash, Japan never saw Nintendo force development studios to manufacture carts through it as it did in the US. This explains the VRC6 chip in Akumajo Densetsu (Castlevania III), which added extra sound channels through the Famicom cart’s expansion audio pins. In America, Nintendo began releasing special Memory Management Controller (MMC) chips that allowed some of the Japanese innovation to happen on the NES, albeit in stripped form due to that console’s different hardware profile. Here are some of the popular chips:
- UNROM: Split the program data into 16 KB banks instead of one fixed 32 KB chip, swapping banks in and out and copying graphics data into a RAM chip for faster loading and effects. Seen early in impressive titles like Ikari Warriors and Mega Man, it assisted in the side-scrolling of dynamic characters and certain effects.
- MMC1: Allowed for save games. In addition to handling 16 or 32 KB ROM banks, 4 or 8 KB of SRAM could be integrated and powered by a button cell lithium battery. This was essential to getting Famicom Disk System titles with save data to run on NES carts, such as The Legend of Zelda, Metroid, and Dragon Warrior. Although the NES Metroid didn’t support saved checkpoints like the FDS version did, lengthy passwords stood in for stored save data.
- MMC2: Essentially split the program chip and added two sets of 4 KB banks of pre-loaded graphical data. It allowed more graphics to display on screen at once, since the additional banks were merely referenced rather than carried in the main code. Used in only one game, Mike Tyson’s Punch-Out!!, to load its massive sprite opponents.
- MMC3: Massively expanded memory allocation and integrated a scanline IRQ counter, letting certain assets scroll while others stayed fixed. This was necessary for keeping items like scores and life bars pinned to the screen while a second “window” handled more dynamic gameplay. Nintendo’s most popular chip, it was key to many large titles such as Mega Man 3 and Super Mario Bros. 3.
- MMC4: Utilized only in Japan for the Fire Emblem titles to create effects similar to the MMC2.
- MMC5: The biggest, most versatile, and most expensive chip Nintendo offered; developers avoided it like the plague due to the high cost and reduced profit margin. It offered several different memory allocations, horizontal and vertical IRQ counters for very dynamic effects, extra sound channels, and 1 KB of additional RAM. Since Koei was almost its sole user and none of Koei’s MMC5 titles came out in America, the only American title to really use it was Castlevania III, which created a similar, but still inferior, version of the Famicom original. The MMC5 was so complex that almost all clone consoles do not support it, and emulators took a long time to decipher it; for that reason Castlevania III was long one of the few games that required original hardware. Emulation is no longer a problem, but clone systems that run actual carts still do not support the title.
- MMC6: This final mapper chip extended SRAM over the MMC3 by 1 KB, allowing for the larger save files of StarTropics and its sequel.
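The common thread in these mappers is bank switching: the CPU can only see a small window of address space at once, so the mapper swaps which slice of a much larger ROM is visible through that window. Here is a loose sketch of the idea; the sizes and register behavior are simplified illustrations, not a faithful model of any particular MMC chip.

```python
# Loose sketch of mapper-style bank switching. Sizes and the register
# interface are simplified, not a faithful model of a real MMC chip.
BANK_SIZE = 16 * 1024  # 16 KB program banks

class BankSwitchedROM:
    def __init__(self, rom_bytes):
        self.rom = bytes(rom_bytes)
        self.num_banks = len(self.rom) // BANK_SIZE
        self.selected = 0  # which bank is visible in the switchable window

    def select_bank(self, n):
        # Writing to a mapper register swaps which 16 KB slice of a much
        # larger ROM the CPU sees -- how carts outgrew the address space.
        self.selected = n % self.num_banks

    def read(self, offset):
        return self.rom[self.selected * BANK_SIZE + offset]

# Four 16 KB banks, each filled with its own bank number for demonstration.
rom = bytes(b for n in range(4) for b in [n] * BANK_SIZE)
prg = BankSwitchedROM(rom)
assert prg.read(0) == 0   # bank 0 visible by default
prg.select_bank(2)
assert prg.read(0) == 2   # same address, different bank behind it
```

The scanline counters, SRAM, and sound channels the list describes are extras layered on top of this same trick of intercepting reads and writes on the cart.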
There were more custom chips that eventually showed their face in America, but these were the most common. Nintendo would loosen its policy and produce several custom chips for the SNES as well, allowing for all kinds of impressive hardware tricks. Some of those are as follows:
- DSP: The digital signal processor chip performed various 2D and 3D calculations across several iterations – asset conversion for older graphics techniques, rotation effects, raster effects, and math that could all run independently on the cart instead of taxing the SNES. Popular games using its rotation capabilities include Super Mario Kart and Pilotwings.
- Super FX: A supplemental CPU for graphics calculations the SNES simply could not do. Think of it as an external graphics card for the 16-bit console, implemented as a separate processor integrated into the cart. With simpler duties than the SNES’s own CPU, the Super FX’s iterations ran at 10.5 MHz and eventually 21 MHz, blowing past the roughly 3.5 MHz SNES processor and enabling the 3D rendering of titles like Star Fox. Later revisions supported larger ROM sizes (for a long time the game program had to be under 8 Mbit, or 1 MB, of data).
- Cx4: This chip was Capcom’s way of showing off rotational, polygonal, and wire-frame effects in the Mega Man X series. While the first title used a traditional chip, Mega Man X2 and X3 had special test screens and crazy title screens that to this day cannot work on clone consoles, flash carts, or even later hardware re-releases (like the Mega Man X Collection on PS2/GameCube). The emulation community, of course, has successfully gotten it working in software.
- SA1: The Super Accelerator chip worked alongside the SNES as a co-processor, offering a faster 10 MHz clock over the original hardware’s 3.5 MHz, faster RAM functionality, MMC-style capabilities (see the NES section above), data storage and compression options, and new region and piracy lockout protection. This chip was essential to impressive titles like Super Mario RPG and Kirby’s Dream Land 3, which currently cannot be replicated on flash carts. Software emulation, both in the open-source PC scene and in official Nintendo Virtual Console releases, does support the chip.
There were several other chips built for specific functions, including the Genesis’s Sega Virtua Processor (SVP) in Virtua Racing, which made arguably the most technically impressive 16-bit game ever created. Unfortunately the cart also cost $100 at launch, wasn’t impressive from a gameplay standpoint, and doesn’t work with any clone or flash cart out there. Emulation is possible, but with varied results.
Well, there you have it: a brief breakdown of the technical marvel that was the cartridge and the hardware benefits it provided. It’s almost difficult to imagine a world without load times, where data access is instantaneous (something even flash carts can’t currently match). While it wouldn’t be possible with the massive memory requirements and equally massive manufacturing costs of today, there was a time when a few bits in a plastic case meant the world to each and every gamer.
Lately, many games embracing genres that had fallen by the wayside are making a comeback. As a result, the games press and developer media contacts like to toss around phrases based on gameplay styles not many are familiar with. When someone tells you that Tokyo Jungle is a “roguelike” or that Guacamelee! is a “MetroidVania” title, it’s entirely possible you have no idea what that means. After this article, you will no longer have that problem.
You may or may not know that the roots of the roguelike trace back to a 1980 computer game called Rogue, which established the dungeon crawler. The game was considered genre-changing compared to slower-paced text adventures such as Zork and Dungeons & Dragons-inspired RPGs like Wizardry and Ultima. Developers Michael Toy, Glenn Wichman, Ken Arnold, and Jon Lane cite a hybrid of both types, with influences from D&D as well as the text adventure Colossal Cave Adventure – which featured a description of a Kentucky cave system so precise it was reportedly used by a tourist to navigate parts of the actual caves. The result was a game where an adventurer explored a multi-floored dungeon, collecting items and facing enemies, in search of a final artifact (in this case the “Amulet of Yendor”) to complete the game. Each floor was more difficult than the last, you could not backtrack to a previous floor, and if you died you got a game over, simple as that. Additionally, the dungeon layout, items, and enemies were all randomly generated, meaning you would ideally never play the same game twice. Despite having to start over, the experience of each run taught you to handle enemies, use items, and prepare for future encounters, so that you could eventually beat the game. Needless to say, the game had a tough barrier to entry; it became popular mostly on Unix systems at colleges across the country, while the general public found it too complex and difficult.
Since then, the concepts started in Rogue have been expanded to integrate classic RPG leveling systems, bosses, save states (for longer quests), and even ways to retrieve items from the body of your previous adventurer. These concepts were applied most visibly in Japan to the 16-bit-era Super Famicom (SNES) titles known as the Mystery Dungeon series, notably Shiren the Wanderer. Granted, the series – Shiren included – would get sequels on the Wii that did come stateside, which might explain your familiarity with it now, but if you get a chance, look up the fan translation patches online and check out the originals. Later the concept would surface in a bigger way with Diablo, although certain characteristics of that game – like the ability to revert to a save, and the entire concept of a save mechanic – are inconsistent with a roguelike.
In modern usage, what basically defines a “roguelike” is a game with randomly generated levels/layouts, random items/enemies, and permadeath. Permadeath means that when you die, all your levels, items, experience, gold, and even your save are completely lost, and you are forced to start over. In some cases finding your body will return some items, but overall there must be significant consequences for your actions. The best examples are The Binding of Isaac and FTL on Steam, Tokyo Jungle on PS3/PSN, Spelunky on 360/Steam, and of course Shiren the Wanderer or Pokemon Mystery Dungeon on Wii. These are true roguelikes, and there are plenty more, but I wanted to highlight the ones you’ve probably heard of. Other titles skate the line and get (mis)labeled roguelikes, like Diablo III and Dark Souls: Diablo III uses save mechanics and has no true permadeath despite all the other roguelike trappings, and Dark Souls suffers the opposite, with a very clear system of permadeath but no randomization in its game design. So there you have it – when you hear someone call a game a “roguelike”, you’ll know what they’re talking about, assuming of course they aren’t mislabeling a title, which happens more often than not.
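The two defining traits – random generation and permadeath – are simple enough to sketch in a few lines of code. This is a minimal illustration, not any real game’s logic; the tile names, sizes, and the “Amulet of Yendor” nod are all invented or borrowed from the Rogue description above.

```python
import random

# Minimal sketch of the two defining roguelike traits: random generation
# and permadeath. Names and numbers are invented for illustration.
def generate_floor(rng, width=8, height=8):
    # A fresh random layout every run -- you never play the same game twice.
    return [[rng.choice(["floor", "wall", "item", "enemy"])
             for _ in range(width)] for _ in range(height)]

class Run:
    def __init__(self, seed=None):
        self.rng = random.Random(seed)
        self.floor_number = 1
        self.floor = generate_floor(self.rng)
        self.inventory = []
        self.alive = True

    def descend(self):
        # No backtracking: the old floor is simply discarded.
        self.floor_number += 1
        self.floor = generate_floor(self.rng)

    def die(self):
        # Permadeath: levels, items, and progress are gone for good.
        self.alive = False
        self.inventory.clear()
        self.floor_number = 1

run = Run(seed=42)
run.inventory.append("Amulet of Yendor")
run.descend()
run.die()
assert not run.alive and run.inventory == [] and run.floor_number == 1
```

Notice there is no save or load anywhere in the sketch – that absence is exactly what separates a true roguelike from a Diablo III.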
Fun Fact: Did you know that before “first-person shooter” was a genre name, FPS titles were known as “Doom clones”?
This is a much simpler concept to grasp, as it doesn’t have quite the rules that roguelikes do. In truth the term “MetroidVania” is a bit of a misnomer, because the genre began with Metroid; but because Konami copied the format with Castlevania: Symphony of the Night at a time when Metroid titles were few and far between, the name “MetroidVania” stuck. Much like the roguelike, the style is a hybrid: a game where you explore as much as you overcome obstacles. Most titles in this vein also use 2D sprites, or polygonal renders on a 2D plane, which is ideal for fans of the sub-genre. So basically, the format of Metroid, Super Metroid, Symphony of the Night, and any Game Boy Advance iteration of either franchise is what makes a MetroidVania.
The basic building block of a MetroidVania is a large map that is completely available from the very start of the game. There are no levels or setting changes (ie: planets, countries, etc); everything takes place in one pre-defined area that can be fully explored from the first moment the game starts. From there, items, doors, and enemies are scattered throughout the area to keep the character limited in his or her actions until certain points – it’s a creative way to offer some semblance of linearity to the game. Additionally, obstacles such as the aforementioned locked doors, high ledges, long jumps, and physical hazards tell the player that they will need to come back to a location once they’ve obtained the appropriate item/ability/weapon. For this reason the MetroidVania sub-genre is extremely focused on exploration and finding absolutely every nook and cranny the map has to offer, without forcing the player to do so. In Symphony of the Night it’s possible to explore 200.6% of the map (but I won’t spoil how), and in almost every MetroidVania title in the Castlevania franchise you will only get the true ending after completely exploring the map and collecting everything. This type of game has come back into style, but it still remains somewhat niche given the old school mentality behind it and the frustrating planning involved in development. Probably the most popular recent MetroidVania titles are Shadow Complex on XBLA, Dust: An Elysian Tail on XBLA/Steam, and, just two weeks ago, Guacamelee on PS3/Vita/PSN.
Fun Fact: Both the Metroid (Other M) and Castlevania (Lament of Innocence and Curse of Darkness) franchises migrated to fully rendered 3D worlds, and fans almost unanimously hated it.
So there you have it. Now you no longer have to wonder what the hell we are talking about when we discuss roguelikes or MetroidVania titles, and knowing is half the battle.
I can’t explain my love for the light gun. It’s one of the oldest forms of interactive entertainment, dating back to the carnival days when you would fire air rifles at a metal bullseye to make an old man’s hat pop up or a dog bark. Once the gun made the transition to video games, it honestly became one of the most lifelike and violent gaming tropes in history. Not to get too deep with it, but you are pointing a gun at a target, usually a living one, and shooting it. There is no other gesture like it: you are aiming a modern weapon to kill something, virtual or not. At the same time it also doubles as the simplest test of proficiency. I don’t think anyone will claim that being good at Duck Hunt or Lethal Enforcers makes you a good shot at a shooting range, but it has a much better chance of carrying over than being able to land a headshot in Call of Duty. Whereas the FPS emulates the concept of aiming and firing a gun with 1:1 responses from a controller, a light gun truly simulates the experience.
Light gun games have always been a niche genre, but that hasn’t prevented them from withstanding the test of time, appearing on most home consoles, and remaining among the most popular games in arcades even today. I guess it’s because, despite the maturity implied by firing a gun, it’s one of the easiest concepts to pick up. I’ve been on many adventures thanks to light gun games – whether it’s cleaning up the future in T2: The Arcade Game, battling zombies in a haunted house in House of the Dead, or enjoying some of the worst acting of all time in Mad Dog McCree.
It’s also significant because the light gun shooter is a genre that’s nearly impossible to emulate and doesn’t translate well to today’s technology. While there are exceptions, you will have a hard time playing Crypt Killer properly on a PC running MAME, and most HDTV technologies don’t support light guns from the past. Authenticity is as important as the genre itself. That’s why I’ve decided to dedicate this month to a timeless style of video game that I always make a first priority when buying a new (or old) system: the light gun shooter. Come join me to learn about some of the best, worst, funniest, and definitely weirdest titles to ever grace the hobby of video games. Thanks to my huge CRT television and original hardware, I can even show you videos.