Gaming History 101

Know Your Roots

Sound’s Good: Your Video Game Audio Buying Guide

with 2 comments


This week I decided to take on another technical escapade and look into the sound options for video games.  This requires knowing quite a bit about analog vs. digital sound, compressed vs. uncompressed audio, stereo vs. surround, and all the wonderful tidbits mixed in between.  Just to complicate matters, Internet forums are chock full of people who have no idea what they are talking about; they pollute decent message boards with misinformation, get ignored by the knowledgeable regulars on those boards, and leave anyone who searches later stranded on a page where the misinformation is the only answer in town.  Additionally, companies like Dolby and DTS slap a whole group of fun little logos on your receiver's box, case, or display to fill you with the joy and satisfaction that what you see is what you are hearing and that it's better.  Well guess what: it's not.  In fact, probably the best surround sound you can possibly get is LPCM (Linear PCM), uncompressed audio that has been around since before CDs and still stands as the best surround sound format – albeit at the cost of tons of storage space that most consumer products refuse to utilize (remember that Titanfall's uncompressed audio weighed in at around 40 GB).  With all the mess and bull that exists, I figured why not enlighten my fine readers with a lesson and some best practices so that you can easily determine the sound options for your consoles and get them up and running and sounding great.

Please Note: As previously mentioned, there is a ton of misinformation on the web about sound formats.  For that reason I may be more restrictive about comments that I know are incorrect, and whether you choose to disregard this post for that reason is up to you.  Additionally, sound, like visuals, is a subjective medium, so it won't be the same for everyone.  Some swear 1080i looks better than 720p and vice versa; the same can be said for compressed DTS 5.1 versus uncompressed DTS-HD Master Audio.  Despite the research and blatant facts suggesting otherwise, pick what helps you sleep at night – this is merely a guide of options.

The Set Up

The first thing you need to decide is how you want to set up all your game systems and what kind of sound setup you want.  If you are going to use TV speakers (which I don't recommend on non-tube TVs), a soundbar, or any stereo (2.0 or 2.1) receiver, then most of the work is done for you because almost every digital sound format supports uncompressed stereo.  In the rare event that one doesn't, you'll just have to roll with it.  Also keep in mind that many video game consoles output analog audio, so you're stuck with stereo at best.  If you have a 5.1/7.1 receiver then you can adjust surround options, but I'll get to that in a sec.  Here's a quick breakdown of sound setups and definitions.

2.0/2.1/5.1/7.1: These refer to the specific number of speakers you have.  Anytime you see these numbers in relation to stereo/surround, that's what they mean.  Anytime you see a ".1" at the end, it means you have a subwoofer added for bass.  For example, a 2.0 setup is a two-speaker stereo setup, whereas a 2.1 is two speakers plus a subwoofer.  Subwoofers can be powered (i.e., separate plug for power) or unpowered (the receiver sends out the power), and I personally prefer powered so that I know the sub gets enough power for the sound output I want.
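To make the naming concrete, here's a quick sketch of what each setup actually contains (the channel names are the conventional ones, not tied to any particular brand):

```python
# "X.Y" notation: X full-range speakers plus Y subwoofer (LFE) channels.
LAYOUTS = {
    "2.0": ["front-left", "front-right"],
    "2.1": ["front-left", "front-right", "subwoofer"],
    "5.1": ["front-left", "front-right", "center",
            "surround-left", "surround-right", "subwoofer"],
    "7.1": ["front-left", "front-right", "center",
            "surround-left", "surround-right",
            "rear-left", "rear-right", "subwoofer"],
}

for name, speakers in LAYOUTS.items():
    full_range = [s for s in speakers if s != "subwoofer"]
    subs = len(speakers) - len(full_range)
    print(f"{name}: {len(full_range)} speakers + {subs} sub")
```

Note that 5.1 adds a dedicated center channel – that matters later, because the center carries most dialogue.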

If you are going to use a receiver, always send the sound to the receiver separately, or first, before going to the television.  Almost all televisions will downgrade or strip sound over modern connections, so it's best to get used to hooking your components up to audio first.  On many retro consoles the cords carry video and audio together, forcing you to hook up to the console's cable and then out to the receiver.  People grew up thinking the television was the sole and initial source for everything; it isn't – it's the location of the finished product.

Retro Consoles

Every console from Pong clones to the Sega Dreamcast (and including the Gamecube) uses analog audio for sound.  This is most typically over RCA-plug composite cables (the yellow/red/white connection), but you will definitely find some consoles with S-Video as an alternative to the yellow video connection, and of course most consoles before the NES – and even some today – have coaxial (screw connection) Radio Frequency (RF) adaptors that output the sound and video as one.  This makes the audio part easy: it's the white (left stereo) and red (right stereo) connections.  This is either mono or stereo analog audio that will automatically push through the connection whenever the console is powered on.  If you have a coaxial RF connection I recommend upgrading the output cable or, in the rare case of the Atari VCS/2600 era and the Turbografx-16, finding a way to adapt to RCA (either by console modification or by simply hooking it up to a VCR with RCA composite A/V out).  Then all you do is hook the video up to your source, and often in the menu of the game you're playing you can set either mono or stereo (some more modern games will also have "surround," but more on that later).  That's it – that's all you need to do.  Most modern receivers will take in your composite, extract the sound, and then send the video out to the TV on the same type of plug, and high-end models will upscale/upgrade the signal and output a better connection (most recently HDMI).  If you have a lot of consoles you'll be swapping out and do not want to keep accessing the back of the receiver, simply get a composite A/B switch, hook the receiver into the "output" part, and leave that out so that you can plug any of your many A/V cables into it.  Like it or not, almost no consoles share the same A/V out plug (save for the SNES/N64/Gamecube, and only for composite video).

Now we will move on to surround sound, the next step in audio and the first speed bump.

Early Surround Sound and the 5.1 Compressed Audio Format

Starting with the Playstation 2, and with almost every game on the original Xbox, there was support for 5.1 digital audio.  Earlier consoles like the N64 and especially the Gamecube liked to tout surround sound, especially with the "Dolby Pro Logic" logo, but that was merely a decoder for analog stereo sound that would emulate 5.1 on those types of setups.  Today every receiver has modes similar to that, and there are even competitors.  My receiver features Dolby Pro Logic IIz, DTS Neo:6, and its own proprietary format – these formats turn a stereo source into surround, and they are quite good at faking it.  For the most part, an analog stereo source will merely feed the stereo to the left and right sides of the speaker setup and then use both channels in the center.  For example, Eternal Darkness, when sent to my receiver from the Gamecube analog A/V cable, will send all the left stereo to the two or three speakers on my left side, the same on the right, and the center channel that has two speakers will act like a typical stereo speaker.  That's all.


Now if you have a digital fiber optic cable (also known as TOSLINK, optical, or S/PDIF) or a digital coaxial cable, you can get 5.1/7.1 compressed audio or uncompressed stereo audio.  This was first used by high-end CD players, then later by laser disc, moved on to DVD, and of course was integrated into the PS2 and Xbox.  Digital coaxial was mostly used in CD players to send a digital uncompressed stereo signal (known as Pulse Code Modulation, or PCM) that was said to sound richer.  Laser disc and DVD players would often have digital coaxial and/or optical so that you could hook your component up to whatever port your receiver had and get compressed surround sound (known as bitstream).  The benefit was that you could get 5.1 (and later 7.1) surround sound that was dynamic and came out of all the speakers, but at a price.  The bitstream sound format had to be compressed and lost quality; however, it was the only way to fit these massive sound files onto a single laser disc or DVD.  Two major encoders emerged that were already doing similar formats in theaters to help get the best and most dynamic versions of compressed audio: Dolby (Dolby Digital) and Digital Theater Systems (DTS).

Digital Coaxial Cable

Back then the only way to get access to these compressed sound files was to have a device that could extract and send out the audio signal and a receiver that could receive it.  Think of it as a conversation – you can only speak English if you know the language, and you can only use it with someone else who knows it too.  This created the big compatibility wars of the '90s and 2000s, which affected people so much that they can't seem to let go of them today (and frankly that's why so few people even understand what's going on).  You had all kinds of outlandish issues, and every failure resulted in the same effect: the digital cable would fall back to a stereo PCM uncompressed signal and you would basically lose surround sound.  This would happen if all your devices and media didn't match – so to watch Day of the Dead in DTS you would need a version of the movie with DTS on it (look for the logo), a player that could read and output DTS (look for the logo), and a receiver that could accept and decode DTS (again, look for the logo).  This became such a headache that logo hunting is probably the biggest marketing ploy in home audio today.

Dolby Digital (DD) was the first widespread home compressed 5.1 format, so most laser disc players, DVD players, and receivers could receive and decode it, dropping to uncompressed stereo (PCM) if for any reason the receiver couldn't detect or read a DD 5.1 signal.  DTS came on the scene later and was basically the Pepsi to Dolby's Coke, but many swore it was a richer, better-sounding format.  In truth I think it was just that DD was the go-to, so you had more people using it and thus a larger range of quality, whereas DTS was a very specific format often only utilized when extra cost and quality were at stake.  If you prefer one over the other, it probably has to do with the goals of each format, because they did have different ones.  DD was set on creating a discrete experience where the audio seemed all around you but not from the perspective of a single channel or direction – it wanted full immersion of the whole room, which is why hints of each sound were typically in all speakers.  DTS wanted directional sound output, so that when the Predator ran around the room the "whoosh!" of his run would jump from speaker to speaker in a circle, which any audiophile looking to show off his new equipment was probably in love with (myself included).  Back then, however, including DTS was seen as a feature jump because you typically had to buy more expensive equipment to support it and a more expensive version of the movie (DTS cuts were all the rage once the DVD industry learned you could re-sell the same movie with a new feature and customers would buy it up).  In all the fun and trickery of creating a five-to-seven-speaker experience to replace two-speaker stereo, something was lost: the overall sound quality was poorer.  Think of compressed audio as a low-bitrate MP3 – it just doesn't sound as good as the original – whereas uncompressed is the full, rich sound but only in two speakers.
It was the trade-off you had to make, and everyone from Dolby, to DTS, to receiver manufacturers had an opinion.  Bose touted for years that taking uncompressed PCM stereo and re-encoding it for a 5.1 or 7.1 speaker setup was far superior to the compressed audio of DD and DTS, which given the current state of things may have been true.

Uh, Video Games?

Right, of course – so how does this relate to video games?  Well, the Playstation 2 featured a fiber optic connection on the back of every iteration of the console, and with the "HD AV Pack" the Xbox could also output fiber optic audio.  In many cases that only meant the game delivered uncompressed stereo or, even worse, compressed stereo.  On the Playstation 2, provided you had a receiver that supported it, a limited set of games like The Bouncer and Metal Gear Solid 2 supported Dolby Digital 5.1 in the cutscenes (you could watch the logo appear and disappear on your receiver if you had it set up properly), and the infamous Grand Theft Auto: Vice City did have true 5.1 DTS (I remember hearing the world of random sounds all around me when I set this up).  There were other games, though, like SSX Tricky, that would output DTS but only in 2.1 – it was weird.  This was because the Playstation 2 didn't do any encoding or decoding in the system; just like a DVD player, it would simply strip the audio and send it out to the receiver in the form it was on the disc.  This meant that if your receiver could read DTS and DTS was in the game, you would get DTS.

Xbox HD AV Pack

The Xbox was different thanks to the specific Nvidia chipset that made up its hardware configuration.  In this case it would take the audio and actually construct (or reconstruct) it into the Dolby Digital format.  This meant that no matter how the audio came (stereo, Dolby Digital, DTS, uncompressed stereo), the Xbox would re-create a Dolby Digital 5.1 audio signal from it and send that to the receiver.  Honestly, that's how most consoles work from this point on, but I'm getting ahead of myself.  The benefit was that no matter what type of sound the developer provided, the Xbox would convert it to compressed 5.1 Dolby Digital that would work 100 percent of the time, provided your receiver could decode Dolby Digital.  It got rid of the guesswork and also assured that every game you put into the system would have sound coming out of every speaker, every time.  While it may have been a great deal of smoke and mirrors, the Xbox did it very well, and those who had a 5.1 system and wanted to play a video game with surround sound usually opted for the always-5.1 Xbox version.
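The difference between the two approaches boils down to a simple decision; here's a conceptual sketch (this is an illustration of the logic described above, not actual console code):

```python
def ps2_output(disc_audio_format, receiver_supports):
    """PS2-style pass-through: send whatever is on the disc, or fall back to stereo."""
    if disc_audio_format in receiver_supports:
        return disc_audio_format   # e.g. "DTS 5.1" only if both sides speak it
    return "PCM 2.0"               # any mismatch = stereo fallback

def xbox_output(disc_audio_format, receiver_supports):
    """Xbox-style re-encode: everything becomes Dolby Digital 5.1."""
    if "Dolby Digital 5.1" in receiver_supports:
        return "Dolby Digital 5.1" # regardless of the source format
    return "PCM 2.0"

receiver = {"Dolby Digital 5.1"}   # a DD-only receiver, common at the time
print(ps2_output("DTS 5.1", receiver))   # PCM 2.0 - the logo hunt failed
print(xbox_output("DTS 5.1", receiver))  # Dolby Digital 5.1 - always works
```

With pass-through, every link in the chain has to match; with re-encoding, one codec on the receiver covers everything.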

This all changed with the High Definition Multimedia Interface (HDMI), which made compressed audio, bitstream, and optical cables unnecessary but didn't really bother to tell the average consumer.  And thus began the war, waging on even today, of trying to understand what the hell your device is doing.

Uncompressed Linear PCM

Remember way back at the beginning of this article when I mentioned that the earliest form of digital audio was uncompressed stereo (aka PCM)?  Well, the home audio world quietly abandoned that lovely compressed "faking it" 5.1/7.1 of Dolby Digital and DTS and migrated back to uncompressed now that Blu-ray discs, digital video formats, and the HDMI plug were around.  See, fiber optic and digital coax cables only compressed the 5.1 because they had to – they couldn't carry an uncompressed signal with more than two channels (stereo).  HDMI was different: it could carry full uncompressed 5.1/7.1 sound.  Couple that with Blu-ray, which had a 50 GB capacity, and you could finally fit the full high-definition video and uncompressed audio experience.  Unfortunately, it basically meant that all of us "suckers" who had signed on for the old-school format were left in the dust with nothing to show for it.  Instead of openly copping to it, hardware manufacturers of all kinds opted to sweep it under the rug rather than force it forward.  In truth we should be thankful – it's quite a decent accommodation to make on the audio side, whereas no one takes pity on video.  Heck, the Playstation 4 forces you to use HDMI and almost expects you to have a 720p/1080p television, but it will still send compressed bitstream audio via optical cable out to your receiver without even warning you of the compromise.  Couple that with the stubbornness of technophiles – of which I will openly cop to being one – and you get a bunch of old guys looking for a DTS logo who, regardless of what's actually coming out of the speakers, stare at that state-of-mind logo and say, "damn straight!"
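To put numbers on what HDMI made possible, here's a quick comparison of a typical uncompressed 5.1 LPCM stream against the old compressed bitstreams (the bitrates are common consumer values, used for illustration):

```python
# Approximate per-stream bitrates in Mbps (typical consumer values)
lpcm_5_1 = 6 * 48_000 * 24 / 1_000_000  # uncompressed 5.1 LPCM at 48 kHz / 24-bit
dolby_digital = 0.640                   # Dolby Digital 5.1 tops out around 640 kbps
dts_classic = 1.536                     # classic full-rate DTS 5.1

print(f"LPCM 5.1: {lpcm_5_1:.1f} Mbps")
print(f"About {lpcm_5_1 / dolby_digital:.0f}x Dolby Digital's bitrate")
print(f"About {lpcm_5_1 / dts_classic:.1f}x classic DTS's bitrate")
```

HDMI has the headroom to carry that (and more, up to 7.1 at higher sample rates), while optical never did – which is why the switch back to uncompressed audio had to wait for the HDMI era.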


The Xbox 360 came too early for the format – HDMI wasn't even widespread in 2005 – so it flat out didn't support the HDMI standard for video or audio (support was later added in hardware revisions).  In addition, the 360 used the DVD format for games, so naturally Microsoft re-enlisted the lovely Dolby Digital encoder it already had and boom, 360 games are all in Dolby Digital 5.1 (ever notice that you only get DTS when it's passed from a DVD?).  The unknowing consumer puts the game in, hears five speakers, sees the logo – damn straight.  On the other hand, the one-year delay and hefty price tag of the PS3 justified it pushing the standards much higher.  The Playstation 3 almost completely ditched the compressed audio format in terms of how it wanted to operate, future-proofing the system with an HDMI port from the get-go that supported up to 7.1 channels of uncompressed PCM, but it was also humble enough to know that many early adopters would not have the tech yet.  Furthermore, video games, even on the Blu-ray disc format, were mostly multi-platform, which most times resulted in a smaller 5-10 GB game and compressed 5.1 audio, regardless of the 50 GB capacity of the Blu-ray.  In addition, the console would pass compressed audio formats out via the optical cable like the PS2 did, so games encoded in Dolby would get that signal and games encoded in DTS would get that signal, and the audiophile saw the logo on the receiver and thought, "damn straight."  This whole confusion basically came from Microsoft wanting to hide the truth by upscaling all games to 1080p on the 360 (there were like five total games in native 1080p) and encoding all games in compressed 5.1 Dolby Digital.
On the PS3, the truth was forced on all games out of the box, so whatever surround format was in the game (mostly compressed DD, thanks to the 360) would output via optical, and the native resolution of the game, rounded to the closest standard resolution (usually 720p), would output on the screen.  Gamers and tech guys didn't like that, which led to the notion that the PS3 wasn't as good as the 360 from a video standpoint because all games were 1080p/5.1 on 360 and dropped to 720p/5.1 on PS3.  The truth was that the games looked the same, but techno people want more Ps – damn straight.


There was also the little case of logo fever that erupted far beyond DD and DTS.  New compression formats for 7.1 like Dolby Digital Plus and DTS-High Resolution Audio hit the market – just an upgrade from 5.1 to 7.1, still compressed, but they gave you that breathe-easy logo on your receiver.  Not only that, but when the losslessly compressed Dolby TrueHD or DTS-HD Master Audio (both new codecs that compress audio without quality loss to emulate uncompressed) was playing, your receiver would still display a logo for you – damn straight.  Therefore, when these logo junkies (again, myself included) and PCM haters from the old guard started getting new HDMI receivers and saw no logo, just "PCM," they lost it.  They didn't want it.  Even though it was better and it was in 5.1/7.1, the brain could not understand that PCM > bitstream, and with all this hardware still supporting optical with bitstream – where you got the logo – they went with that option.  I myself did this for the last five years, until tons of research and a little patience finally saw me upgrading earlier this year.  Let's go back to the PS3 for a second and discuss the biggest reason gamers thought they weren't getting true 5.1 surround.  The PS3 is kind enough to decode all major forms of compressed audio and export them via HDMI in the linear PCM uncompressed audio format, the truest of formats.  Whether it's old-school DD/DTS 5.1, Dolby TrueHD, DTS-HD Master Audio, or anything around them, the PS3 just decodes it to LPCM 5.1/7.1 and sends it to your receiver in perfect harmony.  This means you don't need any special type of HDMI cable, and it's basically supported by every receiver that takes HDMI – no codecs on the receiver side required.  It's very similar to the way the original Xbox made everything DD 5.1 in 2001.  Unfortunately, no logo leaves many people discouraged.
If you want proof, just hook your PS3 up to a receiver with HDMI, have it auto-detect the sound via HDMI in your PS3 settings, and start any movie with a compressed format.  Your receiver will simply claim PCM, but you will notice that the surround is dynamic, and if you press the "display" button (on either the controller or remote) you'll see audio formats like DTS-HD MA or Dolby TrueHD in the upper left.  See, you're actually getting the big, lossless 5.1 wonderful sound you wanted – the logo has just moved to a new place.

Fiber Optic (optical) cable

This is a good thing, but many cannot let go of the old logo wars and the dreaded "PCM means you're getting non-surround stereo" lesson from the earlier days – so much so that even with newer receivers, they actually FORCE them back into bitstream via optical cable.  Ugh.

Moment of Clarity

For me, the moment I began to notice the difference was when I built a gaming PC.  My new Nvidia GTX 760 had an HDMI out, and the setup software even said it supported 5.1/7.1 via linear PCM or Dolby Digital Live (Dolby's real-time encoder), and yet I was getting no 5.1.  A year earlier I had purchased a device that would take an HDMI A/V source and strip the audio out of it so that I could send the audio via optical out to my receiver – this worked fine with my 360, PS3, and HDMI DVD player – so what gives?  Well, it was because my GTX was only outputting PCM audio and, as we've already discussed, optical can only handle 2-channel PCM.  I did a little research and all signs pointed to me needing an HDMI receiver, which I shrugged off, accepting that my PC would always be in stereo – heck, most PC games aren't in 5.1 anyway because they don't have the built-in encoders of the 360/PS3.

The next hint was the Wii U.  The Wii just had stereo analog sound, so I figured if the Wii U had any form of surround it would simply be DD or DTS 5.1.  Wrong – the Wii U actually only supports surround as LPCM 5.1; otherwise you are getting stereo PCM.  Back to the audio stripper that hadn't worked with my GTX for 5.1, and again I didn't get 5.1 out of the Wii U once the audio was stripped to the optical cable.  Even worse, the Wii U is so user-unfriendly that I stupidly left "Surround" in the settings, so only channels 1 and 2 of the 5.1 were outputting to my speakers, which negates the center channel that typically carries voice on most 5.1 setups.  That basically meant I heard little or no dialogue, because all my speakers would pick up were the subtle left and right front-channel portions of the voice, which are much quieter than the center channel.  I quickly went back to "stereo" and again blamed my Wii U for not being forward-thinking.  Then I turned on my PS3 and watched a Netflix movie in Dolby Digital… with a logo… damn straight.

It all came to a head with the PS4 and Xbox One.  Both consoles allow you to export the sound via optical, choose "bitstream" in your audio settings, and you even get to pick Dolby Digital or DTS!  Wow!  How did they do that?  You mean I get to pick which logo I see?!  This was amazing for the first two weeks, until I started to notice that it didn't make sense that no matter what was on the screen I got the same logo, regardless of the logos on the media.  Then I started looking into this linear PCM thing and realized that both consoles have encoders/decoders that can take any audio and export it as either compressed DD/DTS or uncompressed linear PCM 5.1/7.1 via HDMI (you know, the truest form of audio).  That did it – this logo hunting was B.S., and I needed to move on.


After the new setup, I was hesitant at first because, again, no logos.  No matter what Blu-ray movie or game I put in: logoless.  However, I did get 5.1, it sounded awesome, and things like my PC and Wii U worked (even with "surround" on!).  I had finally gotten over it, and now I am able to enjoy linear PCM 5.1 dynamic audio, without compression, despite having no logos to show for it.  But after this long, convoluted, 4,300-word history, you can see why I was so discouraged to ditch the logo.  If you are looking to upgrade your audio system and are tired of jacking around with optical cables, heed this notice to upgrade, because you're going to have to soon anyway.

As always, if you have questions or discussion, please post in the comments below.  I did warn that if a comment spreads misinformation I may not post it, but that is limited to this post only.  Hopefully this will be nothing more than helpful.

Cheers.

Written by Fred Rojas

December 2, 2014 at 4:13 pm

2 Responses


  1. Long-winded comment/question: I recently started playing Mirror’s Edge Catalyst on the PS4 and the load screen has DTS all over it. So I’m assuming the game was mastered in DTS. What happens if my PS4 is set to output Dolby Digital? Is the console recompressing the signal to lossy DD? If I set the console to linear PCM, am I getting a true uncompressed signal or is it still just the lossy DTS? Do games even support lossless HD audio? Because, in terms of fidelity, I can’t hear a difference between DD, DTS, or LPCM when I’m gaming – except that the DD signal is louder and has more punch than the others.

    I have an Onkyo Dolby Atmos/DTS:X receiver and I usually leave the PS4 outputting DD, this way I can stream movies from Vudu and hear the Atmos track when it’s supported. If I’m on LPCM, I miss out on the DD Atmos track.

    Seth Johnson

    July 28, 2016 at 11:46 am

    • Not a problem Seth, thanks for writing in. Those logos on games like DTS and Dolby Digital are present and probably show how the game was mastered, but with consoles (and even PC) that’s almost a non-point because it’s all uncompressed PCM in the lossless category. The reason both the PS4 and Xbox One offer “Dolby Digital” and “DTS” is that those are the lossy old-school encoders; the 5.1/7.1 PCM sound signal is downsampled into those encoders. That’s why you can pick DD or DTS for your non-PCM signals – the console can literally take the PCM and turn it into whatever your sound system supports. If you want lossless you’ll need to send HDMI from the console through an appropriate receiver that can decode everything, but I will say that the PS3, PS4, Xbox One, and both PC graphics card lines (GeForce/Radeon) can send every signal, from lossy Dolby Digital/DTS for my DVDs/streaming to lossless PCM (as true PCM, Dolby TrueHD, or DTS-HD Master Audio), automatically, and my receiver receives and decodes these appropriately. Your Onkyo looks more than prepared to handle all that. My advice is to set your PS4 to PCM (uncompressed) and then listen to your speakers to verify true 5.1/7.1. It’s easy to tell: if you are only getting 2 channels, then no sound will come from the center or back channels.

      One more side note if you’re juggling HDMI ports, one goes bad (happened on my Pioneer), or you just plain wish to use optical on a console: they aren’t that bad. I have noticed that when I use my PS4 via optical and encode to 5.1 (I usually use DTS out of habit because I personally believe it’s better, but it’s more of a Coke/Pepsi decision), it does quite well. Sure, the audio is lower so I have to crank more power to get the same volume, and I’m sure somewhere in there I’m losing some quality and special effects, but overall most games sound about the same. I have to swap to optical when doing captures because my capture device only supports 2.0, so I send the surround sound out separately through optical.

      In direct answer to your questions: Yes, the console is re-compressing to lossy DD/DTS unless you’re playing a piece of media that natively supports it (Netflix, DVD, etc.). Linear PCM is true uncompressed 5.1 out of consoles, and there’s no need to set a decoder (Dolby TrueHD, DTS-MA, etc.) because the consoles just break it down to the most compatible format, 5.1 PCM, which is still the best way to get audio – so bully for Sony/Microsoft. Games all support lossless 5.1 audio at this point; it’s basically been the case on PC since about 2012 and on consoles since they launched in 2013. If you launch a game and are getting 2.0, look in the menu – it probably just needs to be activated (this was the case with Transformers Devastation, and it’s much more common on PC than consoles). As explained above, the consoles just convert the lossless 5.1 to either lossy 5.1 or linear 2.0 PCM for optical cables, user settings, or if they detect an HDMI source that doesn’t support 5.1 (TVs, capture cards, incompatible receivers, etc.). I am with you – I can very rarely hear the difference, especially with such strong tech encoding, decoding, and re-encoding audio all over the place. Science and experts will tell you lossless PCM is the way to go. Personally, I feel about audio the same way I feel about resolution: what seems best to you?

      Hope this helps, let me know if you have future questions.

      Fred Rojas

      July 29, 2016 at 10:53 am

