MetalTabs.com - your source for Metal tabs


MetalTabs.com Forum > Musicians > Gear & Recording


 
 
Old 2008-05-21, 11:47
Soeru's Avatar
Soeru
Post-whore
 
Join Date: Oct 2004
Location: Land of Dust
Posts: 3,551
24 vs 16 bit recording.

I've been googling the subject of what effect bit depth has on audio recordings done on digital workstations, and it seems to be a subject of endless debate. A lot of modern soundcards/capture devices and digital recorders boast the ability to record at 24-bit depth.

Using 24-bit depth as well as higher sample rates (kHz, think 44.1, 48, 96, etc.) makes the recordings' file sizes bigger and, on computers running recording programs, puts more stress on the CPU and RAM. Running Cubase with the EZdrummer VST plugin, if you start a project at 24-bit/48-192 kHz, the plugin loads the larger 24-bit drum samples into your RAM instead of the standard 16-bit/44.1 kHz ones, which takes much more time. Personally, though, I can't tell the difference in sound. Maybe I'm not paying enough attention.
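For what it's worth, the file-size cost is pure arithmetic: sample rate times bytes per sample times channels. A quick sketch (uncompressed PCM, stereo assumed):

```python
def wav_mb_per_minute(sample_rate_hz, bit_depth, channels=2):
    """Uncompressed PCM size in megabytes per minute of audio."""
    bytes_per_sec = sample_rate_hz * (bit_depth // 8) * channels
    return bytes_per_sec * 60 / 1_000_000

# CD quality: 44.1 kHz / 16-bit stereo
print(round(wav_mb_per_minute(44_100, 16), 1))  # ~10.6 MB/min
# 48 kHz / 24-bit stereo
print(round(wav_mb_per_minute(48_000, 24), 1))  # ~17.3 MB/min
# 96 kHz / 24-bit stereo
print(round(wav_mb_per_minute(96_000, 24), 1))  # ~34.6 MB/min
```

So 24/96 roughly triples your disk and RAM footprint compared to 16/44.1, which lines up with the slower sample loading.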

A lot of people argue that there's no need to record in 24-bit, and that 16-bit is fine because all commercial music CDs are mastered and burned at 16-bit/44.1 kHz, so there's apparently no point in recording at super high quality settings if you're just going to dumb it down later. But then some people say it's still a good idea to record in 24-bit even if you're going to convert it down later (something about an increased signal-to-noise ratio?).
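The signal-to-noise argument is easy to put numbers on: each bit of depth buys roughly 6 dB of dynamic range, the standard 6.02n + 1.76 dB figure for an ideal converter with a full-scale sine. A quick sketch:

```python
import math

def dynamic_range_db(bits):
    """Theoretical SNR of an ideal n-bit quantizer for a
    full-scale sine wave: 20*log10(2**n) + 1.76 dB."""
    return 20 * math.log10(2 ** bits) + 1.76

print(round(dynamic_range_db(16), 1))  # ~98.1 dB
print(round(dynamic_range_db(24), 1))  # ~146.3 dB
```

So 24-bit gives you roughly 48 dB more headroom above the noise floor, which matters while tracking and mixing even if the final CD is 16-bit.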

What do the experts here think? Has anyone here done pro or semi-pro CD recordings on digital equipment, and what bit depth and sample rate did you use?

Should I just stick to 16-bit for quicker performance when recording on a PC, or is 24-bit worth the extra loading times and larger file sizes, even if I'm just going to convert all my recordings to MP3s later on?
__________________
Quote:
Originally Posted by far_beyond_sane

(Did you know In Flames had a 2005 album called "Come Clarity"? How prophetic. I think they're trying to tell us all their sperm are dead.)
 
Old 2008-05-21, 12:18
Soulinsane's Avatar
Soulinsane
Pirate Lawd
 
Join Date: May 2004
Location: Hanger 18
Posts: 6,520
Damn dude, that is a good question. I've heard it explained as being similar to taking a digital picture: it's better to shoot at high resolution even if the picture is only going to be edited and reduced in size later, because it will always come out cleaner and look better. Try editing or touching up a low-resolution picture and you'll notice a major fake look to things.

At normal speeds it is nearly impossible to tell the difference between 16- and 24-bit A/D/A conversion, but when you slow things down to a fraction of a second and try to edit something, there will be a huge difference in detail.
__________________
Authorized Mercury Magnetics tech/dealer
 
Old 2008-05-21, 15:56
Soeru's Avatar
Soeru
Post-whore
 
Join Date: Oct 2004
Location: Land of Dust
Posts: 3,551
Say you take a picture at a resolution of 1024x768 pixels and another at 640x480, but then reduce the 1024x768 one down to 640x480 (both resolutions have the same aspect ratio). Would it be possible to tell the difference in quality between the two?
 
Old 2008-05-21, 16:28
Sycophant's Avatar
Sycophant
Supreme Metalhead
 
Join Date: Jul 2006
Location: Netherworlds Of The Mind
Posts: 685
When I used to record in Cubase, I did everything in 24-bit and then just exported .wavs of all the songs from the program. Working with those .wavs and burning them to CD gives you really good quality 16-bit audio, better than if you had started out recording in 16-bit and left it there. The difference may not always be noticeable, but it's there in the numbers. The equipment you use for playback makes a huge difference between 16-bit and 24-bit mixes, and 24-bit will always sound "truer" to what you recorded; on higher-end monitors the difference becomes staggering. When you take that 24-bit mix and burn it to CD or make it an MP3, it's going to be a hell of a lot cleaner than if you had started and stayed at 16-bit (although on a budget sometimes you've got no choice!)

If I could record at 32-bit without hangups, latency or technical issues, and then just save those master tracks so I could make MP3s or .wav/CD audio files out of them later (i.e. converting them down to 16-bit), I totally would, no questions asked. It's a truer, more real audio recording.

It's smart to do it this way rather than convert to a higher bit depth or sample rate later, which is ass-backwards and will most likely mess up the quality, pitch and/or speed of your original track. Always convert down, never convert up (duh). It also depends on how good your equipment is: some soundcards can even record at a 96 kHz sample rate without latency or choking, which is insane pro quality. At sample rates that high it's not so much a difference-in-quality issue as simply a quality issue, because afterwards it's all going to be turned into 16-bit audio anyway!
But even though everything ends up as 16-bit/44,100 Hz audio (or a compressed audio file), it's going to be clean as fuck coming down from 32-bit. Recording at the highest resolution you can possibly get will help guarantee the music sounds as you intended. That's what I've seen.
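For anyone curious what "converting down" actually does, here's a minimal sketch (my own illustration, not how Cubase specifically does it) of reducing a signed 24-bit sample to 16 bits, with optional TPDF dither so the rounding error doesn't correlate with the signal:

```python
import random

def to_16bit(sample_24, dither=True):
    """Reduce one signed 24-bit PCM sample to 16 bits.
    TPDF dither (sum of two uniform randoms, roughly +/-1 LSB at
    16-bit scale, i.e. +/-256 at 24-bit scale) decorrelates the
    rounding error from the signal."""
    if dither:
        sample_24 += random.randint(-128, 127) + random.randint(-128, 127)
    out = (sample_24 + 128) >> 8          # round, then drop the low 8 bits
    return max(-32768, min(32767, out))   # clamp to the 16-bit range

print(to_16bit(8_388_607, dither=False))  # loudest 24-bit value -> 32767
```

The key point: going down is a controlled rounding step, while going up can't invent the missing detail, which is why "record high, convert down" is the sane order.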

EDIT ::: To answer your question about the photos... the bigger one resized down will look a lot better than the other one. Probably by a lot.

Last edited by Sycophant : 2008-05-21 at 16:42.
 
Old 2008-05-22, 02:13
aslkvbiwbegv
Metalhead
 
Join Date: Feb 2007
Posts: 74
My understanding is that 24-bit and 16-bit are only discernible in extreme situations, such as listening to extremely quiet sine waves with the volume all the way up on very nice equipment. I think the argument is that digital audio processing works better (less quantization noise) on 24-bit audio, so people record at 24-bit, process it, and then convert it to 16-bit.
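The quantization-noise point can be checked empirically: quantize a full-scale sine to n bits and measure how far the result drifts from the original. A rough sketch (997 Hz test tone is just an arbitrary choice):

```python
import math

def quantize_snr_db(bits, n=48_000):
    """Quantize a full-scale sine to `bits` of depth and measure
    the signal-to-quantization-noise ratio in dB."""
    levels = 2 ** (bits - 1)
    sig_pow = noise_pow = 0.0
    for i in range(n):
        x = math.sin(2 * math.pi * 997 * i / n)   # test tone
        xq = round(x * levels) / levels            # quantize and restore
        sig_pow += x * x
        noise_pow += (x - xq) ** 2
    return 10 * math.log10(sig_pow / noise_pow)

print(round(quantize_snr_db(16)))  # ~98 dB
print(round(quantize_snr_db(24)))  # ~146 dB
```

The measured values land right on the theoretical 6 dB-per-bit figure, and both are far below audibility at normal listening levels, which matches the "only discernible in extreme situations" claim.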

Really though, unless you are doing very high quality recordings with very nice equipment, it's not going to make any noticeable difference.
 
Old 2008-05-22, 07:29
Soeru's Avatar
Soeru
Post-whore
 
Join Date: Oct 2004
Location: Land of Dust
Posts: 3,551
Quote:
Originally Posted by Sycophant
When I used to record on Cubase, I did everything in 24-bit and then just exported .wavs of all the songs from that program. [...] Recording at the highest bitrate you can possibly get will guarantee the music sounding as you intended.


I see. What about just recording at a higher sample rate (kHz) instead of raising both the sample rate and the bit depth (24-bit)? I read somewhere that the sample rate is the only thing that determines the frequency spectrum (all the subtle EQ details, the crisp highs and sub-low bass and the thousands of frequencies in between) of an audio recording.

Wouldn't it be more sane just to record at 16-bit/96 kHz, since higher sample rates actually give your EQ more detail, or does 24-bit also contribute to it? I'm still not sure exactly what higher bit depth does to the sound. By the way, I read somewhere that 32-bit floating-point mode in Cubase actually uses less CPU power than 24-bit because of the way the Cubase engine is designed; I might try it and see if it runs faster.
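On what the two settings actually control: the sample rate caps the highest frequency you can capture at half the rate (Nyquist), while bit depth sets the noise floor, so they're not interchangeable. A tiny illustration:

```python
def nyquist_khz(sample_rate_hz):
    """Highest frequency (kHz) a given sample rate can represent:
    half the sample rate, per the Nyquist criterion."""
    return sample_rate_hz / 2 / 1000

for sr in (44_100, 48_000, 96_000):
    print(sr, "Hz ->", nyquist_khz(sr), "kHz")  # 22.05, 24.0, 48.0
```

Since human hearing tops out around 20 kHz, 16/96 buys you ultrasonic bandwidth nobody hears, while 24-bit buys a lower noise floor you can actually use while tracking.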

I'm gonna start doing my projects at 24-bit/96 kHz and see how well they go (I have a quad-core CPU, SATA hard disks and 4 GB of RAM, so my PC is fast; it's just that my cracked version of Cubase 4 is unstable as fuck).
 
Old 2008-05-22, 08:33
sqol's Avatar
sqol
Post-whore
 
Join Date: Mar 2005
Location: London, UK
Posts: 1,841
The sample rate is the number of times per second that a sample of the audio is taken, so 44.1 kHz is 44,100 samples/sec. The bit depth is the resolution of each sample, so 16-bit means there are 65,536 possible values for every sample. Quantization is where the hardware (or software) has to pick between two values: what if the true level falls between two of the 65,536 values? Then quantization kicks in and rounds the sample to the closest one. With a larger bit depth, less quantization error is introduced, because you've got more possible values for each sample. Increasing the sample rate gives you more samples per second, which is a more accurate representation of the sound over time.
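That rounding step can be sketched directly: map a sample in [-1, 1] to the nearest of the 65,536 sixteen-bit codes and look at the leftover error:

```python
def quantize16(x):
    """Round a sample in [-1.0, 1.0] to the nearest 16-bit code;
    return (code, reconstructed value, rounding error)."""
    code = max(-32768, min(32767, round(x * 32768)))
    xq = code / 32768
    return code, xq, x - xq

code, xq, err = quantize16(0.300001)
print(code)                   # 9830: the nearest of the 65,536 codes
print(abs(err) <= 1 / 65536)  # True: error is at most half a step
```

With 24-bit there are 16,777,216 codes instead of 65,536, so each rounding error is 256 times smaller, which is exactly the "less quantization" point above.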

So which sample rate and bit depth would I recommend? Probably 16/44.1, to be honest. You won't really be able to tell the difference unless you're recording classical music (which has an enormous dynamic range compared to most other genres). And, as has been said, you've eventually got to downsample to 16/44.1 for a CD anyway, so if it's already in that format there's no extra conversion needed. If it were in something else, you'd have to downsample, and that's more quantizing, which could cause loss.

If you're gonna change your sample rate, you'll have to keep it the same for the whole project. If you record one thing at 44.1 and another at 48, when you play them back they will be at different pitches, and it's something that isn't the easiest to spot. So keep the sample rate consistent throughout so that you don't have issues with playback.
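The pitch shift from a sample-rate mismatch is predictable: playing 44.1 kHz material back at 48 kHz without resampling speeds it up by the ratio 48/44.1, which works out to about a semitone and a half sharp:

```python
import math

def pitch_shift_semitones(recorded_hz, playback_hz):
    """Semitone shift from playing audio recorded at one rate back
    at another without resampling (12 semitones per octave)."""
    return 12 * math.log2(playback_hz / recorded_hz)

print(round(pitch_shift_semitones(44_100, 48_000), 2))  # ~1.47 semitones sharp
```

Roughly 1.5 semitones is small enough to miss casually but more than enough to wreck tuning against other tracks, which is why the mismatch is so sneaky.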
__________________

The Freedom of Chaos
The Secret of The Secret
The Truth of The Truth

Quote:
Originally Posted by Undone
moonraven?....more like ass raven
