Dumb Question about CD Ripping

Centropolis

This is supposed to be simple but I can't get over what I think is the answer.

If I am ripping a MONO CD, is there any reason why the quality of my MP3s would be better if I rip in stereo mode? I can't logically think of why... I mean, if it's mono, why rip stereo, right? Or am I missing something?

Now, I am sure most of you will not encounter this problem....who has CDs in mono anyway? (Well, I do!)

I told you this is a dumb question.
 
I don't remember there being any option to rip in mono or stereo. I usually just select the disc or tracks and let 'er rip. What ripping program are you using?
 
If I am ripping a MONO CD, is there any reason why the quality of my MP3s would be better if I rip in stereo mode?
Are you saying it is better, from your own observations? If so, it might be because of the data rate you end up with. When you rip stereo it'll end up at (for example) 128 kbps, but if you change to mono with nothing else changed it'll probably be compressing the single sound channel down to 64 kbps. So you have half the information. A stereo signal doesn't have completely different information on the two channels (and a stereo MP3 can be set to share the bass anyway) so halving the data rate for mono could be a significant drop in quality. That's the best I can come up with.
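The arithmetic behind that guess can be spelled out with a toy calculation (the rates are made up for illustration, and this deliberately ignores joint-stereo sharing):

```python
# Illustrative arithmetic (made-up rates): a stereo rip at 128 kbps gives
# roughly 64 kbps per channel -- the same budget a 64 kbps mono rip gets
# for its single channel. A 128 kbps *mono* rip doubles that budget.
def kbps_per_channel(total_kbps, channels):
    """Naive per-channel bit budget, ignoring joint-stereo tricks."""
    return total_kbps / channels

stereo_128 = kbps_per_channel(128, 2)
mono_64 = kbps_per_channel(64, 1)
mono_128 = kbps_per_channel(128, 1)
print(stereo_128, mono_64, mono_128)  # 64.0 64.0 128.0
```

So if the ripper silently halves the bitrate when you pick mono, the mono rip really does get less data per channel than you might expect.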
 
+1 for data rate; the quality of your recording isn't determined by mono or stereo, but what the data rate is (if you're using some cheap program that only tells you if you're ripping mono/stereo and not what the data rate is, it's probably using a higher data rate for the stereo option, hence better quality.)

In my experience, 192 kbps is the threshold where the difference becomes noticeable; anything lower results in audibly reduced quality. A good ripping program will let you explicitly set a single-channel/mono rip at 192 kbps (or better).
 
Bitrate alone is insufficient information to determine sound quality. The encoder is much more important. In my experience, LAME and Fraunhofer (which is not free anymore, IIRC) are the highest quality, and for me (and most other people from a double-blind test I read about a few years ago) 128 kbps is pretty much indistinguishable from CD. Then again, that depends on the source, your sound equipment and the state of your ears... mine are getting old.

AAC is much better than MP3 at the same bitrate.

And in my experience it does not matter if you set the encoder to mono or stereo for the sound quality. If you're sure the source is mono, you'll get a smaller file with mono, but that's about it.
 
Ripping, converting, and importing via iTunes will automatically use half the bit rate for monophonic files compared to stereophonic files. There is no benefit to converting a mono track to a stereo MP3 or MP4 file. You'll just use twice the space for the same sound quality.
 
128 kbps is pretty much indistinguishable from CD. Then again, that depends on the source, your sound equipment and the state of your ears... mine are getting old.

A few years back I did a double-blind test with this and the LAME algorithm with CDex. As I recall, I was testing Ogg Vorbis against MP3.

What I found is if you used a high quality CD with good mastering then the jump to 192kbps was well worth it. If it wasn't a high quality source (80% of all recordings) then 128 is fine.
 
Some programs change the encoding bitrate if you change the ripping from stereo to mono, I suspect this might be what you are experiencing.

(making these numbers up)
If you select stereo over mono, it might change the bitrate to 128(stereo) instead of 64(mono).
You would think they'd have the same audio quality, because the stereo file is twice as big but has two channels instead of one.

However this is not the case.

Since the original source is mono and the encoder is asked to produce stereo, the left and right channels will be identical.
The encoder recognizes that the channels are identical and won't waste space on two sets of identical data.
Instead it packs in extra information from the source to make up the 128 kbps bitrate it was asked to produce.
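That channel-sharing idea is easy to see with the mid/side transform joint-stereo encoders are built on. A toy sketch (synthetic samples, not real encoder internals):

```python
# Sketch of the mid/side trick behind joint stereo: for a mono source
# duplicated onto two channels, the "side" (difference) channel is all
# zeros, so nearly the whole bit budget can go to the "mid" channel.
left = [0.5, -0.25, 0.125, 0.0]  # toy samples
right = list(left)               # mono source: both channels identical

mid = [(l + r) / 2 for l, r in zip(left, right)]
side = [(l - r) / 2 for l, r in zip(left, right)]

print(mid)   # identical to the source samples
print(side)  # all zeros: nothing left to spend bits on
```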

That's more or less what happens. I could be completely wrong, though, and like some others said, the encoder does matter a lot.

Anyway, I wouldn't make too much of a fuss over encoding.
Simply pump up the bitrate on your favourite encoder and just enjoy the music, just look for anything using the LAME encoder and you'll be fine.

For bitrates to select:
Mono source or most music = 128 kbps
Very 'busy' music such as rock or a symphony orchestra = 192 kbps

If you have tons of hard drive space, I'd just do everything in 320 kbps so you won't have to worry about missing anything.

And try not to get caught up with the 'lossless' encoding craze, I struggle to hear a difference between lossless and 192kbps (or 128 even!), and that's with a headphone setup worth a couple o' Mcgizmo's.
Then again, I'm no sound engineer so I don't know what to listen for :laughing:
 
I use EAC + FLAC to rip and store. Then, to use in the car or on a portable device, I encode the FLAC to MP3 with LAME -V2 (VBR, about 190 kbps; more in "busy" music).

Lol, I'm not the one who started using "busy" to refer to hard-to-encode music :D
 
128 kbps to me is too low. Even with the LAME encoder, I start to hear that "watery" artifact in some instruments, like piano notes. I use 160 for economy and 192 mostly.

I agree that with some music, it is hard to tell even at 128.

Experiment: Rip a mono tune to stereo at 128 kbps. Open it in an audio editor like Audacity. Split the channels, invert one, and combine them. This mathematically creates a "difference" channel. Ideally, a mono source would be complete silence. Instead you should hear a funky sci-fi-like sound due to the compression artifacts.
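Here's a numeric version of that null test, with the lossy encoder's artifacts simulated by a little per-channel noise (all the signals here are synthetic; the noise level is an arbitrary stand-in for real compression artifacts):

```python
# Numeric null test: build "stereo" from a mono source, add fake
# per-channel compression artifacts (small independent noise), then
# subtract the channels. A perfect mono rip would null to silence;
# the residual is exactly the artifact energy.
import math
import random

random.seed(0)
n = 1000
mono = [math.sin(2 * math.pi * 440 * t / 44100) for t in range(n)]

# Simulate lossy encoding by adding independent noise to each channel.
left = [s + random.gauss(0, 1e-3) for s in mono]
right = [s + random.gauss(0, 1e-3) for s in mono]

diff = [l - r for l, r in zip(left, right)]     # invert one channel and mix
rms = math.sqrt(sum(d * d for d in diff) / n)   # residual level

print(f"difference-channel RMS: {rms:.6f}")     # small, but not zero
```

With real MP3 artifacts instead of noise, that residual is the "funky sci-fi sound" you hear in Audacity.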
 
Thanks everyone.

I've read that for archival purposes, a lot of people rip CDs into FLACs and then from the FLACs, you can pretty much do whatever you want with them...converting to MP3s to put on iPods...converting them back to WAVs and burn them back on CDs.

I am wondering... although FLACs are lossless, when you convert them to WAVs and back onto CDs, do you lose any quality? I have a hard time understanding how FLACs are lossless... they are smaller than WAVs... so converting back to WAV seems to me like it would lose something.

So isn't it true that the only way to not lose ANYTHING is to just rip them to WAV?
 
And try not to get caught up with the 'lossless' encoding craze, I struggle to hear a difference between lossless and 192kbps (or 128 even!), and that's with a headphone setup worth a couple o' Mcgizmo's.
Then again, I'm no sound engineer so I don't know what to listen for :laughing:

Personally, on certain types of music, I can hear the difference between 128 and 192 (or above). Now I can't hear any difference between 192 and 320 but from 128 to 192, I can.
 
I suppose you can say FLAC is like a zip or rar file: it just costs processing power to compress and extract, but all the information is still there!

You won't lose any quality converting between FLAC and WAV; the difference is just that FLAC also has tags for song titles and whatnot.
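The zip analogy can be made literal with any general-purpose lossless compressor, e.g. Python's zlib (the byte string here is just a stand-in for raw PCM audio):

```python
# The zip analogy made literal: like FLAC, zlib is lossless. The bytes
# you get back after compress/decompress are bit-identical to the input,
# just as FLAC -> WAV recovers the exact original PCM samples.
import zlib

pcm = bytes(range(256)) * 100           # stand-in for raw WAV audio data
packed = zlib.compress(pcm)             # smaller, like a .flac file

assert len(packed) < len(pcm)           # compression saved space...
assert zlib.decompress(packed) == pcm   # ...and lost nothing
print(len(pcm), len(packed))
```

The smaller file size comes from removing redundancy, not audio information, which is why the FLAC-to-WAV round trip is bit-perfect.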

I agree 128kbps is absolutely no good for some songs (but those some songs can be most of the songs you listen to)
But since hard drive space is so plentiful nowadays, there's no reason not to go for 192 at the very least.

These days I rarely 'just' listen to music, I'm always doing something else and if somebody secretly changed all my music to 128kbps, I probably wouldn't realise :nana:
 
I agree 128kbps is absolutely no good for some songs (but those some songs can be most of the songs you listen to)
I don't know if I should :laughing: or :shakehead when I hear people say things like that. No good? Come on. Consider how and where you actually listen to the stuff.
These days I rarely 'just' listen to music, I'm always doing something else and if somebody secretly changed all my music to 128kbps, I probably wouldn't realise :nana:
Oh, you have. :sssh:

Much of my listening to MP3s is in my car - far from an ideal listening environment; there's engine noise, road noise, traffic noise, other environment noise - for example today there'll be weather and windscreen wiper noise - all competing with the MP3. Realistically, does it stand a chance? Or rather, would I stand a chance of being able to tell?
 
A few years back I did a double-blind test with this and the LAME algorithm with CDex. As I recall, I was testing Ogg Vorbis against MP3.

What I found is if you used a high quality CD with good mastering then the jump to 192kbps was well worth it. If it wasn't a high quality source (80% of all recordings) then 128 is fine.

I recall reading a year or two ago about a recording engineer doing A/B testing between VBR AAC and lossless formats using monitor speakers (i.e., very high quality studio speakers, not computer monitor speakers), who found 128 kbps VBR AAC to be indistinguishable from lossless. Of course, that's VBR MP4/M4A, not MP3.

I use 160 kbps VBR M4A for my own encoding. I tend to avoid any MP3 files under 192 kbps.
 
I use EAC and 192K VBR high quality with some songs bouncing up to 320k in some places as needed. I put 5000 songs on a 30 gig ipod that way.
 
I use EAC and 192K VBR high quality with some songs bouncing up to 320k in some places as needed. I put 5000 songs on a 30 gig ipod that way.

EAC? Is that a previously unmentioned codec or is that a typo? (Curious, not criticizing)
 