I broke down and got a Blu-Ray

LuxLuthor

Flashaholic
Joined
Nov 5, 2005
Messages
10,654
Location
MS
Onuris, it is the Samsung LN40B630, so it is 40" but full 1080p. We only sit about 8-9 feet away, so that seemed a pretty good size, but now I wish I had a 46". In any case it is right on that edge size you mentioned where it may not be as noticeable. But I guess for DVD upconverting, that is probably a better size anyway.
 

js

Flashlight Enthusiast
Joined
Aug 2, 2003
Messages
5,793
Location
Upstate New York
According to one of our vendor reps, several manufacturers will be displaying TVs and monitors with 2048p and 2160p quad HD resolution at CES next year. It will not be very long after that before they are on the market.

I believe that soon after that we will also be seeing A/V media on non-volatile SSDs at those resolutions. In this day and age of ever-increasing technology and the demand for it, progress is inevitable.

Most movies these days are produced digitally at 4k resolution, many at 8k, and the manufacturers have the technology and know-how to produce displays at those resolutions; in fact, there are some out there right now, just not for mass consumer purchase. The industry only releases advancements in certain increments relative to what it is capable of, in part due to development costs, and also to guarantee future upgrades and profits.

. . .

Whoop-de-freaking-do! So we will be "seeing" such things at conventions as proof-of-concept? So what? That absolutely does NOT mean that 2048p TVs are just around the bend and that they will become the new standard. Yes, technology is always on the march and standards will, in the long run, get better and better. I agree.

But not any time soon, Onuris. Not any time soon. Not for a decade or more. Heck, it will be another decade before the majority of people have 1080p TVs.

And 4k or 8k? Holy cow! And another format change? SSD?

I think you're underestimating the inertia of the consumer base, and I stand by what I said above, every word. Convenience and the ability to be streamed will win hands down over fantastic resolution. 1080p is already too high for that.
 

Onuris

Newly Enlightened
Joined
Jan 31, 2009
Messages
157
Location
NW Indiana
Thanks Onuris. It makes sense in that arena that you'd run nice everything, including cables, if for nothing more than the eye candy effect. It would be kind of strange to have a generic drugstore cable coming out of the back of a Krell, McIntosh, or similar flavor...lol. It would be like installing a vinyl interior on a Ferrari.

Regarding the HDMI format, I know it has received a lot of criticism, primarily from the industry, but on the other hand, someone had to decide on something. I know that component cables, contrary to some popular info, are capable of carrying a 1080p signal, but as I understand it the RCA connector could be the limiting factor. Is it true that RCA could limit the data transfer rate, or at least that it's pushing the rate capabilities now? I guess the industry could have gone with component, but they might have been forced to change connector types for future-proofing. Plus, if component manufacturers were to use optical again for sound, some would want the option of digital coax. I guess what I'm saying is that by not going with the HDMI format, manufacturers would be scrambling to get the correct number of component, optical, and digital coax inputs just like they did ten years ago, when few consumers were ever perfectly happy. The simplicity is a major advantage to 95+% of the consumer market with absolutely no downside for that same percentage. For those rare situations where sparklies and dropouts occur, the owner need only get a new cable.

Lastly, I have to ask whether any of the high-end HDMI cables you're commonly using have ferrite cores attached, and if not, why any manufacturer would be using them on digital signal carriers?

Thanks Onuris. :)


P.S. Thank you for those links too!

Yes, a similar flavor: Lexicon, Legacy, etc. Krell and McIntosh make some really nice stuff; we used a Krell Evo 707 processor and Evo mono amps in the big $1.2 mil install we did. I have seen but not worked with any McIntosh stuff, but I know it is nice. I like their classic retro look as well; we have considered carrying their products.

As far as an RCA connector limiting the data transfer rate, I don't believe it would on a cable designed for digital signals. In fact, I believe that an HD digital coaxial cable using RCA or BNC connectors could be designed to have the same or greater bit rates and pixel clock rates as HDMI and would be far superior to it, especially in the area of carrying a digital signal over longer distances. And while length is also a limiting factor, optical cables are the way to go for true high-end digital signal transmission between components. I am using HDMI, but the cables b/t the equipment are only 1 meter. The longest run I have is b/t the video processor in my equipment rack and the projector in the ceiling of my theater, and I am using component video for that.
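Just to put rough numbers on the bit rates involved, here is a quick back-of-the-envelope sketch. It only counts the raw active pixel data and ignores blanking intervals, link encoding overhead, and audio, so treat the figures as ballpark only:

```python
# Rough, illustrative numbers only: raw (uncompressed) active pixel data,
# ignoring blanking intervals, link encoding overhead, and audio.

def video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw video bit rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

formats = {
    "480p60": (720, 480, 60),
    "1080i60 (~30 full frames/s)": (1920, 1080, 30),
    "1080p60": (1920, 1080, 60),
}
for label, (w, h, fps) in formats.items():
    print(f"{label:28s} ~{video_gbps(w, h, fps):.2f} Gbit/s")
```

Uncompressed 1080p60 works out to roughly 3 Gbit/s of pixel data before any overhead, which is why the cable and link design matter far more than whether the plug on the end happens to be HDMI, RCA, or BNC.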

While it is true that an HDMI cable is carrying digital data, it is still using an electrical signal to do so; instead of the continuous waveform of an analog signal, it is a series of pulses that represent the bitstream. So it is susceptible to the same high-frequency/radio-frequency interference as an analog signal.
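As a toy illustration of that point (this is not HDMI's actual signaling, just a generic thresholded pulse train), you can see how a digital link shrugs off small amounts of interference and then suddenly starts throwing bit errors once the noise gets big enough:

```python
import random

# Toy model, NOT HDMI's real signaling: send random bits as +1/-1 levels,
# add Gaussian "interference," then decide each bit by thresholding at 0.
random.seed(0)
bits = [random.randint(0, 1) for _ in range(100_000)]

for noise_sigma in (0.2, 0.5, 1.0, 1.5):
    errors = 0
    for b in bits:
        level = (1.0 if b else -1.0) + random.gauss(0, noise_sigma)
        if (level > 0) != bool(b):
            errors += 1
    print(f"noise sigma {noise_sigma:.1f}: bit error rate {errors / len(bits):.4%}")
```

Below a certain interference level the picture is bit-perfect; above it you get the sparklies and dropouts mentioned earlier rather than the gradual softening you would see on an analog cable.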

Most of the high-end HDMI cables are sufficiently shielded that a ferrite suppressor is probably overkill. Kimber Kable is the only high-end manufacturer I am aware of that uses ferrites in addition to heavy shielding in their cables. Ferrites are mostly used in the lower-cost cables b/c they are effective and less costly than heavy shielding.

Oh, and a few hundred bucks for a cable is nothing in the high-end world. Check these out.

http://transparentcable.com/products/pdf/prices/retail_prices_04-2009.pdf

And for the record, LCDs are still inferior to plasmas :)

Not necessarily, it depends on the viewing environment and the quality of the component.

In a dark room, top-of-the-line plasma vs. top-of-the-line LCD, your statement would be true. But in a well-lit room, the LCD will be capable of a superior picture. As far as overall picture quality, plasmas excel in black levels/contrast, color accuracy and vibrancy, depth perception, motion accuracy/stability, and viewing angle. LCDs typically excel in grayscale accuracy and brightness.

LCDs also produce a much better static image than a plasma. Compare the two with channels such as Weatherscan or one of the music channels, for instance. The plasma image will look jagged, while the LCD will be sharp and well-defined. This is one reason why plasmas are not used for computer monitors. But for moving video, plasma is far superior.

Plasmas may also be susceptible to image burn-in, where static images are eventually etched into the glass element and remain there even when the signal is no longer present. An example would be the station icons in the lower right corner. While in the past images could burn in in as little as 15 minutes, plasmas have gotten much better; temporary ghosting (which can be washed out by displaying a gray screen) now takes over an hour to develop, and permanent burn-in takes well over 10 hours. LCD is completely immune to this, which is another reason why computer monitors are LCD.

Another area where LCDs are superior is longevity, although plasmas are getting better, esp. the high-end models. An LCD will last as long as its backlight, and most of those can be replaced. And many new sets use LED backlighting that can last over 100,000 hours. Not so with a plasma, which uses electric currents to excite a combination of noble gases (xenon, neon, argon) that have a limited life span, about 100,000 hours to their half-life, meaning at that point the phosphors will glow half as bright as they did when new. There is a long period where the phosphors remain as bright as when new/broken in, but after that they continue to degrade over time, and there is no way to recharge or replace these gases once that starts to happen. Lastly, LCDs are much more energy efficient; their power consumption is typically about half that of a plasma of the same size.
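If you model the dimming as a simple half-life curve, which is a simplification, and use that quoted 100,000-hour figure purely as a ballpark, the brightness over time looks something like this:

```python
# Simple half-life model of plasma phosphor dimming. The 100,000-hour
# half-life is just the ballpark figure quoted above, and the model
# ignores the initial break-in period.
HALF_LIFE_HOURS = 100_000

def relative_brightness(hours_used):
    return 0.5 ** (hours_used / HALF_LIFE_HOURS)

# Assuming roughly 6 hours of viewing per day:
for years in (1, 5, 10, 20):
    hours = years * 365 * 6
    print(f"{years:2d} years (~{hours:,} h): {relative_brightness(hours):.0%} of original brightness")
```

So the dimming is gradual even with heavy use, but unlike a backlight there is nothing to swap out once it starts.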

All that said, as the manufacturers continue to improve their products, the differences b/t the two formats are becoming increasingly less noticeable, esp. with LCDs.

The truly superior TV/monitor format would have to be Mitsubishi's LaserVue DLP. As with a front DLP projector, you simply cannot get a better picture at this time.
 

Onuris

Newly Enlightened
Joined
Jan 31, 2009
Messages
157
Location
NW Indiana
Whoop-de-freaking-do! So we will be "seeing" such things at conventions as proof-of-concept? So what? That absolutely does NOT mean that 2048p TVs are just around the bend and that they will become the new standard. Yes, technology is always on the march and standards will, in the long run, get better and better. I agree.

But not any time soon, Onuris. Not any time soon. Not for a decade or more. Heck, it will be another decade before the majority of people have 1080p TVs.

And 4k or 8k? Holy cow! And another format change? SSD?

I think you're underestimating the inertia of the consumer base, and I stand by what I said above, every word. Convenience and the ability to be streamed will win hands down over fantastic resolution. 1080p is already too high for that.

Hmm, this is like deja vu all over again, and again. Anytime a new type of TV, resolution, or format is revealed, many claim it will be at least 10 or 15 years before we see it. But in reality it always ends up being just a few years, sometimes sooner. That was true of LCDs and plasmas taking over CRT and DLP sales, 720 replacing 480, and 1080 replacing 720. I have been in this industry a long time and have seen the trends. Just a few years ago many were saying that there was no way we would be seeing 1080p anytime soon, that 720p was more than good enough. Now the top-selling TVs are 1080p VIZIOs. I am willing to bet that within 3-5 years 2048 or 2160 will be as common as 1080 is now, and media on SSDs will share the spotlight alongside BR and eventually replace it. The demand for new technology these days leads to it being marketed at an ever-increasing rate. And I am willing to bet that these new 2k sets will be OLEDs as well.
 

Onuris

Newly Enlightened
Joined
Jan 31, 2009
Messages
157
Location
NW Indiana
Onuris, it is the Samsung LN40B630, so it is 40" but full 1080p. We only sit about 8-9 feet away, so that seemed a pretty good size, but now I wish I had a 46". In any case it is right on that edge size you mentioned where it may not be as noticeable. But I guess for DVD upconverting, that is probably a better size anyway.

Those Samsungs are at the top of their class for the price point. Great color accuracy with their color enhancer, and a nice fast refresh rate and response time. Decent black levels as well. I know how you feel about the size; to me anything under 50" or so seems too small. Yeah, if you are upconverting a DVD signal, the smaller screen will make the image look more detailed.
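That "edge size" can be roughed out with a little trigonometry. As a rule of thumb (and it is only a rule of thumb), if a single pixel subtends less than about one arcminute at your eye, you will not resolve individual 1080p pixels from that seat:

```python
import math

# Rule-of-thumb check: can you resolve individual 1080p pixels at a given
# screen size and viewing distance? Assumes ~1 arcminute of visual acuity
# and a 16:9 panel; ignores content quality and scaler differences.
ACUITY_ARCMIN = 1.0

def pixel_arcmin(diagonal_in, distance_ft, horizontal_pixels=1920):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 screen width
    pitch_in = width_in / horizontal_pixels           # width of one pixel
    return math.degrees(math.atan(pitch_in / (distance_ft * 12))) * 60

for diag in (40, 46, 52):
    for dist in (8, 9):
        a = pixel_arcmin(diag, dist)
        verdict = "resolvable" if a > ACUITY_ARCMIN else "not resolvable"
        print(f'{diag}" at {dist} ft: {a:.2f} arcmin/pixel -> pixels {verdict}')
```

By this rough measure, all of those combinations land just under the threshold at 8-9 feet, which is consistent with it being right on that edge.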
 

LuxLuthor

Flashaholic
Joined
Nov 5, 2005
Messages
10,654
Location
MS
Onuris, if I were to find an excuse to move that 40" up to the bedroom and get a 46-50" for the main family room that would give nice performance in daylight (curtains open) and ideal performance in a dark room about 15'x15', and under $2K, what would you recommend?

I was looking at the newer 240 Hz models and wonder if that makes that much more difference for motion performance, such as this Samsung LN46B750 that is $1,530.

I don't know how much difference going from 46 to 50" makes either.
 

js

Flashlight Enthusiast
Joined
Aug 2, 2003
Messages
5,793
Location
Upstate New York
Hmm, this is like deja vu all over again, and again. Anytime a new type of TV, resolution, or format is revealed, many claim it will be at least 10 or 15 years before we see it. But in reality it always ends up being just a few years, sometimes sooner. That was true of LCDs and plasmas taking over CRT and DLP sales, 720 replacing 480, and 1080 replacing 720. I have been in this industry a long time and have seen the trends. Just a few years ago many were saying that there was no way we would be seeing 1080p anytime soon, that 720p was more than good enough. Now the top-selling TVs are 1080p VIZIOs. I am willing to bet that within 3-5 years 2048 or 2160 will be as common as 1080 is now, and media on SSDs will share the spotlight alongside BR and eventually replace it. The demand for new technology these days leads to it being marketed at an ever-increasing rate. And I am willing to bet that these new 2k sets will be OLEDs as well.

Well, you may turn out to be spot on, Onuris! We shall see, n'est-ce pas?

But, I would point out that nearly ALL of the people I know right now have standard-definition TVs. Not even 720p. So if in 3-5 years 2048 is "as common as" 1080p is right now, it would definitely NOT fit my own definition of mainstream.

What I was predicting was that 1080p will indeed eventually become mainstream--as in the majority of homes having 1080p sets--and probably in only 3-5 years. BUT, I was (and still am) predicting that something like 2048 will not become mainstream for many, many years to come, because the source media format will change from discs to internet streams or iTunes-type file formats. And unless my calculations are off, not even the most extreme high-end internet connection can stream a BR-disc-quality video clip! Even standard DVD quality isn't able to be streamed over most people's internet connections. But that's OK. Fewer fps, somewhat lower resolution? Not a big deal down to a certain point, really. But the convenience factor IS a big deal for most people.
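To put some rough numbers behind that, here is a quick comparison sketch; the bit rates are just typical ballpark figures for the formats and for common broadband tiers today, not exact specs:

```python
# Ballpark comparison only: typical source bit rates vs. typical broadband
# tiers. None of these are exact format specs.
sources_mbps = {
    "DVD (typical)": 6,
    "Blu-ray (typical)": 25,
    "Blu-ray (near max)": 40,
}
connections_mbps = {
    "DSL ~3 Mbit/s": 3,
    "cable ~10 Mbit/s": 10,
    "premium cable ~20 Mbit/s": 20,
}

for src, need in sources_mbps.items():
    for conn, have in connections_mbps.items():
        verdict = "streams in real time" if have >= need else "cannot stream in real time"
        print(f"{src:20s} over {conn:24s}: {verdict}")
```

Which is the crux of it: the pipes most people actually have are nowhere near a BR bitstream, so the streamed version will be compressed well below disc quality.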

Anyway, I'm sure you understand my point so I will stop belaboring it!
 

Onuris

Newly Enlightened
Joined
Jan 31, 2009
Messages
157
Location
NW Indiana
Onuris, if I were to find an excuse to move that 40" up to the bedroom and get a 46-50" for the main family room that would give nice performance in daylight (curtains open) and ideal performance in a dark room about 15'x15', and under $2K, what would you recommend?

I was looking at the newer 240 Hz models and wonder if that makes that much more difference for motion performance, such as this Samsung LN46B750 that is $1,530.

I don't know how much difference going from 46 to 50" makes either.

I don't really see any difference in the 240 Hz models as compared to those that refresh at 120 Hz. With either, depending on the source material and the brand or individual set, there still may be some slightly noticeable motion blur.

First, a little tech lesson on the whole issue. The main reason that LCDs suffer from motion blur is that they flip from one frame to the next at a rate of 60 frames per second, since that is the frequency of the household current supplied to the set. Part of the blur perception occurs because the image for each frame is held frozen on the screen for 1/60th of a second and then abruptly shifted to the next frame. This is known as "sample and hold." Plasmas and DLPs don't hold the frame like that, but rather pulse more smoothly from one frame to the next with room for some downtime. That downtime is part of what contributes to their ability to have outstanding contrast/black levels.

To compound the issue, movies are recorded and shown in theaters at 24 fps (frames per second), but the NTSC standard for TV, DVD, BR, and other formats is 29.97 fps. This creates an issue since there are in effect about 6 frames per second missing in the conversion. To compensate, a technique known as 3:2 pulldown is used to fill in the missing frames. These extra frames create a jittery motion artifact known to us video geeks as "judder."

Originally all LCDs had a refresh rate of 60 Hz. Our original source material has a rate of 24 fps, which does not divide into 60 evenly, so the 3:2 pulldown was required to display the information on a set refreshing at 60 Hz. But 24 does go into 120 and 240 evenly, so if the film is displayed at its native frame rate of 24 fps, which most BR players are capable of, the judder is eliminated. But there is still the issue of the image being held static and flipped from frame to frame. No matter how fast the refresh rate (240 Hz, 480 Hz, or higher), it will still not be as smooth as a plasma or DLP.
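If it helps to see the cadence written out, here is a quick sketch of how one second of 24 fps film gets mapped onto different display rates. It is only a cadence illustration, not a real telecine or deinterlacer:

```python
# Simplified cadence illustration, not a real telecine or deinterlacer.
# 3:2 pulldown holds one film frame for 3 fields, the next for 2, to
# stretch 24 film frames into 60 fields (~30 frames) per second.

def pulldown_fields(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2          # the "3:2" pattern
        fields.extend([frame] * repeats)
    return fields

film = [f"F{i:02d}" for i in range(24)]           # one second of 24 fps film
fields = pulldown_fields(film)
print(len(fields), "fields/s at 60 Hz:", fields[:10], "...")   # uneven 3/2 repeats -> judder

# 24 divides evenly into 120 and 240, so each film frame can simply be
# repeated the same number of times and the uneven cadence goes away.
for refresh in (120, 240):
    print(f"{refresh} Hz: each film frame shown {refresh // 24} times, evenly")
```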

So the older 60 Hz sets were notoriously bad in regard to image blur, and going to 120 Hz, while not completely eliminating it, was such a great improvement that it made the spec important. I think the manufacturers are taking advantage of that with the new 240 Hz specs. While I am sure it does not hurt to have a 240 Hz refresh rate, and I am sure it will show better performance in a bench test, it is my opinion that this is just one more spec to justify higher prices and make people want to upgrade to what is perceived to be the next best thing. Our eyes really cannot see the difference b/t the two, as we can only really notice differences at about 70 Hz and under.

That is a good price for the 46" B750 series set, but you are still paying for the 240 Hz feature as opposed to screen size. If you really want an LCD or do not want to go too big on the screen size, I would suggest the 55" Samsung LN55B650 or the 52" Sony KDL52XB9R over that one. I did a quick search, and they are both available at several places for under $2k.

But if you don't think the screen size is too much, my top choice under $2k by far would have to be the 65" Mitsubishi WD-65837 DLP. Much better picture than the LCDs. The only drawbacks are that, like an LCD, there is a sweet spot, so you don't get as good a picture off to the sides or at an angle, and the fact that in a couple/few years you will have to replace the lamp, currently about $120.

Well, you may turn out to be spot on, Onuris! We shall see, n'est pas?

But, I would point out that nearly ALL of the people I know right now have standard-definition TVs. Not even 720p. So if in 3-5 years 2048 is "as common as" 1080p is right now, it would definitely NOT fit my own definition of mainstream.

What I was predicting was that 1080p will indeed eventually become mainstream--as in the majority of homes having 1080p sets--and probably in only 3-5 years. BUT, I was (and still am) predicting that something like 2048 will not become mainstream for many, many years to come, because the source media format will change from discs to internet streams or iTunes-type file formats. And unless my calculations are off, not even the most extreme high-end internet connection can stream a BR-disc-quality video clip! Even standard DVD quality isn't able to be streamed over most people's internet connections. But that's OK. Fewer fps, somewhat lower resolution? Not a big deal down to a certain point, really. But the convenience factor IS a big deal for most people.

Anyway, I'm sure you understand my point so I will stop belaboring it!

Oui, je vous comprends! Put that way, I will have to agree that as far as the 1080p format becoming mainstream, where most people will have it, yes, it will be a few more years, I am sure. I was almost going to definitively post that 1080p is mainstream right now (who has 720p anymore?), but I have to consider that I hang in different circles than most, so my definition of mainstream is a bit higher-end. And I also realized I would be somewhat of a hypocrite, as the 40" Sony in the kid's playroom is still 720p. As far as the 2k formats, yes, it will be quite some time before they are mainstream; 10 or more years would be about right. But they will be available within the next couple of years for those of us who want them.
 

Flashanator

Flashlight Enthusiast
Joined
Jan 19, 2007
Messages
1,203
Location
The 11th Dimension
Hey Lux! 'Bout time you went Blu. :cool: Come into the light.

Some randoms I'd recommend:

North by Northwest looks fantastic.
Superman Returns (the 2008 release) is the old Warner Bros encode from HD DVD but still looks great; the film is meant to look soft anyway.
 

da.gee

Enlightened
Joined
Aug 30, 2007
Messages
733
Thanks Onuris. I've always wondered what 3:2 pulldown was. Good explanation.

I have had the 52" Sammy 650 for about a year and a half and it is a fantastic set. I did a ton of research before purchasing and it was very well regarded in the price range. Very pleased. Plenty of inputs and a great picture. Highly adjustable. You can tweak to your heart's content. Everyone comments on the excellence of the setup. It's well below $2K now. I think I purchased for ~$2,200 in June 2008.

Lux: get the bigger screen if in doubt! I went back and forth between the 46" and 52" and I'm so glad I went with the larger. It seemed huge at first but now just blends in, and I'm never sorry I did it. Viewing distance in our world is 8' - 11' and I find my old eyes like the size just fine. Of course, one of my criteria was that I had to be able to see the score from the kitchen 20' away, so YMMV. One consideration: if you do any gaming that uses split screen for two or more people, the extra inches really help.

Right now my audio setup is sorely in need of some upgrading, but those funds always seem to get directed at things like food, kid's tuition, gas, electricity, flashlights, etc. I'm limping along with a Yamaha surround system from ten years ago. The receiver has one coax and ONE optical in (I use a three-way splitter to it at least), no HDMI, one component in, and many other deficiencies. The horror!!!! I need a solid mid-range receiver/amp and will begin my audio build-out from there.
 
Last edited:

LuxLuthor

Flashaholic
Joined
Nov 5, 2005
Messages
10,654
Location
MS
Lux: get the bigger screen if in doubt! I went back and forth between the 46" and 52" and I'm so glad I went with the larger. It seemed huge at first but now just blends in, and I'm never sorry I did it. Viewing distance in our world is 8' - 11' and I find my old eyes like the size just fine. Of course, one of my criteria was that I had to be able to see the score from the kitchen 20' away, so YMMV. One consideration: if you do any gaming that uses split screen for two or more people, the extra inches really help.

After this thread, I have had the tape measure out a number of times....including while the wife is watching...trying to demonstrate how much more screen space we could have. When asked about getting to the blocked cabinets behind it, I reminded her they are filled with obsolete VHS tapes, and also mentioned how nice this 40" TV would look in the bedroom! LOL! :whistle:
 

Mjolnir

Flashlight Enthusiast
Joined
Dec 19, 2008
Messages
1,711
My sitting position is about 7 feet away from my 50 inch plasma, and I still wouldn't mind a larger TV. This is partly because most Blu-ray movies are not filmed in a 16:9 aspect ratio, so there are bars wasting space on my TV.

Onuris, do you know why movies are only filmed at 24 fps? I have never really understood why they use such a low frame rate, especially considering it is lower than that of normal television. I know that with a video game, a frame rate of 24 fps is not all that tolerable.

I haven't really seen LCD TV companies advertising anything about the response times for the panels. Doesn't a slow response time cause more problems than a lower refresh rate?
Also, why aren't TVs designed with the capability of a 48 Hz refresh rate? Many smaller LCD computer monitors can switch between 60 Hz and 75 Hz, so why can't they do this with an LCD TV?
 

LEDninja

Flashlight Enthusiast
Joined
Jun 15, 2005
Messages
4,896
Location
Hamilton Canada
Onuris, do you know why movies are only filmed at 24 fps? I have never really understood why they use such a low frame rate, especially considering it is lower than that of normal television. I know that with a video game, a frame rate of 24 fps is not all that tolerable.
Many movies are filmed using real film: a piece of celluloid with holes down both sides. Sprockets push the film forward one frame at a time. Then the film stops and the shutter opens to take a still picture. Then the film is pushed forward another frame. There is a limit to how fast this start-stop can be done without ripping the film.
In addition to the danger of ripping the film, going to a higher frame rate means using more film per minute, resulting in bigger cameras. The cost of the film for distribution also has to be considered. Going from 24 to 30 fps means adding another 43,200 frames to a 2-hour film. That is the equivalent of about 600 rolls of 36-exposure 35mm still-camera film. Multiply that by 3000 theaters*** and 52 new releases per year and the cost is huge. (*** I am assuming 3000 North American theaters and a similar number of foreign theaters. Where theaters have sufficient security against bootlegging, the studios will ship the movie on reusable HDs, but for the rest real film is used to prevent high-quality bootlegging. Also, a lot of movie theaters only have film equipment and have not upgraded to digital projectors.)
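Roughly where those numbers come from, spelled out (my own assumptions here: a 2-hour film, and one 35mm still frame covering about two motion-picture frames of stock):

```python
# Back-of-the-envelope version of the numbers above. Assumptions: a 2-hour
# film, and one 35mm still-camera frame covering about two motion-picture
# frames of stock.
extra_fps = 30 - 24
seconds = 2 * 60 * 60
extra_frames = extra_fps * seconds              # 6 * 7,200 = 43,200 frames
still_equivalents = extra_frames / 2            # movie frame ~ half a still frame
rolls_per_print = still_equivalents / 36        # 36-exposure rolls

theaters = 3000                                 # assumed North American prints
releases_per_year = 52
print(f"extra frames per print: {extra_frames:,}")
print(f"~{rolls_per_print:.0f} rolls of 36-exposure film per print")
print(f"~{rolls_per_print * theaters * releases_per_year:,.0f} extra rolls' worth per year")
```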
A few years ago someone developed a film system running at double the frame rate. Roger Ebert was given a demo and reported a smoother motion picture. But the studios nixed it due to cost.

Why 30, 60, 120, 240 Hz? It is a 60 Hz line frequency only in North America. The rest of the world has 50 Hz, and 30, 60, 120, and 240 do not divide well into 50. That is why North America had 525i @ 30 Hz NTSC TVs and most of the rest of the world has 625i @ 25 Hz PAL or SECAM TVs.
BTW, movies in Europe are filmed at 25 fps, which matches the 25 Hz frame rate of their TVs exactly.

You have to remember 24 fps was standardized when film was still black and white and people wrote letters with fountain pens. I don't think manual typewriters were invented yet, let alone electric typewriters or word processors or computers or txt-ing cellphones with keyboards.
 
Last edited:

blasterman

Flashlight Enthusiast
Joined
Jul 17, 2008
Messages
1,802
A few years ago someone developed a film system running at double the frame rate

Douglas Trumbull developed the system way back in the '80s.

As for movie theaters, almost all in my area have upgraded to DLP. I couldn't care less how many are still using film projectors, because they are typically second-run houses. I won't pay for a ticket in a film-based house because the technology is terrible and always has been, except for maybe 70mm. When watching feature films, I used to count the number of emulsion changes in the interpositive if I was bored with the film.

If anything, it's the film-based houses holding up motion picture technology. Shooting faster than 24 fps really improves quality, particularly with action sequences, but older houses can't display it.

Also, most people I know running HD sets have a maximum of 1080i display. Personally, I'd rather watch regular DVD in progressive than endure 2 hours of 1080i strobing and motion artifacts. As soon as an actor walks across the screen or there's a pan shot, I get a migraine.
 

Onuris

Newly Enlightened
Joined
Jan 31, 2009
Messages
157
Location
NW Indiana
Many movies are filmed using real film: a piece of celluloid with holes down both sides. Sprockets push the film forward one frame at a time. Then the film stops and the shutter opens to take a still picture. Then the film is pushed forward another frame. There is a limit to how fast this start-stop can be done without ripping the film.
In addition to the danger of ripping the film, going to a higher frame rate means using more film per minute, resulting in bigger cameras. The cost of the film for distribution also has to be considered. Going from 24 to 30 fps means adding another 43,200 frames to a 2-hour film. That is the equivalent of about 600 rolls of 36-exposure 35mm still-camera film. Multiply that by 3000 theaters*** and 52 new releases per year and the cost is huge. (*** I am assuming 3000 North American theaters and a similar number of foreign theaters. Where theaters have sufficient security against bootlegging, the studios will ship the movie on reusable HDs, but for the rest real film is used to prevent high-quality bootlegging. Also, a lot of movie theaters only have film equipment and have not upgraded to digital projectors.)
A few years ago someone developed a film system running at double the frame rate. Roger Ebert was given a demo and reported a smoother motion picture. But the studios nixed it due to cost.

Why 30, 60, 120, 240 Hz? It is a 60 Hz line frequency only in North America. The rest of the world has 50 Hz, and 30, 60, 120, and 240 do not divide well into 50. That is why North America had 525i @ 30 Hz NTSC TVs and most of the rest of the world has 625i @ 25 Hz PAL or SECAM TVs.
BTW, movies in Europe are filmed at 25 fps, which matches the 25 Hz frame rate of their TVs exactly.

You have to remember 24 fps was standardized when film was still black and white and people wrote letters with fountain pens. I don't think manual typewriters were invented yet, let alone electric typewriters or word processors or computers or txt-ing cellphones with keyboards.

Pretty much hit the nail on the head there. I would also add that 24 fps is about the lower limit at which most people do not see any noticeable flickering from frame to frame, so that factored into the spec as well.

Some digital HD programs are recorded and broadcast at much higher speeds, such as 180 fps for NASCAR races and 600 fps for NHRA races. Makes for some very smooth and detailed slow-motion playback.
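A quick way to see why that slow motion looks so smooth: played back at normal display rates, the high-speed footage stretches out by a big factor while still delivering a brand-new frame for every frame you see (the capture rates below are the ones quoted above; the playback rates are just typical broadcast/display rates):

```python
# How far the quoted capture rates can be slowed while still giving a
# brand-new frame for every displayed frame, at typical playback rates.
captures = {"NASCAR coverage": 180, "NHRA coverage": 600}

for name, capture_fps in captures.items():
    for playback_fps in (30, 60):
        factor = capture_fps / playback_fps
        print(f"{name}: {capture_fps} fps played at {playback_fps} fps -> {factor:.0f}x slow motion")
```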

There are cameras capable of recording at 1 million fps. Check this out:

http://www.youtube.com/watch?v=QfDoQwIAaXg
 

Onuris

Newly Enlightened
Joined
Jan 31, 2009
Messages
157
Location
NW Indiana
I haven't really seen LCD TV companies advertising anything about the response times for the panels. Doesn't a slow response time cause more problems than a lower refresh rate?
Also, why aren't TVs designed with the capability of a 48 Hz refresh rate? Many smaller LCD computer monitors can switch between 60 Hz and 75 Hz, so why can't they do this with an LCD TV?

The main reason that response times are not advertised is that there is no industry-standard definition for them. Most of the big-name manufacturers define it as the time it takes for the pixels to go from full off (black) to full on (white) and back to full off again. Some of the cheaper manufacturers use a gray-to-white-to-gray measurement, or black-to-white only or white-to-black only, which is only half the transition, to make their sets appear faster than they actually are. Also, many manufacturers will give a value that is only accurate under the most ideal circumstances. It is better to have a set with a good video processor and a spec of 8 ms that stays close to 8-12 ms with actual program material than one that claims 4 ms but actually varies b/t 8 ms and 24 ms with program material.
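One way to see why the "with actual program material" number matters is to compare the response time to how long each frame is actually on screen; here is a quick sketch using the figures above:

```python
# Compare pixel response times to the frame period at common refresh rates.
# If the transition takes longer than one frame period, the previous image
# is still fading while the next one is drawn, which shows up as smearing.
response_times_ms = {
    "claimed 4 ms (ideal case)": 4,
    "honest 8-12 ms set (worst case)": 12,
    "'4 ms' set with real program material": 24,
}

for refresh_hz in (60, 120):
    frame_ms = 1000 / refresh_hz
    print(f"-- {refresh_hz} Hz: each frame lasts {frame_ms:.1f} ms --")
    for label, rt in response_times_ms.items():
        print(f"   {label}: transition spans {rt / frame_ms:.1f} frame(s)")
```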

The reason that TVs have refresh rates of 60 Hz, 120 Hz, or 240 Hz is that they are refreshing off of the household power line frequency, which is 60 Hz, so the value has to be a multiple of that. Computer monitors, on the other hand, refresh off of the video card, so the refresh rate can be set to whatever value the card supports.

The response times and refresh values do not necessarily mean much anyway. Nothing beats actually sitting down and viewing and comparing. A set can look great on paper, but not look that great when auditioned. And some of the sets that have the best pictures have fairly mediocre specs on paper.
 
Last edited:

LuxLuthor

Flashaholic
Joined
Nov 5, 2005
Messages
10,654
Location
MS
I have known about the lack of standards for defining response time, and find it shocking that a clearly defined measurement standard has not been established after all this time.

The problem is that the average person cannot get a fair and optimal viewing in most locations. Going to your local Best Buy, Sears, or some other electronics store does not mean they actually set up the display correctly. Then there is the issue of looking at a wall of displays all being fed a signal from whatever splitter/switch setup is used.

You almost have to rely on reviews from various websites.
 

Mjolnir

Flashlight Enthusiast
Joined
Dec 19, 2008
Messages
1,711
I have known about the lack of standards for defining response time, and find it shocking that a clearly defined measurement standard has not been established after all this time.

The problem is that the average person cannot get a fair and optimal viewing in most locations. Going to your local Best Buy, Sears, or some other electronics store does not mean they actually set up the display correctly. Then there is the issue of looking at a wall of displays all being fed a signal from whatever splitter/switch setup is used.

You almost have to rely on reviews from various websites.

The last time I went to Circuit City, even the higher-end 1080p panels (both LCD and plasma) had a significantly worse picture than my 720p plasma because the signal was split between over a dozen TVs.
On top of that, TVs in those stores often have the image settings calibrated completely differently than you would at home, so it is harder to tell the actual differences in color and contrast between sets.
 