720p vs 1080i rips

^- I could reply to that, but I'll just get me some popcorn and wait for narf to come home.

I don't need narf to tell me I'm wrong, because I know I'm not.

Think about why the combing effect exists. Would it really be a problem if everything was filmed at 25 fps? No. Would there be any need for companies like Faroudja to develop advanced deinterlacing algorithms? No.

I posted an example above. It is really not that difficult, but just to clarify:

The video is stored at 25 frames per second, but each frame contains two fields. These fields can be used to render one full resolution image, or two images with a lower vertical resolution.

A good deinterlacer knows when to switch between (or combine) the two modes.

When interlaced video has been deinterlaced properly, the result will be 50 progressive frames per second.
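The two modes a deinterlacer switches between can be sketched in a few lines of Python with numpy, using a made-up 4x4 test frame standing in for real video (the values are arbitrary, just a sketch of the idea):

```python
import numpy as np

# Hypothetical 4x4 frame standing in for one interlaced video frame.
frame = np.arange(16).reshape(4, 4)

top_field = frame[0::2]      # even lines (rows 0 and 2)
bottom_field = frame[1::2]   # odd lines (rows 1 and 3)

# "Weave" mode: interleave the fields back into one full-resolution image.
# Right for film material, where both fields were exposed at the same instant.
woven = np.empty_like(frame)
woven[0::2] = top_field
woven[1::2] = bottom_field

# "Bob" mode: treat each field as its own picture and line-double it,
# giving two images at half vertical resolution but twice the rate.
# Right for video material, where the fields are 1/50 s apart.
bob_first = np.repeat(top_field, 2, axis=0)
bob_second = np.repeat(bottom_field, 2, axis=0)
```

Weave gives you one full-resolution frame per 1/25 s; bob gives you two line-doubled images per 1/25 s, i.e. 50 images per second.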
 
You need to learn what interlacing is, and how it works. right...

The studio footage is shot at 50 frames per second. really 50 frames? I'd say 25 frames, 50 fields Interlacing is used to save bandwidth. yep Why do you think it looks so much smoother when watching it on BBC HD? It has nothing to do with post processing like the "100Hz" modes on some TVs. those are created to suck money from your pocket, some now come with 600Hz :lol: Turn it off and see for yourself.

Your TV has built in deinterlacing that switches between "film" and "video" mode automatically. my TV does not deinterlace, there is no need for deinterlacing with cathode ray tubes

Video = 50 frames per second, stored using interlacing (as 50 fields). 50 frames or 50 fields, make up your mind. There's a huge difference: one frame consists of two fields. The downside of using this method is that the resolution will decrease when there is motion in the scene. That is why 720p50 is better for TV broadcast. You get 50 frames at "full" resolution.

Film = 25 frames per second, stored using interlacing, but they can be restored to full resolution when the two fields in each frame are combined.

Top Gear uses both of these methods. The studio footage is 50 fps, I thought you said 50 fields per second, that'd be 25fps (frames per second). and the films are shot at 25 fps. You can clearly see the difference when watching it on BBC HD.

This is how interlacing works:

One interlaced frame:
[image attachment]


Two deinterlaced frames:
[image attachment]
[image attachment]


As you can see, the two deinterlaced frames contain different information. If you display the frame without deinterlacing, you will get the infamous combing effect. If you deinterlace the two fields to 25 fps, the resulting frame will be a combination of the two. Proper deinterlacing like the example above will result in 50p video. You cannot turn 50 fields per second into 50 frames per second without making up half the information in those 50 frames. As you say yourself, a field only has half the vertical resolution. The vertical resolution of the two resulting frames is 1920x540 wouldn't 1920 (or 1440) be the horizontal resolution? (or 1440x540 on broadcasts like the ones on BBC HD).
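The combing effect itself is easy to reproduce in a sketch: weave two fields that were scanned at different instants and the object's position alternates between rows. A minimal Python illustration, using a hypothetical one-pixel bright object that moves between the two field scans:

```python
# A hypothetical bright one-pixel object moving right between field scans.
WIDTH, HEIGHT = 6, 4

def picture(x):
    """Full progressive picture with the object at column x."""
    return [[1 if col == x else 0 for col in range(WIDTH)]
            for _ in range(HEIGHT)]

pic_top = picture(1)      # scene when the top field was scanned
pic_bottom = picture(3)   # scene 1/50 s later, at the bottom field scan

# Displaying both fields as one frame (no deinterlacing) weaves them:
combed = [pic_top[r] if r % 2 == 0 else pic_bottom[r]
          for r in range(HEIGHT)]
# Even rows show the object at column 1, odd rows at column 3: comb teeth.
```

With no motion between the fields, the weave would be a perfect full-resolution frame; motion is what turns it into the comb pattern.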

^- I could reply to that, but I'll just get me some popcorn yummy and wait for narf to come home.

I don't need narf to tell me I'm wrong nobody needs me, because I know I'm not. especially if failure is not even considered a potential possibility of an option. Your last name must start with C and end in larkson.

Think about why the combing effect exists. It exists because simple displays show both fields at once to make one full frame. Do I get a happy face sticker now? Would it really be a problem if everything was filmed at 25 fps? No. Would there be any need for companies like Faroudja to develop advanced deinterlacing algorithms? No.

I posted an example above. It is really not that difficult, but just to clarify:

The video is stored at 25 frames per second, but each frame contains two fields. These fields can be used to render one full resolution image, or two images with a lower vertical resolution.

A good deinterlacer knows when to switch between (or combine) the two modes.

When interlaced video has been deinterlaced properly, the result will be 50 progressive frames per second. Again, to get 50 proper progressive frames you would need 100 fields. Imagine 2x2 video at one frame per second, progressive. Its interlaced counterpart would have two fields per second with 2x1 pixels per field. Converting those fields to progressive video cannot result in two frames per second with 2x2 pixels each unless you make up half of the pixels.
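That 2x2 thought experiment can be written out in a few lines of plain Python (made-up pixel values) to show that half of each rebuilt frame was never transmitted:

```python
# Two original 2x2 progressive frames, one per second (made-up pixel values).
frame_a = [[1, 2],
           [3, 4]]
frame_b = [[5, 6],
           [7, 8]]

# Interlaced transmission keeps only alternating lines: the top field of A
# and the bottom field of B are all that reach the receiver.
field_1 = frame_a[0]   # [1, 2]
field_2 = frame_b[1]   # [7, 8]

# Rebuilding two full 2x2 frames forces the deinterlacer to invent the
# missing line of each frame (here by simple line doubling):
rebuilt_a = [field_1, list(field_1)]   # bottom line is made up
rebuilt_b = [list(field_2), field_2]   # top line is made up
```

The rebuilt frames have the right dimensions, but half their pixels are guesses, which is exactly the point being argued.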
 
Feel free to disagree, but please don't be dicks about it. ;)
 
It seems a little odd to discuss with someone who tries so hard to misunderstand as you do, but anyway:

really 50 frames? I'd say 25 frames, 50 fields

That's just a stupid comment, as I pointed out that the 50 frames (or images or whatever you want to call them) are stored using interlacing, which obviously results in two fields in each frame. I expected you to understand but you didn't. Sorry.

those are created to suck money from your pocket, some now come with 600Hz

Couldn't agree more. The result will always be rubbish when you try to create images that don't exist.

my TV does not deinterlace, there is no need for deinterlacing with cathode ray tubes

Well, I thought most people had flat screens by now. However, good (100Hz) CRTs deinterlace like any progressive display. If you don't want deinterlacing, you have to go for an ancient 50Hz TV.

50 frames or 50 fields, make up your mind. There's a huge difference, one frame consists of two fields

I don't need to make up my mind, since you obviously know what I mean. But just to make it easier: 50 images per second. Ok?

You cannot turn 50 fields per second into 50 frames per second without making up half the information in those 50 frames. As you say yourself, a field only has half the vertical resolution.

I didn't say anything about full resolution - you made that up. Why didn't you comment on what I wrote about 720p50 being a better solution? I know perfectly well that the resulting frames (yes, frames, more on that later) don't have full 1080p resolution (the picture information that is). I never said they did.

nobody needs me

Dr_grip does :D

It exists because simple displays show both fields at once to make one full frame. Do I get a happy face sticker now?

A smiley will have to do: :cool:

I don't really know why you keep telling me I'm wrong, when it seems that you actually agree. That is the whole point of my posts - when you deinterlace to 25 progressive frames like the scene does, you lose information that can never be restored. The studio scenes will stutter because 50 images have been reduced to 25 per second. Please note that I'm saying images here, to avoid confusion :rolleyes:

Again, to get 50 proper progressive frames you would need 100 fields.

Well, that depends on what you mean by "proper".

As I said before, after the 50 video fields have been properly deinterlaced, you end up with 50 unique images. Since they are now progressive images, it is not at all wrong to call them frames. The deinterlacer/scaler will resize them to full vertical resolution but that doesn't mean the actual resolution is better. It's just like digital zoom on a camera - it just gets bigger, not better.

Having said that, you still end up with 1080p50 after deinterlacing the video. It's not really a great solution - 720p50 is better (and easier) for progressive displays.

If you think that's not the case, look at how the image processing in modern TVs/projectors/whatever works. They use a fixed frame rate of either 50 or 60 fps (if you turn the 100Hz/800Hz/whatever :rolleyes: mode off).

Interlaced video is deinterlaced to progressive, and the display renders 1080p50.

The resolution of the actual picture information is not relevant here. The interesting thing is that so many people seem to misunderstand what interlacing is. The most common method of "getting rid of it" is to simply merge the two fields into one progressive frame, and that's not a very smart thing to do when you are dealing with 50Hz video.
 
I'd just like to report that I'm another *very* happy downloader of the 1080i rips. I downloaded both the 720p and the 1080i for the first episode, and there is no doubt the 1080i looks much sharper and the motion is smoother than with the 720p. I don't know what it is about the 720p rip, but it just doesn't have the same smooth flow as the 1080i, and certainly not the excellent resolution. Maybe the deinterlacer in my WDTV Live is very good so I don't see the tearing effects, but whatever it is; there is NO DOUBT in my mind the 1080i rip looks much better.

I'm not interested in hearing all the technical reasons as to why I'm wrong about it; as long as they're offered, and I hope they are, I'm only downloading the 1080i in the future.
 
Don't worry, already collected my anonymous zero-power negrep labelled "dickishness" from him :lol:

http://lurkertech.com/lg/fields/#whatsafield

Could you please quote the part of my text that fits your description of "dickishness"?

You didn't even try to answer my last post, and the link you provided proves my point. If you deinterlace video material to 25 progressive frames, you lose information that cannot be restored, and the result will be stuttering.

Progressive displays are not designed to render an interlaced image. You need to deinterlace it, and the result, that is the image that the display shows, is progressive, and it does contain 50 unique images per second. If the source is video, every frame rendered will contain different data, and if you don't want to call that 50 frames per second, well that's your problem.

I know that you understand this, but for some reason you try to make fun of me. Could you please tell me why you do this?

What happens before you see the video is irrelevant to most people. What should be interesting is whether you lose information or not. With 1080i you have the possibility of getting 50 unique images per second, with 25p you don't.

It's really that simple (even though deinterlacing is far from a simple procedure).

For those who want to see the difference between the two rips, remember to use a screen refresh rate of 50, 75 or 100 Hz. Otherwise you will get a small amount of stuttering when the display converts from 50 to 60 (or whatever you use) Hz.

Also remember to use proper deinterlacing, not the simple ones that just merge two fields into one image. In Media Player Classic, choose "Filters" and "Video Mixing Renderer" to confirm that the output is 50 fps (it usually takes a while for it to reach 50, but 49.xx is ok).
 

[attachment: framerate.jpg]
In response to the first post - the BBC HD channel in the UK (via satellite - not sure about cable or Freeview HD when it's launched) is indeed 1440x1080i. It's simply stretched to 1920x1080 when it arrives, which the set top box does. The reason it's an odd resolution is so that the BBC can save a little bit of bandwidth. Luxe TV on Sky also do it, and I'm sure many other channels around the world do too. If you view BBC HD using a PC-based satellite tuner, you can generally get it to show the technical info (depending on the program you're using to watch it), and it'll show it as being in that resolution.
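The stretch and the bandwidth saving described above are simple arithmetic; a quick Python sketch (the figures come straight from the resolutions mentioned):

```python
from fractions import Fraction

stored_width, display_width = 1440, 1920

# Each stored pixel is displayed 4/3 as wide as a square pixel would be,
# so the set top box stretches 1440 samples across a 1920-wide picture:
pixel_aspect_ratio = Fraction(display_width, stored_width)

# Luma samples saved per line compared with full-width 1080i:
bandwidth_saving = 1 - Fraction(stored_width, display_width)
```

So a 1440x1080i channel carries a quarter fewer samples per line than a 1920x1080i one, which is where the bandwidth saving comes from.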

The rest of the channels on Sky+HD (and Freesat probably) are 1920x1080i though - such as Channel 4, Sky Sports, Sky Movies, and so on.
 
Basically all i'm trying to say is I don't think it is worth your time to make 1080 rips. Just stick with the 720p. Thoughts?

I agree with your quality assessment 100%

BUT

I really really really seriously 120312398129038% prefer the transport stream file.

Why? Because I have to re-encode it anyways to get it to work with all my devices (primarily the apple tv) and so I'd rather start with the "raw" data.

Additionally, I frequently have NIGHTMARES dealing with the .mkv files. I can PLAY the .mkv files fine, but TRANSCODING them takes FOREVER. And simply "extract the video without transcoding" doesn't work: every single time I've done this I've ended up with audio out of sync.

So, transport stream: larger, slower download, more user-end (my end) work, but it's "set it and forget it" work that always seems to work 100%

720p mkv: quality is fine, but I've had nothing but headaches dealing with these files. This includes all 3 of the HD episodes so far (all the way back to the original HD episode: the 1440x1080 north pole special).
 
Feel free to disagree, but please don't be dicks about it. ;)

They're not being dicks, they're just both more pedantic than May :lol: :p

I agree with your quality assessment 100%

BUT

I really really really seriously 120312398129038% prefer the transport stream file.

Why? Because I have to re-encode it anyways to get it to work with all my devices (primarily the apple tv) and so I'd rather start with the "raw" data.

Additionally, I frequently have NIGHTMARES dealing with the .mkv files. I can PLAY the .mkv files fine, but TRANSCODING them takes FOREVER. And simply "extract the video without transcoding" doesn't work: every single time I've done this I've ended up with audio out of sync.

So, transport stream: larger, slower download, more user-end (my end) work, but it's "set it and forget it" work that always seems to work 100%

720p mkv: quality is fine, but I've had nothing but headaches dealing with these files. This includes all 3 of the HD episodes so far (all the way back to the original HD episode: the 1440x1080 north pole special).

You should have gotten a popcorn hour player, they play mkvs (and a TON of other obscure formats) just fine. Can't help you about any other transcoding as I don't ever do that, just play it directly on my PC, and Badaboom does a good job at transcoding stuff for my phone with everything in sync.
 
I play TG directly on my PC... which is hooked up to my 50 inch plasma tv :p

As a new worlder I just have to say that I utterly despise the euro super-smooth 50fps (fields per second :p ). It's just unnatural, 't'aint right.

It was my understanding that 1080i is always 1440x1080. And my cable provider's HD is only 1080i, I thought that was the norm?

Anyways, 720p vs 1080i, they're so close it doesn't matter. Just give me the 720p. 1080p vs 720p is completely different, but BBC isn't in 1080p so it doesn't matter.
 
As a new worlder I just have to say that I utterly despise the euro super-smooth 50fps (fields per second :p ). It's just unnatural, 't'aint right.

60 fieldsps are more natural and less super-smooth then? :lol:

(classic NTSC = almost 30 framesps, almost 60 fields for interlaced video)



pedanticness FTW.
 
It was my understanding that 1080i is always 1440x1080. And my cable provider's HD is only 1080i, I thought that was the norm?

The US/ATSC spec is 1920x1080 interlaced so that is what your cable provider should be sending you (and the broadcaster sending that to your cable provider) if the broadcast is 1080 interlaced.
 
60 fieldsps are more natural and less super-smooth then? :lol:

(classic NTSC = almost 30 framesps, almost 60 fields for interlaced video)



pedanticness FTW.

Regardless of the interlacing, everything from here for television is filmed at 29.97fps, not 60. The only unnatural smoothness is on (typically lower budget) British shows.

assumptions FTL.
 
For those who hate dealing with the mkv files, I use a program called Gotsent to convert it to an MP4 file (takes about 5 minutes) which at least the xbox360 plays fine straight off my NAS (using Twonky as the media server on the NAS)

I agree though MKV is a complete pain in the ass.

Somebody seriously needs to sort out an XBMC-like appliance that can play HD videos (unfortunately the old xbox 1 is a long, long way from being capable).
 
Regardless of the interlacing, everything from here for television is filmed at 29.97fps, not 60. The only unnatural smoothness is on (typically lower budget) British shows.

assumptions FTL.

British shows tape at 25 frames per second, or 50 interlaced fields per second.

American shows tape at 30ish frames per second, or 60ish interlaced fields per second.

30ish/60ish should be smoother than 25/50.
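The "ish" in those numbers is the classic NTSC rate of 30000/1001; a quick Python check of the figures quoted in this thread:

```python
from fractions import Fraction

# Classic NTSC runs 1000/1001 slower than the nominal 30/60 rates:
ntsc_frames = Fraction(30000, 1001)   # ~29.97 frames per second
ntsc_fields = ntsc_frames * 2         # ~59.94 fields per second
pal_frames, pal_fields = 25, 50

# NTSC video shows roughly 20% more images per second than PAL:
ratio = float(ntsc_fields) / pal_fields
```

Which is why, all else being equal, 60ish fields per second should look at least as smooth as 50.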
 
Well I don't know what's bloody well happening, but there's just that overly-smoothishness from things not intended for NTSC. Just like what people are describing with the studio segments of Top Gear only playing at the intended smoothness with the interlaced rips and not the 25p (I would assume) rips. It's like they're getting one frame per interlace or something? All I know is euros do it, we don't.
 
As a new worlder I just have to say that I utterly despise the euro super-smooth 50fps (fields per second :p ). It's just unnatural, 't'aint right.

Reality must be unnatural then. Fortunately we don't see the world around us at a low frame/field/image/whatever rate ;)

Take a look at the 1080i clips from the Tonight Show with Jay Leno that are floating around on the internet. They are even smoother (59.94 fields per second).
 
Are the mkv versions of the 1080i rips any better than the 1080i TS rips? Probably not, I'm assuming, but the sound quality of the TS files could be better.

The EU standard is 50FPS while in NA it's 60FPS. The previous Top Gear seasons were aired in 60FPS but they weren't HD programming.
 