gabebalazs wrote in post #16784672
This topic actually made me do test videos with my 6D and 70D to see which one has more moire (interesting alliteration).
I think the 70D is a tad better in this test; the 6D is a bit sharper but has more moire.
It makes me wonder if that's something Canon could (but would they want to?) fix via firmware. The 5DIII does not seem to have a stronger AA filter, so I suspect the difference in aliasing and moire is a result of different processing rather than something physical inside the camera.
Maybe you guys already know why this is...
So perhaps Canon could help us out with a firmware update.
It's a bandwidth issue, more than anything. Reading a full sensor of pixels 60 times a second requires a lot of bandwidth, so cameras with less bandwidth are forced to use line-skipping. Aliasing of panned shots or moving clothing prints could be reduced by reading 1/3 of the lines in a pseudo-random selection, but that would also make still scenes look "busy" at the pixel level, reading something different for each output pixel in successive frames.
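To make the trade-off concrete, here's a toy sketch in Python. The line count, megapixel figure, and bit depth are made-up numbers for illustration, not Canon's actual readout; it just contrasts a fixed line-skip grid with a pseudo-random selection of lines per frame:

    import random

    # Rough bandwidth arithmetic (hypothetical 20 MP sensor, 14-bit readout):
    # 20e6 pixels * 14 bits * 60 fps ~= 16.8 Gbit/s for a full-sensor read,
    # vs. ~5.6 Gbit/s when only one line in three is read.

    SENSOR_LINES = 3456                   # hypothetical full-sensor line count
    LINES_PER_FRAME = SENSOR_LINES // 3   # read only 1/3 of the lines per frame

    def skipped_lines_fixed(frame_index):
        # Conventional line-skipping: the same 1/3 of lines every frame.
        # Cheap and temporally stable, but the fixed decimation grid
        # aliases fine detail (moire on fabric, stair-stepping on pans).
        return list(range(0, SENSOR_LINES, 3))

    def skipped_lines_random(frame_index, seed=0):
        # Pseudo-random selection: a different 1/3 of lines each frame.
        # Aliasing is decorrelated frame to frame (less visible moire on
        # motion), but still scenes look "busy" because each output pixel
        # is built from different source lines in successive frames.
        rng = random.Random(seed * 1_000_003 + frame_index)
        return sorted(rng.sample(range(SENSOR_LINES), LINES_PER_FRAME))

    print(skipped_lines_fixed(0)[:5], skipped_lines_fixed(1)[:5])    # identical
    print(skipped_lines_random(0)[:5], skipped_lines_random(1)[:5])  # differ

The fixed grid reads the same lines every frame, so the aliasing is stable and shows up as moire; the random selection decorrelates it across frames, which is exactly what would make still scenes shimmer at the pixel level.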
Interlace is a possibility, but interlace works best when the display device plays the lines back the same way they were recorded; that's why some video in the old days looked much more realistic than other video: the camera scanned the scene the same way it was displayed, suppressing potential artifacts.

Many years ago I created an animation on my Commodore Amiga with a ball bouncing on one half of the display, drawn without regard for the interlaced display (updated at 30 frames per second), and on the other half its mirror image, drawn at the timing of the interlace fields (60 fields per second). The difference was amazing. The Amiga synced everything to the video signal, though, which is why the interlace worked so well.

There are very few things as ugly, though, as interlaced video played back asynchronously, or converted to non-interlaced video or a different frame rate. Ideally, everything should be played back with the timing at which it was recorded. I hope everyone in the media industries is saving the originals, because someday our displays will be capable of any arbitrary timing. I could see a computer desktop refreshing asynchronously, depending on content change, with video windows being drawn at 24 fps, 50 fps, and 30i at the same time. I am very sensitive to video jitter, and so much of what I see is distracting.
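As a toy illustration of that Amiga experiment (assumed velocity, not the original code): sampling the ball's position at field rate halves the motion step per screen update compared with frame rate, which is why the field-timed half looked so much smoother.

    VELOCITY = 240.0  # pixels per second, made-up number

    def positions(rate_hz, duration_s=0.1):
        # Sample the ball's horizontal position at each screen update.
        steps = int(duration_s * rate_hz)
        return [VELOCITY * n / rate_hz for n in range(steps + 1)]

    print("30 fps frames:", positions(30))  # 8-pixel jump per update
    print("60 Hz fields: ", positions(60))  # 4-pixel jump per update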