Well that was probably one of the most useless reviews of the differences between the S and the SR that I have ever read. The author seemed to have absolutely zero comprehension of how an Optical Low Pass Filter (OLPF) works. To my mind, the inclusion of the circuit for an analogue electronic low pass filter, and the references to turning it on and off in software, seem to suggest that the OLPF is part of the camera's electronics or subsequent digital processing, rather than a physical component in the camera system's optical path, sitting before the light-sensitive part of the sensor.
Architecture would, I think, mostly be OK without the OLPF, but you might have some issues with large distant areas of brickwork, thanks to their repeating nature. Anywhere you have high-frequency repeating patterns (and the repeating part is what matters) you are likely to run into aliasing issues with a digital sensor. The classic places to find moiré caused by aliasing are brickwork and very fine fabrics such as silk.
The real issue with trying to get rid of it afterwards is that no automatic filter can remove it digitally: once a pattern above the sensor's Nyquist limit has been sampled, the alias it produces is indistinguishable from genuine low-frequency detail. Even manually applied digital filters will also remove any wanted signal that sits at the same low spatial frequency as the aliasing. So it is impossible to remove aliasing without detriment to the image, which is why ideally you want to ensure that too high a spatial frequency never reaches the sensor in the first place.
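A minimal numerical sketch of why aliasing can't be removed after the fact: sample two sinusoids at the same rate, one below the Nyquist limit and one above it. The frequencies and sampling rate here are arbitrary illustration values, not anything from the cameras discussed.

```python
import numpy as np

# Sample two sinusoids at the same rate: one below Nyquist (genuine
# detail) and one above it (a too-fine pattern that will alias).
fs = 10.0                  # sampling rate, samples per unit length
t = np.arange(40) / fs     # sample positions

f_real = 3.0               # below Nyquist (fs/2 = 5): recorded faithfully
f_alias = 13.0             # above Nyquist: folds down to |13 - 10| = 3

wanted = np.cos(2 * np.pi * f_real * t)
aliased = np.cos(2 * np.pi * f_alias * t)

# The two sampled signals are identical to machine precision, so no
# filter applied after sampling can tell them apart.
print(np.allclose(wanted, aliased))  # True
```

Since the alias lands exactly on top of a legitimate frequency, any filter strong enough to remove it must also remove real detail at that frequency.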
Things wouldn't be so bad if we had continued to assess images based on fixed-size output, as we did with film. If we had an output device capable of very high resolution, but set up so that you could only output at a fixed size, then a very high resolution sensor without an OLPF would look great in comparison to a low resolution sensor. Since the size was fixed, edges would still look sharp, but the lens would come nowhere near matching the resolution limit of the sensor, so no aliasing.
Instead we now use fixed-resolution output devices, and so we look at images from different-resolution sensors at different physical sizes. This is what we are doing when we view an image on screen at 100%, and we seem to have mostly settled on displays of around 100 PPI, so a pretty low output resolution at that. In this situation a sensor whose resolution greatly exceeds that of the lens, which is exactly what you need to avoid aliasing, will be judged as very poor, because it will appear to blur sharp high-contrast edges in comparison to a lower resolution sensor.
Is it really fair to compare images at different levels of optical magnification? We would never have compared an image printed at 10×7 with one printed at 80×53, would we? Yet that happens nearly every day now with digital. It's not unusual to see a 5DS compared with an old camera like the 300D/original Rebel at 100%, which is effectively that 10×7 versus 80×53 comparison.
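To put rough numbers on that, here is the physical size of a 100% view (one image pixel per screen pixel) on a ~100 PPI display for each camera. The pixel dimensions are the published figures for the 5DS and 300D; the 100 PPI display is an assumption, so the exact inch figures are only ballpark.

```python
# Physical size of a 100% on-screen view, assuming a ~100 PPI display.
ppi = 100

cameras = {
    "5DS (50.6 MP)": (8688, 5792),   # published pixel dimensions
    "300D (6.3 MP)": (3072, 2048),
}

for name, (w, h) in cameras.items():
    # One image pixel maps to one screen pixel at 100%.
    print(f"{name}: {w / ppi:.0f} x {h / ppi:.0f} inches at 100%")
```

The 5DS view comes out nearly three times larger in each dimension, so a pixel-level comparison is a comparison at wildly different magnifications.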
So down to the cameras: with the best L-series lenses now seemingly capable of delivering 120 LP/mm to the sensor, I would want to see a sensor with a cut-off resolution of 180 LP/mm before removing the OLPF completely. The trouble is that the 50 MP sensor's limit is itself pretty close to 120 LP/mm, so there is almost no room for error, and aliasing problems are more than possible. I really don't think that at any realistic output size for the whole image you will notice any additional softness from the use of the OLPF.
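The 50 MP sensor's limit can be checked with a back-of-the-envelope Nyquist calculation, assuming the 5DS figures of 8688 pixels across a roughly 36 mm wide full-frame sensor:

```python
# Nyquist limit of a 50 MP full-frame sensor: 8688 pixels across ~36 mm.
sensor_width_mm = 36.0
pixels_across = 8688

pixels_per_mm = pixels_across / sensor_width_mm   # ~241 pixels per mm
nyquist_lp_mm = pixels_per_mm / 2                 # one line pair needs 2 pixels

print(f"Sensor Nyquist limit: {nyquist_lp_mm:.1f} LP/mm")  # ~120.7 LP/mm
```

That lands almost exactly on the 120 LP/mm the best lenses can deliver, which is why there is essentially no safety margin against aliasing without an OLPF.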