I realise measuring image quality in software is going to be really difficult, and I'm not looking for a quick fix. Googling this mostly turns up research papers and discussions that go a bit over my head, so I was wondering if anyone in the SO community had any experience with rough image quality assessment?
I want to use this to scan a few thousand images and whittle them down to a few dozen that are most likely of poor quality. I could then show those to a user and leave the final decision to them.
Obviously many factors can feed into whether an image is high or low quality. I'd be happy with anything that takes an image as input and gives reasonable scores for basic attributes like sharpness, dynamic range, noise, etc., leaving it to my software to decide what's acceptable and what isn't.
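To make it concrete, here's the kind of per-image function I have in mind (a rough sketch with OpenCV and NumPy; the function name is mine, and the choice of estimators is just my guess at something sensible: variance-of-Laplacian for sharpness, percentile spread for dynamic range, and Immerkær's fast method for noise):

```python
import cv2
import numpy as np

def quality_metrics(path):
    """Rough sharpness / dynamic range / noise numbers for one image.
    These are heuristics, not a proper IQA model."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE).astype(np.float64)

    # Sharpness: variance of the Laplacian -- low values suggest blur.
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()

    # Dynamic range: spread between the 1st and 99th percentiles,
    # which ignores a handful of outlier pixels.
    lo, hi = np.percentile(gray, [1, 99])
    dynamic_range = hi - lo

    # Noise: Immerkaer's fast estimate, built on a 3x3 Laplacian-like kernel.
    kernel = np.array([[1, -2, 1], [-2, 4, -2], [1, -2, 1]], dtype=np.float64)
    h, w = gray.shape
    conv = cv2.filter2D(gray, -1, kernel)[1:-1, 1:-1]  # drop border pixels
    noise = np.sqrt(np.pi / 2) / (6 * (w - 2) * (h - 2)) * np.abs(conv).sum()

    return sharpness, dynamic_range, noise
```

My software could then rank the few thousand images on each score and flag the outliers, but I don't know whether these particular estimators are the right starting point.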
Some of the images are poor quality because they've been drastically up-scaled. Even if there's no way to get metrics like those I suggested above, is there any way to detect that an image has been up-scaled like this?
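The only idea I've had is something frequency-based: up-scaling can't invent detail, so I'd expect the top end of the spectrum to be nearly empty. Something like this sketch is what I mean, though the `cutoff` value is a pure guess I'd have to tune, and I don't know how robust this actually is:

```python
import cv2
import numpy as np

def high_freq_ratio(path, cutoff=0.5):
    """Fraction of spectral energy outside a central low-frequency disc.
    Up-scaled images should have almost no energy near the Nyquist edge."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE).astype(np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2

    h, w = gray.shape
    yy, xx = np.ogrid[:h, :w]
    # Normalised distance from the spectrum centre (~1.0 at the edge).
    dist = np.sqrt(((yy - h / 2) / (h / 2)) ** 2 + ((xx - w / 2) / (w / 2)) ** 2)

    return spectrum[dist > cutoff].sum() / spectrum.sum()
```

Is this on the right track, or is there a more standard way to spot interpolation artefacts?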