Java error on bilinear interpolation of 16 bit data
Posted by Jon on Stack Overflow, 2010-03-11
I'm having an issue using bilinear interpolation on 16 bit data. I have two images, origImage and displayImage, and I want to use an AffineTransformOp to filter origImage through an AffineTransform into displayImage, which is the size of the display area. origImage is of type BufferedImage.TYPE_USHORT_GRAY and has a raster of type sun.awt.image.ShortInterleavedRaster. Here is the code I have right now:
displayImage = new BufferedImage(getWidth(), getHeight(), origImage.getType());
try {
    // bilinear resample from the original image into the display-sized image
    op = new AffineTransformOp(atx, AffineTransformOp.TYPE_BILINEAR);
    op.filter(origImage, displayImage);
} catch (Exception e) {
    e.printStackTrace();
}
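Here atx is just the transform that scales origImage to the size of the display area, built along these lines (a sketch; the exact transform is not the point of the question):

    // scale the original image to fill the display area
    AffineTransform atx = AffineTransform.getScaleInstance(
            (double) getWidth() / origImage.getWidth(),
            (double) getHeight() / origImage.getHeight());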
To show the error I have created two gradient images: one with values in the 15 bit range (max of 32767) and one with values in the 16 bit range (max of 65535). Below are the two images:
[15 bit image]
[16 bit image]
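Both images were created in an identical fashion; below is a simplified sketch of the generator (makeGradient and its parameter names are illustrative, not my exact code), with maxValue set to 32767 for the first image and 65535 for the second:

    import java.awt.image.BufferedImage;
    import java.awt.image.WritableRaster;

    // builds a horizontal gradient running from 0 to maxValue
    // in a TYPE_USHORT_GRAY image
    private static BufferedImage makeGradient(int width, int height, int maxValue) {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
        WritableRaster raster = img.getRaster();
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                // sample value increases linearly from 0 to maxValue across the width
                raster.setSample(x, y, 0, (int) ((long) x * maxValue / (width - 1)));
            }
        }
        return img;
    }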
They should look identical, but notice the line across the middle of the 16 bit image. At first I thought this was an overflow problem; however, it is strange that it shows up in the center of the gradient rather than at the end, where the pixel values are higher. Also, if it were an overflow issue, I would have expected the 15 bit image to be affected as well.
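One thing I did notice is that the middle of the 16 bit gradient is where the sample values cross Short.MAX_VALUE (32767), so I checked what such a value looks like if it were read as a signed short rather than an unsigned one (just an observation, I don't know whether it is related):

    int unsignedGray = 32768;               // sample just past the middle of the 16 bit gradient
    short asSigned = (short) unsignedGray;  // how it looks if interpreted as a signed short
    System.out.println(asSigned);           // prints -32768
    System.out.println(asSigned & 0xFFFF);  // masking back to unsigned gives 32768 again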
Any help on this would be greatly appreciated.