Should Java try blocks be scoped as tightly as possible?
- by isme
I've been told that there is some overhead in using the Java try-catch mechanism. So, while calls to methods that throw checked exceptions must appear inside a try block so that the possible exceptions can be handled, it is supposedly good practice, performance-wise, to limit the size of the try block so that it contains only those operations that could actually throw exceptions.
I'm not so sure that this is a sensible conclusion.
Consider the two implementations below of a function that processes a specified text file.
Even if it is true that the first one incurs some unnecessary overhead, I find it much easier to follow. Just from reading the statements it is less obvious exactly where the exceptions can arise, but the comments make it clear which statements are responsible.
The second one is much longer and more complicated than the first. In particular, the nice line-reading idiom of the first has to be mangled to fit the readLine call into a try block.
What is the best practice for handling exceptions in a function whose body can throw multiple exceptions?
This one contains all the processing code within the try block:
void processFile(File f)
{
    try
    {
        // construction of FileReader can throw FileNotFoundException
        BufferedReader in = new BufferedReader(new FileReader(f));

        // call of readLine can throw IOException
        String line;
        while ((line = in.readLine()) != null)
        {
            process(line);
        }
    }
    catch (FileNotFoundException ex)
    {
        handle(ex);
    }
    catch (IOException ex)
    {
        handle(ex);
    }
}
This one contains only the methods that throw exceptions within try blocks:
void processFile(File f)
{
    FileReader reader;
    try
    {
        reader = new FileReader(f);
    }
    catch (FileNotFoundException ex)
    {
        handle(ex);
        return;
    }

    BufferedReader in = new BufferedReader(reader);
    String line;
    while (true)
    {
        try
        {
            line = in.readLine();
        }
        catch (IOException ex)
        {
            handle(ex);
            break;
        }

        if (line == null)
        {
            break;
        }

        process(line);
    }
}
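For reference, if Java 7 or later is available, I assume the same method could also be written with try-with-resources, which keeps the simple line-reading idiom inside a single try block and additionally closes the reader. A rough sketch, reusing the same process and handle methods as above:

void processFile(File f)
{
    // try-with-resources: the reader is closed automatically when the block exits
    try (BufferedReader in = new BufferedReader(new FileReader(f)))
    {
        String line;
        while ((line = in.readLine()) != null)
        {
            process(line);
        }
    }
    catch (FileNotFoundException ex)
    {
        // thrown by the FileReader constructor
        handle(ex);
    }
    catch (IOException ex)
    {
        // thrown by readLine or by the implicit close
        handle(ex);
    }
}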