Auditing front-end performance of a web application
- by user1018494
I am currently trying to performance tune the UI of a company web application. The application is only ever going to be accessed by staff, so the connection between the server and clients will always be considerably faster than it would be over the internet.
I have been using performance auditing tools such as YSlow and Google Chrome's profiling tool to try and highlight areas that are worth targeting for investigation. However, these tools are written with the internet in mind. For example, the suggestions from a Google Chrome audit of the application are as follows:
Network Utilization
Combine external CSS (Red warning)
Combine external JavaScript (Red warning)
Enable gzip compression (Red warning)
Leverage browser caching (Red warning)
Leverage proxy caching (Amber warning)
Minimise cookie size (Amber warning)
Parallelize downloads across hostnames (Amber warning)
Serve static content from a cookieless domain (Amber warning)
Web Page Performance
Remove unused CSS rules (Amber warning)
Use normal CSS property names instead of vendor-prefixed ones (Amber warning)
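To give a feel for what the "Enable gzip compression" warning is actually about, here is a minimal sketch of the kind of saving gzip gives on repetitive text like CSS. The stylesheet content is made up for illustration, and the point of the sketch is that the saving is in bytes transferred, which matters far less on a fast office LAN than over the internet:

```python
import gzip

# Illustrative, highly repetitive "stylesheet" (not a real file from the app):
css = b".btn { margin: 0; padding: 4px; color: #333; }\n" * 200

# gzip typically shrinks repetitive CSS/JS to a small fraction of its size.
compressed = gzip.compress(css)

print(f"original: {len(css)} bytes, gzipped: {len(compressed)} bytes")
```

On a LAN where bandwidth is effectively free, the compressed transfer saves little wall-clock time, which is part of why I question whether this red warning applies to my scenario.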
Are any of these bits of advice totally redundant given the connection speed and usage pattern? The users will be using the application frequently throughout the day, so it doesn't matter if the initial hit is large (when they first visit the page and build their cache) so long as a minimal amount of work is done on future page views.
For example, is it worth the effort of combining all of our CSS and JavaScript files? It may speed up the initial page view, but how much of a difference will it really make on subsequent page views throughout the working day?
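My rough mental model of the trade-off, with assumed numbers rather than measurements (and ignoring connection limits and parallel downloads), is something like this:

```python
# Back-of-envelope model of per-page-view request overhead (all numbers assumed).
rtt = 0.001        # ~1 ms round trip on a LAN (assumption)
files = 20         # separate CSS/JS files before any combining (assumption)

# If each file must be revalidated (conditional GET -> 304) on every view,
# each one costs roughly a round trip:
revalidation_cost = files * rtt

# If far-future cache headers let the browser skip the request entirely:
cached_cost = 0.0

print(f"revalidation overhead per view: {revalidation_cost * 1000:.1f} ms")
```

Under these assumptions the revalidation overhead for 20 uncombined files is only tens of milliseconds per view on a LAN, which is why I suspect combining files buys much less here than the audit tools imply, while good cache headers still remove even that.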
I've tried searching for this but all I keep coming up with is the standard internet facing performance advice. Any advice on what to focus my performance tweaking efforts on in this scenario, or other auditing tool recommendations, would be much appreciated.