Foreword By Alberto Savoia
Testing the performance of web applications is easy. It’s easy to design unrealistic scenarios. Easy to collect and measure irrelevant performance data. And, even if you manage to design a sound scenario and collect the right data, it’s easy to use the wrong
statistical methods to summarize and present the results.
Starting in the late 90s, through the peak of the Internet bubble and beyond, I spent a lot of time testing the performance of web applications. During that period, I designed and led several mission-critical web performance and load tests for high-profile
Internet companies. Working with the in-house performance experts at each company was very revealing – and quite frightening. Most of the engineers assigned to work on web application performance were smart, hard-working, and dedicated; they invested
in expensive software and hardware, read the right books, and followed the best practices of the day. But, somehow, the results of their performance measurements and predictions did not match reality. In some cases the performance tests
overestimated the performance and scalability of the web applications – leading to embarrassing and costly crashes when the web application was deployed. In other cases, they
underestimated capacity and scalability – leading to unnecessary spending on hardware and infrastructure. The errors in these tests were not small; some tests overestimated or underestimated actual performance and capacity by an order of magnitude or
more! How is this possible?
Based on my experience, the majority of gross errors in web application performance testing are the result of oversimplification. More precisely, they are the result of oversimplification of user behavior and oversimplification in summarizing and reporting test
results. Imagine a transportation engineer estimating traffic patterns for a proposed stretch of highway by assuming that most drivers will drive at the same average speed, brake and accelerate with the same response time and at the same rate, and never change
lanes. A simple – but completely worthless – scenario. Or imagine the same transportation engineer reporting that there are no traffic issues because the average speed is 57mph – without bringing up that during rush-hour the average speed is 25mph. A simple,
but very misleading, result. Unfortunately, most web application performance testers commit errors of oversimplification as bad as, or worse than, the ones committed by our hypothetical transportation engineer.
I am all for simplicity but, as Albert Einstein once said: “Make everything as simple as possible, but not simpler.” When it comes to testing the performance of web applications, that’s exactly what this remarkable – and much needed – book teaches you. The
authors leverage their passion, experience, and hard-earned knowledge and provide you with the broad, thorough, and extensible foundation you need to tackle web performance testing the right way.
Performance Testing Guidance for Web Applications does not get bogged down with unnecessary details, but it does make sure that you know about – and don’t overlook – the key parameters and variables that you need to take into account in designing, conducting,
and analyzing your tests.
If you are new to web performance testing, this book will get you started on the right path and save you a lot of time and embarrassment. Even if you are a seasoned web performance testing veteran, I am confident that this book will provide you with new insights
and, most likely, have you slap your forehead a few times as you read about some common and familiar mistakes. In either case,
Performance Testing Guidance for Web Applications is a must-have for any web performance engineer's bookshelf.
Alberto Savoia
Founder and CTO, Agitar Software Inc.
Author of “The Science and Art of Web Site Load Testing,” “Web Load Test Planning,” and “Trade Secrets from a Web Testing Expert”