High performance open source: Inside the exciting future of automated load testing

In preparation for the first TechSummit conference in Berlin this month, we spoke with keynote speaker Patrik Karisch, a software engineer at pixelart.at, about how open source and cloud technologies are combining to create a new era in application load testing.

We’ve all been there at some point. The developers and marketing staff have created a new online tool or app, and just as it gets noticed around the Internet, it grinds to a painful halt. It’s time for a fresh approach to avoid such catastrophes.

Load testing a new application can be a long and complex process where a combination of servers, software and networks are used to simulate real-life stress to identify any problems. This traditional approach doesn’t cut it in the fast-paced world of agile development and on-demand cloud provisioning.

TechSummit Berlin keynote speaker Patrik Karisch is working at the coalface of load testing’s future, and it’s looking a lot more automated.

How does load testing help with improving the quality of your hosting architecture or app development?

You can test the performance of the architecture and the application, how it scales, and how many users it can handle. If you expect 10,000 people per hour, you should test for that load. With the results of the testing you can go back to your architecture. If you only get an overview of the performance, you need to read the reports carefully to make the right decisions – that’s the tricky part. For example, if your app gets slow during a load test with 10,000 users, you can decide to add more Web servers or load balancers.

If you use profiling and load testing tools together you can make even better decisions. You can see which parts of the architecture are slow, for example, the database or the code. Load testing is not the only tool for architecture decisions.

Benefiting from load testing takes time: you need to develop the required knowledge and work out a method for how to load test. Part of this is theory, and part of it is learning the tools.

Why are open source tools a good fit for load testing? Where do they have shortcomings?

With open source tools you can simulate thousands of users at low cost. Most open source tools have good communities and good documentation. And there are many tutorials on how to use them. Apache JMeter is a mature tool and dates back to 1998. Another option is Gatling which is relatively young. I prefer JMeter as it has a wide ecosystem of plugins to enhance its features.

If you’re used to the fancy tools of today you might not like JMeter, as it isn’t fancy. Open source tools don’t have polished interfaces, but they are still usable. The user interface is only used to create the test scripts; the tests themselves run “headless”.
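For example, a test plan built in the JMeter GUI is typically executed headless from the command line (the file names here are placeholders):

```shell
# Run a JMeter test plan in non-GUI ("headless") mode.
# plan.jmx and results.jtl are placeholder file names.
jmeter -n -t plan.jmx -l results.jtl
# -n = non-GUI mode, -t = test plan to run, -l = file to log results to
```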

What are some of the benefits of using the cloud for load testing?

Using open source tools in the cloud allows you to build a testing cluster at a lower cost than many SaaS tools. JMeter has a distributed primary-secondary testing architecture: the primary runs on a single server and coordinates all the load generator instances, which can be provisioned in the cloud.
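As a sketch, a distributed run looks like this – each load generator runs JMeter’s server process, and the primary drives them via the `-R` flag (the host addresses here are placeholders):

```shell
# On each load generator (secondary), start the JMeter server process:
jmeter-server

# On the primary, run the plan against the remote generators.
# -R lists the secondary hosts; the addresses are placeholders.
jmeter -n -t plan.jmx -R 10.0.0.11,10.0.0.12 -l results.jtl
```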

You can run load testing in a private cloud as long as you can get enough servers to simulate the load. The good thing about using a public cloud is it’s cost effective. Deploy servers as you need them and destroy them when you’re finished.

The load testing components are pretty generic and can be used on all servers – on-premise and cloud. There are provisioning tools like Terraform and Ansible which can be used to start a JMeter test in the cloud.
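A minimal workflow along those lines, assuming you already have a Terraform configuration for the load generator instances and an Ansible playbook that installs JMeter (the file names and the `GENERATOR_IPS` variable are hypothetical):

```shell
# Provision load generator instances in the cloud (Terraform config assumed).
terraform apply -auto-approve

# Install and configure JMeter on the new instances (playbook name is hypothetical).
ansible-playbook -i inventory jmeter.yml

# Run the distributed test, then tear the instances down to stop paying for them.
jmeter -n -t plan.jmx -R "$GENERATOR_IPS" -l results.jtl
terraform destroy -auto-approve
```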

What are some good practices for utilising multiple clouds for dealing with large loads?

Some load testing setups use multiple clouds. Terraform is a cloud-independent provisioning tool that helps you spin up load generators in different locations around the world, so you can emulate real loads from several regions at the same time.

In a hybrid architecture you can load test specific APIs that are used by multiple apps if you want to see how your app handles the load. A hybrid approach also makes you less dependent on whether you use a cloud or not.

What are some characteristics of clouds to look out for when load testing?

The most important thing is how many instances are available – does the cloud have enough instances to generate the load? Also look for high network bandwidth and good performance, and if you need to do large-scale load testing you should notify the provider. The cloud provider might clamp down on you if they detect a 2–3Gbps spike in network traffic – you don’t want them to think you are a DDoS attacker or a botnet. Make sure they don’t kill your instances because you’re using too much bandwidth in a short time.

What will you be speaking about at the TechSummit event?

This year’s TechSummit is all about building reliability at scale, which fits perfectly with my job and topics of interest. At pixelart.at we load test for clients, creating different scenarios and load testing reports to be sure our projects can handle the expected load. This helps with decision making about which infrastructure architecture we need.

We use Platform.sh to help manage our platform as a service (PaaS), cloud hosting, and load testing. I’ll be talking about how to use Terraform and Ansible to create instances, deploy JMeter and run the load testing. My talk is not just theoretical; it’s about using the cloud and what tools and scripts people can use. I’ll also show how to load test and profile an application under load, which helps narrow down performance issues to determine whether you need to scale the database or the Web server.

What is the future of load testing and how is it demonstrating emerging technology?

Most organizations are doing load testing, but few have really automated it. My talk will cover the emerging field of automating load testing with Jenkins in a continuous delivery pipeline. For continuous performance testing, every change or feature you deploy must be load tested. With automation you can test every feature and bug fix before it reaches production.
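As a rough sketch of such a pipeline gate, a build step could parse the JMeter results file and flag the build when the error rate crosses a threshold. The sample results below and the 1% threshold are assumptions for illustration:

```shell
# Hypothetical CI gate: parse a JMeter results file (JTL in CSV form)
# and flag the build when the error rate crosses a threshold.
# The sample results and the 1% threshold are illustrative assumptions.
cat > results.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success
1,120,Home,200,OK,tg1,text,true
2,130,Home,200,OK,tg1,text,true
3,900,Home,500,Error,tg1,text,false
4,110,Home,200,OK,tg1,text,true
EOF

# Count total samples and failed samples (column 8 is JMeter's "success" flag).
total=$(awk -F, 'NR>1 {c++} END {print c+0}' results.jtl)
errors=$(awk -F, 'NR>1 && $8=="false" {c++} END {print c+0}' results.jtl)
pct=$((errors * 100 / total))
echo "errors: $errors/$total (${pct}%)"

if [ "$pct" -gt 1 ]; then
  echo "gate: FAIL - error rate above threshold"
  # in a real Jenkins job you would exit 1 here to fail the build
fi
```

In a Jenkins job this script would run right after the headless JMeter step, so a regression in any deploy fails the pipeline before it reaches production.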

This methodology complements DevOps, is part of continuous delivery, and is still quite new. We haven’t perfected it yet and are changing things a lot in order to improve it. I think the success of automated testing will depend on the tools you use and how much you want to automate. Products like Blackfire are marketing continuous performance testing tools, so it’s definitely an emerging field.

To see Patrik’s keynote presentation, register for TechSummit Berlin online here: http://www.techsummit.io/berlin/