
Five Simple Steps To Test Your Varnish Cache Deployment Using Varnishtest


Varnish Cache is an open-source HTTP accelerator that is used to speed up content delivery on some of the world’s top content-heavy dynamic websites. However, the performance or speed that a newcomer to Varnish Cache can expect from its deployment can be quite nebulous.

This is true for users at both extremes of the spectrum: from those who play with its source code to create more complex features, to those who set up Varnish Cache using the default settings.

Caching isn’t always as simple as we think. A few gotchas and problems could take quite some time to master. (Image: Varnish Cache)

Imagine you have deployed Varnish Cache according to the default settings to deliver web and application content quickly, reliably and at scale. Suddenly, it stops behaving as it should. Either it’s not caching or, worse, because of bad caching or incorrect cookie policies, cached personal content is delivered to the wrong client.

In another scenario, TTL (time to live) limits might have been set incorrectly. Set the TTL limit too short and you’ll get too many unwanted back-end fetches, which will slow down the website. Set it too long and objects will stay in cache until the TTL expires, using far more storage than necessary.
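
As a rough sketch (assuming Varnish 4 VCL syntax), a TTL ceiling can be enforced explicitly in vcl_backend_response, so that objects never outlive a limit you choose:

```vcl
sub vcl_backend_response {
    # Sketch only: cap every object's lifetime at two minutes,
    # regardless of what the back end's caching headers say.
    if (beresp.ttl > 120s) {
        set beresp.ttl = 120s;
    }
}
```

The 120-second value is purely illustrative; the right limit depends on how often your content changes.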

Getting the best performance from Varnish Cache is about using it not only for caching, but also for invalidation. A previous article here on Smashing Magazine explains the different ways to do cache invalidation with Varnish Cache. Not every method works well for every website. You might find that a piece of content you’ve removed from your website is still delivered to mobile users.

We can easily avoid all of these mistakes by running tests. A little-known fact about Varnish Cache is that it comes with its own testing tool, called Varnishtest. Varnishtest is a script-driven program that can be used to create client mockups, to simulate transactions, to fetch content from mockups or real back ends, to interact with the actual Varnish Cache configuration, and to assert expected behaviors.

To ensure optimal performance from a Varnish Cache deployment, one should integrate Varnishtest into the design. Varnishtest can be used by system administrators in two scenarios: (1) when configuring a Varnish Cache installation, and (2) when writing complex caching policies in the Varnish Configuration Language (VCL) or when tuning Varnish Cache.

Code tinkerers who work on extensions written for Varnish Cache (called VMODs) can use Varnishtest to define and test their modules. The same goes for web developers who write applications that take full advantage of Varnish Cache. As mentioned, Varnishtest can be used to test a cache invalidation method or to reproduce bugs when filing a bug report.

The Varnish Test Case Language

Note that Varnishtest does not follow the unit-testing framework (setUp, test, assert, tearDown), nor does it follow behavior-driven development (“given,” “when,” “then”). Varnishtest has its own language: Varnish Test Case (VTC).

VTC files follow a naming convention. Files starting with b (for example, b00001.vtc) contain basic functionality tests. (The full naming scheme for test scripts is available on GitHub.)

varnishtest "Varnish as Proxy"

server s1 {
    rxreq
    txresp
} -start

varnish v1 -arg "-b ${s1_addr}:${s1_port}" -start

client c1 {
    txreq
    rxresp
    expect resp.http.via ~ "varnish"
} -run

1. Name The Test

All VTC programs start by naming the test:

varnishtest "Varnish as Proxy" 

You need to define three components to run a Varnishtest: a server, a Varnish Cache instance and a client.

2. Declare The Origin Server

server s1 {
    rxreq
    txresp
} -start

All server declarations must start with s. In the code above, server s1 receives a request (rxreq) and transmits a response (txresp): rxreq means the server will accept an incoming request, and txresp means that request will be responded to.

The command -start boots server s1 and makes the macros ${s1_addr} and ${s1_port} available, containing the IP address and port of the simulated back end.
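
In a real test, txresp usually carries explicit response details. A minimal sketch (the status, header and body values here are invented for illustration):

```vtc
server s1 {
    rxreq
    # -status, -hdr and -body are all optional; sensible defaults apply otherwise.
    txresp -status 200 -hdr "Content-Type: text/html" -body "<h1>Hello</h1>"
} -start
```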

3. Declare The Varnish Cache Instance

varnish v1 -arg "-b ${s1_addr}:${s1_port}" -start 

Here, varnish v1 declares an instance of our real Varnish server (i.e. varnishd). The names of Varnish servers must start with v. This instance is controlled by the manager process, and -start forks a child, which is the actual cacher process.

There are many ways to configure varnishd. One way is to pass arguments with -arg, as in -arg "-b ${s1_addr}:${s1_port}". Here, -b is a varnishd option that defines the back end; in this case, the IP address and port of the simulated back end s1 are used. However, using a real back end is also possible, which makes Varnishtest usable as an integration tool when testing the real back end.
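
Instead of -b, a test can also hand Varnish an inline VCL program with -vcl+backend, which automatically wires s1 in as the back end. A sketch, assuming Varnish 4 VCL (the X-Cache header is an illustrative addition of this example, not a Varnish built-in):

```vtc
varnish v1 -vcl+backend {
    sub vcl_deliver {
        # Mark responses so clients can assert on hit/miss behavior.
        if (obj.hits > 0) {
            set resp.http.X-Cache = "hit";
        } else {
            set resp.http.X-Cache = "miss";
        }
    }
} -start
```

A client could then use expect resp.http.X-Cache == "hit" to verify that a repeated request was served from the cache.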

4. Simulate The Client

client c1 {
    txreq
    rxresp
    expect resp.http.via ~ "varnish"
} -run

Simulated clients in Varnishtest start with c. In this example, c1 transmits one request and receives one response. Because Varnish is a proxy, the response should be received from the back end via Varnish Cache. Therefore, c1 expects varnish in the Via HTTP header field. The tilde (~) is used as a regular-expression match operator, because the exact text in resp.http.via depends on the Varnish version installed. Finally, the client c1 is started with the -run command. (Note that Varnish servers are typically started with the -start command, and clients with the -run command.)

5. Run The Test

$ varnishtest b00001.vtc
#     top  TEST b00001.vtc passed (1.458)

To run the test, simply issue the command above. By default, Varnishtest prints a summary for passed tests and verbose output only for failed tests. A passed test means that this most basic Varnish Cache configuration is correct.

Easy Tests for Top Performance

The performance problems mentioned at the beginning of this article could have been prevented if we had used Varnishtest before going live. Here are two simple test examples I wrote to help you test your cache invalidation method and cookie policies:

  • Test cache invalidation method
    Ensure your TTL limits are set correctly. This test checks that you are able to purge objects from the cache, thus ensuring that not too much memory is used.
  • Test cookies configuration
    Because cookies are used to identify unique users and may contain personal information, by default Varnish does not cache a page if it contains a Cookie or Set-Cookie header. This test exercises cached objects carrying a Cookie header to ensure that each of three defined clients gets the correct object.
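
To give a flavor of what such a test can look like, here is a sketch of a purge test (assuming Varnish 4 or later; the PURGE handling in vcl_recv is an assumption of this example, not default behavior):

```vtc
varnishtest "Purge removes an object from the cache"

server s1 {
    rxreq
    txresp -body "version 1"
    rxreq
    txresp -body "version 2"
} -start

varnish v1 -vcl+backend {
    sub vcl_recv {
        if (req.method == "PURGE") {
            return (purge);
        }
    }
} -start

client c1 {
    # First fetch populates the cache.
    txreq
    rxresp
    expect resp.body == "version 1"

    # Purge the object from the cache ...
    txreq -req PURGE
    rxresp

    # ... so the next fetch must hit the back end again.
    txreq
    rxresp
    expect resp.body == "version 2"
} -run
```

In production, PURGE requests would normally be restricted to trusted clients with an ACL; the sketch omits that for brevity.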

However, there are many more documented test examples for different scenarios.

The best way to learn how to create tests is to run the examples linked to above in Varnish Cache and then write your own tests based on them.

Ultimately, these tests will help you understand in advance what will happen if Varnish Cache is put into production, ensuring optimal performance of your web architecture and helping you avoid the pitfalls described at the beginning of this article.

Web performance is business-critical across industries, and you can easily take the guesswork out of performance by baking Varnishtest into your design and following these five easy steps to incorporate testing into the configuration of your Varnish instances.

(rb, ms, al, il)
