In-Depth Measurement of HTTP(S) Response-Streams / Detection of Jerky Video Playback
Proxy Sniffer supports the in-depth measurement of HTTP(S) response streams. This feature is especially useful for Web sites that contain videos: it makes it possible to detect whether jerky video playback occurs while a video is being viewed, and to diagnose whether enough network bandwidth is available for all users so that each user can view the video without interruption.
Note that Proxy Sniffer can measure only video streams that are delivered within a single URL response (such as from YouTube).
This feature can also be used more generally as a reference for optimizing any response data. The corresponding charts show, in different colors, the time elapsed while receiving fragments of user data (red) and the time elapsed while receiving the overhead data of the chunked transfer protocol (blue).
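As a rough illustration of the distinction the charts draw, the following sketch labels each byte of a chunked HTTP body as payload or protocol overhead and splits each receive fragment's elapsed time proportionally between the two. This is an assumption about how such an analysis could be implemented, not Proxy Sniffer's actual code:

```python
def label_chunked(stream: bytes) -> str:
    """Label each byte of a chunked transfer-coded body:
    'P' = user data (payload), 'O' = chunked-protocol overhead
    (chunk-size lines, CRLF separators, last chunk and trailer)."""
    labels = []
    i, n = 0, len(stream)
    while i < n:
        j = stream.index(b"\r\n", i)            # end of the chunk-size line
        size = int(stream[i:j].split(b";")[0], 16)
        labels.append("O" * (j + 2 - i))        # size line + CRLF -> overhead
        i = j + 2
        if size == 0:                           # last chunk: rest is trailer
            labels.append("O" * (n - i))
            break
        labels.append("P" * size)               # chunk data -> payload
        labels.append("OO")                     # CRLF after chunk data
        i += size + 2
    return "".join(labels)

def split_receive_times(events):
    """events: list of (elapsed_ms, fragment_bytes) in arrival order.
    Returns (payload_ms, overhead_ms): each fragment's receive time is
    attributed proportionally to its payload and overhead bytes."""
    labels = label_chunked(b"".join(frag for _, frag in events))
    payload_ms = overhead_ms = 0.0
    pos = 0
    for ms, frag in events:
        part = labels[pos:pos + len(frag)]
        pos += len(frag)
        if frag:
            p = part.count("P") / len(frag)
            payload_ms += ms * p
            overhead_ms += ms * (1 - p)
    return payload_ms, overhead_ms
```

For the body "4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n", 9 of the 24 bytes are payload ("Wikipedia") and 15 are chunked-protocol overhead, so a fragment's receive time is weighted accordingly.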
Image 1 of 5: Enabling the option "Resp. Throughput Chart per Call" when starting the load test.
Image 2 of 5: Calling up the captured data of the HTTP streams.
Image 3 of 5: Measured internal throughput of a video with a preset viewing time of 3 minutes (180,000 milliseconds).
The linear flow, together with the peak in flow rate at the beginning of the transfer, indicates that delivery is made by a special video server which on the one hand prevents network peaks and on the other hand ensures that no jerky video playback occurs.
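The "jerky playback" condition itself can be stated simply: playback stutters whenever the player has already consumed more bytes than have arrived. The following sketch checks throughput samples against that condition; the bitrate and the 2-second startup buffering delay are illustrative assumptions, not measured values:

```python
def find_stalls(samples, bitrate_bps, startup_s=2.0):
    """Detect moments where jerky playback would occur.

    samples:     list of (t_seconds, cumulative_bytes_received), sorted by time
    bitrate_bps: video bitrate in bits per second (assumed constant)
    startup_s:   initial buffering delay before playback starts (an assumption)

    After startup, the player consumes bitrate_bps / 8 bytes per second;
    a stall (buffer underrun) occurs whenever fewer bytes have arrived
    than playback has already consumed.
    """
    bytes_per_second = bitrate_bps / 8.0
    stalls = []
    for t, received in samples:
        if t <= startup_s:
            continue  # still in the initial buffering phase
        consumed = (t - startup_s) * bytes_per_second
        if received < consumed:
            stalls.append(t)
    return stalls
```

A download that stays above the consumption line, as in Image 3, produces no stalls; a download that falls behind it reports the times at which playback would freeze.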
Image 4 of 5: Throughput measurement of a PDF document which should be received within 30 seconds at a linear network throughput, so that the beginning of the document can already be viewed after a few seconds.
The second measured sample does not meet this requirement.
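A linearity requirement like the one above can be checked programmatically by comparing each throughput sample against the ideal straight delivery line. The sketch below does this; the 10% tolerance parameter is an illustrative assumption, not a Proxy Sniffer setting:

```python
def meets_linear_target(samples, total_bytes, target_s, tolerance=0.10):
    """Check whether a download keeps up with a linear delivery target.

    samples:     list of (t_seconds, cumulative_bytes_received), sorted by time
    total_bytes: size of the document
    target_s:    time in which the whole document should arrive (e.g. 30 s)
    tolerance:   fraction by which a sample may lag behind the ideal
                 linear line (the 10% default is an assumption)
    """
    for t, received in samples:
        if t >= target_s:
            # past the deadline the document must be complete
            if received < total_bytes:
                return False
            continue
        ideal = total_bytes * t / target_s
        if received < (1.0 - tolerance) * ideal:
            return False
    return True
```

A sample trace that tracks the straight line passes; one that lags far behind early on, like the second sample in Image 4, fails even if the download eventually completes.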
Image 5 of 5: Throughput measurement of an HTML response received from a Web portal server. It is conspicuous that most of the response time is spent in chunked protocol overhead, while the user data (payload) is received in a relatively short time.
One explanation could be that the Web page is "calculated" piece by piece by the portal server (page navigation, main page content, page footer), and that some server-internal delay times occurred during these calculations.
Copyright 2010, 2011, 2012, 2013
Engineering Office David Fischer AG, Switzerland
All rights reserved.