Visual Studio Performance Testing Quick Reference Guide 3_6


Visual Studio Performance Testing Quick Reference Guide Page 1

MICROSOFT

Visual Studio Performance Testing (not so) Quick Reference Guide

A quick reference for users of the Team Testing performance features of Visual Studio

6/20/2011


Summary

This document is a collection of items from public blog sites, Microsoft® internal discussion aliases (sanitized), and experiences from various Test Consultants in the Microsoft Services Labs. The idea is to provide quick reference points around various aspects of Microsoft Visual Studio® performance testing features that may not be covered in core documentation, or may not be easily understood. The different types of information cover:

How does this feature work under the covers?
How can I implement a workaround for this missing feature?
This is a known bug and here is a fix or workaround.
How do I troubleshoot issues I am having?

The document contains two Tables of Contents (a high-level overview and a list of every topic covered) as well as an index. The current plan is to update the document on a regular basis as new information is found.

About Article Format

I am compiling information as fast as I can between my primary engagements, so I do not try to make the formatting of every article look the same. You will see ISSUE/RESOLUTION items, which are usually snippets from internal email threads, short summaries with links to the full article, etc. However, there are a few articles in the document that were written from scratch to cover fundamental and important topics. I have tried to make these fairly professional in look and feel. Feedback is always welcome on the format, and on any problems or incorrect information. Send email to [email protected].

The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication. Because Microsoft must respond to changing market conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information presented after the date of publication.

This document is for informational purposes only. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS DOCUMENT.

Microsoft grants you a license to this document under the terms of the Creative Commons Attribution 3.0 License. All other rights are reserved.

© 2010 Microsoft Corporation.

Microsoft, Active Directory, Excel, Internet Explorer, SQL Server, Visual Studio, and Windows are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.


Revision History

Version 2.0
o Released 2/16/09
o Available externally on CodePlex
o Major reformat of document
o Added comprehensive index

Version 3.0
o Release Candidate published 3/23/2010
o Added many VS 2010 performance testing articles
o Added and updated articles about VS 2010 how-to's, issues, etc.
o Added or updated articles for features "changed in 2010"
o Updated many articles on issues with VS 2008
o Added some deep-dive articles about how VS performance testing works (both 2008 and 2010)

Version 3.0a
o Final release version for 3.0. This is the official release that should be used.
o Published on 4/1/2010

Version 3.5
o Added more content and updated some existing content.
o Added --NEW-- tag to all new article entries.
o Added --UPDATED-- tag to all article entries that were corrected or enhanced.
o Created a new section that contains full copies of some in-depth blog posts.

Version 3.6
o Added more content and updated some existing content.

NOTE: All items that are not marked with a version note should be considered to apply to both VS 2008 and VS 2010.


List of Topics

NOTE FROM THE AUTHOR 10

HOW IT WORKS 11

How Web Tests Handle HTTP Headers 11

General Info (including order of execution) of load and web test plugins and rules 11

Client Code does not execute because Web Tests Work at the HTTP Layer 14

When is the "Run unit tests in application domain" needed? 14

How the "Test Iterations" Setting impacts the total number of tests executed 14

Test timeout setting for load test configuration does not affect web tests 15

How user pacing and "Think Time Between Test Iterations" work 15

Load test warmup and cool down behaviors 15

What is the difference between Unique, Sequential and Random Data Sources 16

Simulation of Browser Caching during load tests 17

Comparing new users to return users 18

How cache settings can affect your testing and app performance 20

Goal based user behavior after the test finishes the warmup period 22

Threading models in Unit tests under load 23

The difference between Load Test Errors and Error Details 24

How parameterization of HIDDEN Fields works in a webtest 25

Testing execution order in Unit Tests 27

How machines in the test rig communicate 29

How a Load Test Plugin is executed on a test rig 30

Sharing State across agents in a load test rig is not supported out of the box 31

--NEW--Order of execution of components in a webtest 32

--NEW--401 Access Denied responses and "Requests per second" measurements 44

--UPDATED-- File Downloads, Download Size and Storage of files during Web Tests 45

--NEW-- Info on how VS records and generates web tests. 46

--NEW--IE9 and other browser emulation in VS2010 46

ITEMS NEW TO VS 2010 48

"Find" feature now available in Webtest playback UI 48

"Go To Web Test" feature now available in Webtest playback UI 49

Recorder Log Available 50

Add extraction rule directly from the playback UI 51

New "Reporting Name" property for web requests 52

LoadTestResultsTables now differentiate between GET and POST requests 53

--UPDATED-- Virtual user visualization now available 54

New Excel reporting features built into load test results 60

New Load Test and Load Test Rig Licensing and configurations 61

New test mix: "Sequential Test Mix" 65

Query String and FORM POST URLs get parameterized 67

New options on Load Test Scenarios 68

Loops and Conditionals 69


--NEW--Load test distributions/Network emulation 71

--NEW--SilverLight recording info 71

--NEW--Unlimited agent licenses for certain MSDN subscribers 71

CONFIGURATIONS AND SETTINGS 72

How to Change the Location Where Agents Store Run Files 72

How to set a proxy server for web tests 72

How to configure Web Tests so Fiddler can capture playback info 72

Controlling the amount of memory that the SQL Server Results machine consumes 73

How to configure the timeouts for deployment of load tests to agents 73

How to set the number of Load Test Errors and Error Details saved 74

Multi-proc boxes used as agents should have .NET garbage collection set to server mode 75

Location of list of all agents available to a controller 75

--NEW--Setting the Agents to run in 64 bit mode 76

--NEW--Managing Test Controllers and Agents from VS *AND* from Lab Center 76

NETWORKS, IP SWITCHING, TEST STARTUPS 77

IP Address Switching anatomy (how it works) 77

Gotcha: IP Address Switching is ONLY for WEB TESTS 77

Gotcha: IP Addresses used for switching are not permanent 77

How to Setup IP Switching 78

Startup: Slowness Restarting a Test Rig with Agents Marked as "Offline" 81

Startup: Multiple Network Cards can cause tests in a rig to not start 81

Startup: Slow startup can be caused by _NT_SYMBOL_PATH environment variable 82

Startup: tests on a Rig with Agents on a Slow Link 82

--NEW-- Startup: tests on a Rig with several Agents 82

"Not Bound" Exception when using IP Switching is not really an error 82

How to configure the timeout for deployment of load tests to agents 84

Customizing and extending the available network emulation settings 85

--NEW--Socket Exception: "An operation on a socket could not be performed because…" 87

--NEW--Failed scenarios when client is in a non-trusted domain 87

PERFORMANCE COUNTERS AND DATA 90

Customizing the Available Microsoft System Monitor counter sets 90

Performance Counter Considerations on Rigs with slow links 92

Increase the performance counter sampling interval for longer tests 93

Changing the default counters shown in the graphs during testing 93

Possible method for fixing "missing perfmon counters" issues 93

How and where Performance data gets collected 94

--NEW-- How to remove Controller and Agent countersets 95

--NEW-- Controller and Agent countersets show up in test even after removing them 96

DATA AND RESULTS 98

Custom Data Binding in UNIT Tests 98

Verifying saved results when a test hangs in the "In Progress" state after the test has finished 98

The metrics during and after a test differ from the results seen. 99

How new users and return users affect caching numbers 100


Data sources for data driven tests get read only once 101

Consider including Timing Details to collect percentile data 102

Consider enabling SQL Tracing through the Load Test instead of separately 103

How to collect SQL counters from a non-default SQL instance 103

How 90% and 95% response times are calculated 103

Transaction Avg. Response Time vs. Request Avg. Response Time 104

Considerations for the location of the Load Test Results Store 104

--UPDATED-- Set the recovery model for the database to simple 104

How to clean up results data from runs that did not complete 105

InstanceName field in results database are appended with (002), (003), etc. 105

Layout for VS Load Test Results Store 105

How to view Test Results from the GUI 106

SQL Server Reporting Services Reports available for download 106

How to move results data to another system 106

Load Test Results without SQL NOT stored 107

Unable to EXPORT from Load Test Repository 107

Web Test TRX file and the NAN (Not a Number) Page Time entry 108

Proper understanding of TRX files and Test Results directory 109

Understanding the Response Size reported in web test runs 110

ERRORS AND KNOWN ISSUES 111

--UPDATED-- CSV files created in VS will not work as data sources 111

Incorrect SQL field type can cause errors in web tests 112

Leading zeroes dropped from datasource values bound to a CSV file 112

--UPDATED-- Recorded Think Times and paused web test recordings 112

After opening a webtest with the VS XML Editor, it will not open in declarative mode. 113

Calls to HTTPS://Urs.Microsoft.Com show up in your script 113

Possible DESKTOP HEAP errors when driving command line unit tests 113

Goal based load tests in VS 2008 do not work after applying SP1 114

Using Named Transactions in a Goal-Based Load Profile can cause errors 114

Debugging Errors in Load Tests 115

Debugging OutOfMemory Exceptions in Load Tests 115

Memory leak on load test when using HTTPS 115

"Not Trusted" error when starting a load test 116

Detail Logging may cause "Out of disk space" error 117

Error details and stack traces no longer available in VS 2010 117

VS does not appear to be using more than one processor 117

Changes made to Web Test Plugins may not show up properly 117

--UPDATED-- Socket errors or "Service Unavailable" errors when running a load test 118

Error "Failed to load results from the load test results store" 119

Hidden Field extraction rules do not handle some fields 119

Test results iteration count may be higher than the max test iterations set 119

In flight test iterations may not get reported 120

Completion of Unit Test causes spawned CMD processes to terminate 120

Bug with LoadProfile.Copy() method when used in custom goal based load tests 121

Errors in dependent requests in a Load Test do not show up in the details test log 122


WCF service load test gets time-outs after 10 requests 124

Loadtestitemresults.dat size runs into GBs 124

Content-Length=0 Header not sent resulting in HTTP 411 Length Required Error 125

Error that test could not run because the network emulation is required 126

Error/Crash in "Open and Manage Load Test Results" dialog 126

Calls to CaptchaGenerator.aspx fail during playback 127

Request failure with improperly encoded query strings calling SharePoint 2010 127

Network Emulation does not work in any mode other than LAN 127

Error that Browser Extensions are disabled when recording a web test 128

Error: Request failed: No connection could be made because the target machine actively refused it 129

MaxConnection value in App.Config is not honored when running a load test 129

--NEW-- Cannot change the "Content-Type" header value in a webtest 129

--NEW-- VS does not expose a method for removing specific cookies from requests. 130

--NEW-- File Upload" feature in VS does not allow you to use a stream to send file 130

--NEW--Unit tests that consume assemblies requiring MTA will fail with default test settings 130

--NEW--MSTest tests that consume assemblies requiring MTA will fail with default test settings 130

--NEW--Load Test Agent Error "Failed to open the Visual Studio v10.0 registry key" 131

--NEW--"Agent to use" property of a Load Test not acting as expected 131

--NEW--Fiddler 2 not seeing application traffic 132

--NEW--BUG: Microsoft.VisualStudio.TestTools.WebStress.LoadTestResultsCollector 132

--NEW--ASP.NET Profiling sometimes does not report after a run. 133

--NEW--System.InvalidCastException: Unable to cast COM object 133

--NEW-- Assembly could not be loaded and will be ignored. 134

--NEW--Issue with webtest login when getting 307 Temporary Redirect 134

--NEW--Data bound validation rule fails when set at the TEST level 135

--NEW--"Could not read result repository" 136

--NEW--Page response time counters disappear after test is completed 136

--NEW--WebTestContext.Clear() not clearing cookies 137

--NEW--SQL Tracing error "Could not stop SQL tracing" 137

--NEW--LoadTestCounterCategoryNotFoundException 138

IN DEPTH BLOG POSTS ON DEBUGGING AND TROUBLESHOOTING 140

Web Test Authoring and Debugging Techniques 140

Troubleshooting Network Emulation 159

Troubleshooting Guide for Visual Studio Test Controller and Agent 163

Best Practice: Understanding the ROI for test automation 176

Best Practice: Blog on various considerations for web tests running under load 176

User Account requirements and how to troubleshoot authentication 177

TROUBLESHOOTING 178

How to enable logging for test recording 178

--UPDATED-- Diagnosing and fixing Web Test recorder bar issues 178

How to enable Verbose Logging on an agent for troubleshooting 180

Troubleshooting invalid view state and failed event validation 180

Troubleshooting the VS Load Testing IP Switching Feature 182

--NEW--Performance Counters in .NET 4.0 help with analysis of Agent machines 183


HOW TO, GOTCHAS AND BEST PRACTICES 184

How to call one coded web test from another 184

How to use methods other than GET and POST in a web test 184

How to filter out certain dependent requests 184

How to handle ASP.NET Cookie-less Sessions 185

How to use Client-side certificates in web tests 185

How to remove the "If-Modified-Since" header from dependent requests 186

How to handle custom data binding in web tests 186

How to add a datasource value to a context parameter 186

How to test Web Services with Unit Tests 187

How to add random users to web tests 187

How to add think time to a Unit Test 187

How to add details of a validation rule to your web test 188

How to mask a 404 error on a dependent request 189

How to parameterize Web Service calls within Web Tests 190

How to pass Load Test Context Parameters to Unit Tests 190

How to create Global Variables in a Unit Test 190

How to use Unit Tests to Drive Load with Command Line Apps 191

How to add Console Output to the results store when running Unit tests under load 191

How to add parameters to Load Tests 192

How to Change the Standard Deviation for a NormalDistribution ThinkTime 192

How to programmatically access the number of users in Load Tests 193

How to create a webtest plugin that will only execute on a predefined interval 193

How to support Context Parameters in a plug-in property 194

How to stop a web test in the middle of execution 195

How To: Modify the ServicePointManager to force SSLv3 instead of TLS (Default) 195

How To: Stop a Test in the PreRequest event 196

How to make a validation rule force a redirection to a new page 196

How to add a Web Service reference in a test project - testing services in Unit Tests 200

How to remotely count connections to a process 202

How to hook into LoadTest database upon completion of a load test 202

How to deploy DLLs with MSTEST.EXE 203

How to authenticate with proxy before the test iteration begins 204

How to enumerate WebTextContext and Unit TestContext objects 205

How to manually move the data cursor 205

How to programmatically create a declarative web test 206

How to modify the string body programmatically in a declarative web test 207

How to Add Agents To A Test Rig 207

How to Change the Default Port for Agent-Controller Communication 207

How to create guaranteed unique user IDs for UNIT tests 208

How to create a sync point for starting load tests 210

How to set default extensions that the WebTest recorder will ignore 210

How to get the LoadTestRunId from a load test 210

--NEW--How To: Add comments to a web recording where IE is in KIOSK mode 211

--NEW--How to access a data source before it is bound to an object 212

--NEW--How to store and view transaction times for Unit and Coded UI tests 213


--UPDATED--HOW TO: Handle 404 errors in dependent requests so the main request does not fail. 214

--NEW--HOW TO: Minimize the amount of data a webtest retains for Response Bodies 215

--NEW--HOW TO: Schedule tests to execute 215

--NEW--HOW TO: NOT send an "accept-language" in webtests 216

--NEW--How to upload a file in a Web test 217

Gotcha: Check Your Validation Level in the Load Test Run Settings 223

Gotcha: Do not adjust goals too quickly in your code 223

Gotcha: Response body capture limit is set to 1.5 MB by default 224

Gotcha: Caching of dependent requests is disabled when playing back Web Tests 224

Gotcha: VS 2008 and out of memory 224

Gotcha: Timeout attribute in coded web test does not work during a load test 224

--NEW--Gotcha: Cannot programmatically set .counterset mappings at runtime 225

Best Practice: considerations when creating a dynamic goal based load test plugin: 225

Best Practice: Coded web tests and web test plug-ins should not block threads 225

Best Practice: Add an Analysis Comment 226

EXTENSIBILITY 226

New Inner-text and Select-tag rules published on Codeplex 226

How to Add Custom Tabs to the Playback UI 228

How to extend recorder functionality with plugins 235

ITEMS NOT SPECIFIC TO THE VS TESTING PLATFORM 243

Stand-Alone Network Emulation and CodePlex 243

Using the VS Application Profiler 244

VS 2008 Application Profiler New Features 244

Using System.NET Tracing to debug Network issues 244

Logparser tips and tricks 245

Logparser WEB Queries 245

LogParser Non-Web Queries 246

--NEW--Keyboard shortcut for adding "USING" statements automatically 247

OLDER ARTICLES 248

Content-Length header not available in Web Request Object 248

SharePoint file upload test may post the file twice 248

Some Hidden Fields are not parameterized within AJAX calls 248

(FIX) Unit Test threading models and changing them 248

Bug in VS 2008 SP1 causes think time for redirected requests to be ignored in a load test 249

New Load Test Plugin Enhancements in VS 2008 SP1 249

Four New Methods added to the WebTestPlugin Class for 2008 SP1 249

INDEX 250


Note from the author

This new version of the Quick Reference Guide has been affectionately renamed the "NOT SO Quick Reference Guide" because it keeps getting longer and (IMHO) it is not as easy to find information as it used to be. I am still looking for an alternate delivery platform, but I will continue to put out data this way until I find one.

There is a full section near the beginning just on new features in Visual Studio 2010. This list is not even close to complete WRT all of the new performance testing features, let alone the tons of other testing features in general. You will also find information about changes to 2010 and issues with 2010 throughout the rest of the document. All of these should have a balloon stating that the item is new or different.

Also please note that the Microsoft Visual Studio team has renamed the suite. The following changes apply:

Testing for 2008 is included in "Visual Studio Team System".
Testing for 2010 is included in "Visual Studio Ultimate".

I refer to the testing suite as VS throughout the document.

Thanks to all of the people who have contributed articles and information. I look forward to hearing feedback as well as suggestions moving forward.

Sincerely,

Geoff Gray, Senior Test Consultant – Microsoft Testing Services Labs


How It Works

How Web Tests Handle HTTP Headers

There are three different types of HTTP headers handled by Web tests:

1) Recorded headers and headers explicitly added to the request. By default, the Web test recorder only records these headers:

   "SOAPAction"
   "Pragma"
   "x-microsoftajax"
   "Content-Type"

2) You can change the list of headers that the Visual Studio 2008 and 2010 web test recorder records in the registry by using regedit to open:

HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest

Add a string value under this key with the name "RequestHeadersToRecord" and value "SOAPAction;Pragma;x-microsoftajax;Content-Type;Referrer". If you do this and re-record your Web test, the Referrer header should be included in the request like this:

[Screenshot: Referrer header in a declarative web test]

3) Headers handled automatically by the engine. Two examples: 1) headers sent and received as part of authentication; these are handled in the Web test engine and can't be controlled by the test. 2) cookies, which can be controlled through the API.
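As an illustration of point 3, cookies (and custom headers) can be manipulated from a web test plug-in through the API. The following is an untested sketch against the Microsoft.VisualStudio.TestTools.WebTesting types shown elsewhere in this guide; the header name, header value, and cookie used here are placeholders, not anything VS requires.

```csharp
using System.Net;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class HeaderAndCookiePlugin : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // Add a custom header to the outgoing request
        // (name and value are placeholders for illustration).
        e.Request.Headers.Add(new WebTestRequestHeader("X-Custom", "demo"));

        // Cookies ride on the request's standard CookieCollection.
        e.Request.Cookies.Add(new Cookie("session-hint", "demo"));
    }
}
```

Because this fires in PreRequest, it applies to every request, including redirects and dependent requests (see the plug-in ordering article below for when PreRequest fires).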

General Info (including order of execution) of load and web test plugins and rules

WebTestPlugins get tied to a webtest at the main level of the test. The order of precedence is:

class WebTestPluginMethods : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e) { }
    public override void PreTransaction(object sender, PreTransactionEventArgs e) { }
    public override void PrePage(object sender, PrePageEventArgs e) { }
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e) { }
    public override void PreRequest(object sender, PreRequestEventArgs e) { }
    public override void PostRequest(object sender, PostRequestEventArgs e) { }
    public override void PostPage(object sender, PostPageEventArgs e) { }
    public override void PostTransaction(object sender, PostTransactionEventArgs e) { }
    public override void PostWebTest(object sender, PostWebTestEventArgs e) { }
}

PreWebTest fires before the first request is sent.
PreTransaction fires before every user-defined transaction in the test.
PrePage fires before any explicit request in the webtest. It also fires before any PreRequest method.
PreRequestDataBinding fires before data from the context has been bound into the request, giving you an opportunity to change the data binding.
PreRequest fires before ALL requests made, including redirects and dependent requests. If you want it to act only on redirects, or to skip redirects, use the e.Request.IsRedirectFollow property to control code flow.

All Post<method> events fire in the exact opposite order of the Pre<method> events.

WebTestRequestPlugins get set at an individual request level and only operate on the request(s) they are explicitly tied to, plus all redirects and dependent requests of those requests.

class WebTestRequestPluginMethods : WebTestRequestPlugin
{
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e) { }
    public override void PreRequest(object sender, PreRequestEventArgs e) { }
    public override void PostRequest(object sender, PostRequestEventArgs e) { }
}

ValidationRules can be assigned at the request level and at the webtest level. If a rule is assigned at the webtest level, it will fire after every request in the webtest; otherwise it will fire after the request it is assigned to.

public class ValidationRule1 : ValidationRule
{
    public override void Validate(object sender, ValidationEventArgs e) { }
}

ExtractionRules can be assigned at the request level. A rule fires after the request it is assigned to.

public class ExtractionRule1 : ExtractionRule
{
    public override void Extract(object sender, ExtractionEventArgs e) { }
}

NOTE: If you have multiple items attached to a request, then the order of precedence is:
1) PostRequest (request plugins fire before WebTestRequest plugins)
2) Extract
3) Validate


LoadTestPlugins get tied to load tests directly. In VS 2005 and VS 2008 there can be only one plugin per load test, while VS 2010 allows more than one per test and adds LoadTestPlugin properties so that they are consistent with WebTestPlugins. The methods available are divided into three categories as shown below:

class LoadTestPlugins : ILoadTestPlugin
{
    // Category 1
    void LoadTest_LoadTestStarting(object sender, EventArgs e) { }
    void LoadTest_LoadTestFinished(object sender, EventArgs e) { }
    void LoadTest_LoadTestAborted(object sender, LoadTestAbortedEventArgs e) { }
    void LoadTest_LoadTestWarmupComplete(object sender, EventArgs e) { }

    // Category 2
    void LoadTest_TestFinished(object sender, TestFinishedEventArgs e) { }
    void LoadTest_TestSelected(object sender, TestSelectedEventArgs e) { }
    void LoadTest_TestStarting(object sender, TestStartingEventArgs e) { }

    // Category 3
    void LoadTest_ThresholdExceeded(object sender, ThresholdExceededEventArgs e) { }
    void LoadTest_Heartbeat(object sender, HeartbeatEventArgs e) { }
}

1) These fire based on the load test (meaning each one will fire only once during a full test run).
2) These fire once per test iteration, per vUser.
3) Heartbeat fires once every second, on every agent. ThresholdExceeded fires each time a given counter threshold is exceeded.

NOTE: Each method in category 1 will fire once PER physical agent machine; however, since the agent machines are independent of each other, you do not need to worry about locking items to avoid contention.

NOTE: If you create or populate a context parameter inside the LoadTest_TestStarting method, it will not carry across to the next iteration.

In VS 2010, you can have more than one LoadTest plugin, although there is no guarantee about the order in which they will execute.

You can now control whether a validation rule fires BEFORE or AFTER dependent requests.

At the end of recording a Web test, VS now automatically adds a Response Time Goal validation rule at the Web test level, but this doesn't help much unless you click the toolbar button that lets you edit the response time goal, as well as the Think Time and Reporting Name, for all recorded requests in a single grid.

Changed in 2010


Client Code does not execute because Web Tests Work at the HTTP Layer

The following blog post outlines where and how web tests work. This is important to understand if you are wondering why client-side code is not tested.

http://blogs.msdn.com/slumley/pages/web-tests-work-at-the-http-layer.aspx

When is the "Run unit tests in application domain" needed?

When a unit test is run by itself, a separate application domain is created in the test process for each unit test assembly. There is some overhead associated with marshalling tests and test results across the application domain boundary. An app domain is also created by default when running unit tests in a load test. You can turn off app domain creation for the load test run by using the Load Test editor's Run Settings property "Run unit tests in application domain". Turning it off provides some performance boost in terms of the number of tests per second that the test process can execute before running out of CPU. Note, however, that the app domain is required for unit tests that use an app.config file.

How the "Test Iterations" Setting impacts the total number of tests executed

In the properties for the Run Settings of a load test, there is a property called "Test Iterations" that tells VS how many test iterations to run during a load test. This is a global setting, so if you choose to run 5 iterations and you have 10 vusers, you will get FIVE total passes, not fifty. NOTE: you must enable this setting by changing the property "Use Test Iterations" from FALSE (default) to TRUE.
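The shared-pool behavior can be sketched in a few lines of C#. This is a simulation only; the class and method names are ours, not part of the VS API:

```csharp
using System;

static class TestIterationDemo
{
    // Minimal sketch of the global "Test Iterations" semantics: every
    // virtual user draws from one shared pool of iterations, so the run
    // ends after <testIterations> tests in total, not per user.
    public static int TotalExecuted(int vUsers, int testIterations)
    {
        int remaining = testIterations;
        int executed = 0;
        // round-robin the users against the single shared pool
        for (int u = 0; remaining > 0; u = (u + 1) % vUsers)
        {
            remaining--;   // user u consumes one iteration from the pool
            executed++;
        }
        return executed;
    }

    static void Main()
    {
        // 10 vusers, "Test Iterations" = 5  =>  5 total passes, not 50
        Console.WriteLine(TestIterationDemo.TotalExecuted(10, 5));
    }
}
```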


Test timeout setting for load test configuration does not affect web tests

The "Test Timeout" setting in the Test Run Configuration file (in the "Test -> Edit Test Run Configuration" menu) does not have an effect in all cases.

Uses the setting:
o Running a single unit test, web test, ordered test, or generic test by itself
o Running any of the above types of tests in a test run started from Test View, the Test List editor, or mstest
o Tests running in a load test (except Web tests)

Does not use the setting:
o Running a Web test in a load test
o The load test itself

This particular test timeout is enforced by the agent test execution code, but load test and Web test execution are tightly coupled for performance reasons, and when a load test executes a Web test, the agent test execution code that enforces the test timeout setting is bypassed.

How user pacing and "Think Time Between Test Iterations" work

The setting "Think Time Between Test Iterations" is available in the properties for a load test scenario.

When a user completes one test, this think time delay is applied before the user starts the next iteration. The setting applies to each iteration of each test in the scenario mix.

If you create a load test that has a test mix model "Based on user pace", then the pacing calculated by

the test engine will override any settings you declare for "Think Time Between Test Iterations".

Load test warmup and cool down behaviors

For information about how warmup and cooldown affect the results, see the next section.

Warmup:

When you set a warmup time for a load test, VS will start running test iterations with a single user,

and will ramp up to the proper initial user count over the duration of the warmup. The number of users ramped up is as follows:

o Constant User Load – the total number of users listed

o Step Load Pattern – the initial user count. The test will ramp from this number to the

maximum number of users during the actual test run.
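The ramp described above can be modeled as a simple linear function (a simplified sketch under the assumption of a linear ramp, not the actual engine code):

```python
# Illustrative sketch of the warmup ramp: VS starts with a single user and
# ramps to the pattern's initial user count over the warmup duration.
def users_during_warmup(elapsed_sec, warmup_sec, initial_user_count):
    if elapsed_sec >= warmup_sec:
        return initial_user_count
    # at least 1 user runs from the very start of the warmup
    return max(1, int(initial_user_count * elapsed_sec / warmup_sec))

# 60-second warmup ramping to a constant load of 20 users:
for t in (0, 15, 30, 60):
    print(t, users_during_warmup(t, 60, 20))
```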

Cool down:

In 2008

The Load test Terminate method does not fire unless you use a cool down period.

In 2010

The Load test Terminate method always fires.



What is the difference between Unique, Sequential and Random Data Sources

Single Machine running tests

Sequential – This is the default and tells the web test to start with the first row then fetch rows in order

from the data source. When it reaches the end of the data source, loop back to the beginning and start

again. Continue until the load test completes. In a load test, the current row is kept for each data source

in each web test, not for each user. When any user starts an iteration with a given Web test, they are

given the next row of data and then the cursor is advanced.

Random – This indicates to choose rows at random. Continue until the load test completes.

Unique – This indicates to start with the first row and fetch rows in order. Once every row is used, stop

the web test. If this is the only web test in the load test, then the load test will stop.

Multiple machines running as a rig

Sequential – This works the same as if you are on one machine. Each agent receives a full copy of the

data and each starts with row 1 in the data source. Then each agent will run through each row in the

data source and continue looping until the load test completes.

Random – This also works the same as if you run the test on one machine. Each agent will receive a full

copy of the data source and randomly select rows.

Unique – This one works a little differently. Each row in the data source will be used once. So if you

have 3 agents, the data will be spread across the 3 agents and no row will be used more than once. As

with one machine, once every row is used, the web test will stop executing.
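The three access methods on a single machine can be sketched as cursor behaviors (an illustrative model, not the actual VS implementation; recall that for Sequential and Unique the cursor is shared per web test, not per virtual user):

```python
# Toy models of the three data source access methods.
import itertools
import random

def sequential(rows):
    return itertools.cycle(rows)        # wraps back to row 1 forever

def unique(rows):
    return iter(rows)                   # web test stops once rows run out

def random_rows(rows, rng=random.Random(0)):
    while True:
        yield rng.choice(rows)          # any row, any number of times

rows = ["r1", "r2", "r3"]
seq = sequential(rows)
print([next(seq) for _ in range(5)])    # -> ['r1', 'r2', 'r3', 'r1', 'r2']
print(list(unique(rows)))               # -> ['r1', 'r2', 'r3'], then done
```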


Simulation of Browser Caching during load tests

In a VS load test that contains Web tests, the load test attempts to simulate the caching behavior of the

browser. Here are some notes on how that is done:

There is a property on each request in a Web test named "Cache Control" in the Web test editor (and named "Cache" on the WebTestRequest object in the API used by coded Web tests).

When the Cache Control property on a request in the Web test is false, the request is always

issued.

When the Cache Control property is true, the VS load test runtime code attempts to emulate the Internet Explorer caching behavior (with the "Automatically" setting). This includes reading and following the HTTP cache control directives.

The Cache Control property is automatically set to true for all dependent requests (typically for images, style sheets, etc. embedded on the page).

In a load test, the browser caching behavior is simulated separately for each user running in the

load test.

When a virtual user in a load test completes a Web test and a new Web test session is started to keep the user load at the same level, sometimes the load test simulates a "new user" with a clean cache, and sometimes it simulates a return user that has items cached from a previous session. This is determined by the "Percentage of New Users" property on the Scenario in the load test. The default for "Percentage of New Users" is 0.

Important Note: When running a Web test by itself (outside of the load test), the Cache Control

property is automatically set to false for all dependent requests so they are always fetched; this is so

that they can be displayed in the browser pane of the Web test results viewer without broken images.
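The per-user cache decision described above can be sketched as follows (a simplified model: it only tracks URLs, as the text notes, and omits the HTTP cache-control directive handling that the real runtime performs):

```python
# Sketch of the per-vuser cache simulation: decide whether a request is
# actually issued, given its Cache Control property and the vuser's cache.
def should_issue_request(url, cache_control, user_cache):
    if not cache_control:
        return True                  # Cache Control false: always issued
    if url in user_cache:
        return False                 # simulated cache hit: no request sent
    user_cache.add(url)              # only the URL is tracked, not the body
    return True

cache = set()                        # each virtual user has its own cache
print(should_issue_request("/logo.gif", True, cache))    # -> True (first hit)
print(should_issue_request("/logo.gif", True, cache))    # -> False (cached)
print(should_issue_request("/page.aspx", False, cache))  # -> True (never cached)
```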


Comparing new users to return users

There is a property in the Load Test Scenario settings for "Percentage of new users". This setting has

impact on a few different aspects of the load test execution. The percentage is a measure of how many

of the simulated users are pretending to be "brand new" to the site, and how many are pretending to be

"users who have been to the site before".

A better term to describe a new user is "One Time User", because a new user goes away at the end of its iteration and does not "replace" a different user in the pool.

The "Percentage of New Users" affects the following, whether the tests contained within the load test are Web tests or unit tests:

The value of the LoadTestUserId in the LoadTestUserContext object. This only matters for unit

tests and coded Web tests that use this property in their code. On the other hand if you set

the number of test iterations equal to the user load, then you should get a different

LoadTestUserId regardless of the setting of "Percentage of New Users".

If you are using the load test feature that allows you to define an "Initial Test" and/or a

"Terminate Test" for a virtual user, then it affects when the InitializeTest and TerminateTest are

run: for "new users" (a more accurate name might be "one time users"), the InitializeTest is run

for the virtual user, the "Body Test" is run just once, and then the "Terminate Test" is run. For

users who are NOT "new users", the InitializeTest is run once, the Body Test is run many times

(until the load test completes), and then the TerminateTest runs (which might be during the

cool-down period).

The "Percentage of New Users" affects the following Web test features that are not applicable for unit

tests:

The simulation of browser caching. The option affects how the virtual user's browser cache is maintained between iterations of tests. "New users" have an empty cache (note: the responses are not actually cached; only the URLs are tracked), while "return users" have a cache. So if this value is 100%, all VUsers starting a test will start with an empty browser cache. If this value is 0%, all VUsers will maintain the state of the browser cache between iterations of Web tests. This setting affects the amount of content that is downloaded: if an object sits in a VUser's cache and the object has not been modified since the last time the VUser downloaded it, the object will not be downloaded. Therefore, new users will download more content than returning users with items in their browser cache.

The handling of cookies for a Web test virtual user: new users always start running a Web test with all cookies cleared. When a user who is not a "new user" runs a Web test after the first one, the cookies set during previous Web tests for that virtual user are present.
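One way to picture how this property could gate session state (an illustrative sketch only; the real engine's selection mechanics are not documented here) is that each new session either starts from a clean slate or carries the previous session's state forward:

```python
# Sketch: when a vuser slot starts a new web test session, a "new user"
# begins with an empty cache and no cookies, while a "return user" keeps
# both from the previous session.
import random

def start_session(percentage_new_users, prior_state, rng):
    is_new = rng.random() * 100 < percentage_new_users
    if is_new:
        return {"cache": set(), "cookies": {}}  # one-time user: clean slate
    return prior_state                          # return user: state carried over

rng = random.Random(1)
prior = {"cache": {"/logo.gif"}, "cookies": {"session": "abc"}}
print(start_session(0, prior, rng))    # 0% new users: state always kept
print(start_session(100, prior, rng))  # 100% new users: always a clean slate
```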


The below graphs (taken from test runs in VS 2010) demonstrate the difference between a new user and

a return user. The graphs are based on a 10 user / 50 iteration run, but with different percentages for

"new users" on each run. NOTE: The graphs below are new to VS 2010, but the way in which users are

simulated is the same as in VS 2008. For a better understanding of these graphs, go to the section called

"--UPDATED-- Virtual user visualization now available".

Zero percent new users shows a graph where each of the 10 vusers is constantly reused.

Fifty percent new users shows a graph where each of the 10 vusers is constantly reused by half of the

iterations, but the other half are split out among new vusers which never get reused.

One hundred percent new users shows a graph where none of the vusers is ever reused.



How cache settings can affect your testing and app performance

This article shows how changing the caching settings in your Visual Studio tests and on your web server

can impact your test. It also shows a real world demonstration of the difference between NEW and

RETURN users.

(In the table below, the HTM/HTML/GIF/BMP file-type counts and the 200 OK / 304 Not Modified status counts were taken from the IIS logs; the VS Requests and VS Requests Cached counts were reported by Visual Studio.)

Test run                                             HTM   HTML   GIF   BMP     200 OK   304 Not Modified   VS Requests   VS Requests Cached
TOR 09 - Caching - ReturnUsers                       268   263    83    32719   3871     29462              33,333        84,507
TOR 10 - Caching - NewUsers                          276   271    276   90243   46639    44427              89,384        43,758
TOR 12 - Caching - ReturnUsers - Content Expiration  270   264    85    3330    3874     75                 3,949         84,842
TOR 11 - Caching - NewUsers - Content Expiration     268   262    268   44622   45286    134                44,742        42,090

Comparing New Users to Return Users (WRT caching):

New users are simulated by “clearing” the cache at the start of each new iteration, whereas the cache is carried from iteration to iteration for return users.

This results in many more requests being cached with return users.

NOTE: The total # of requests made by VS is a sum of the two VS values. In other words, “Total Requests” in the IDE does not include cached requests.

Looking at the impact of “content expiration” on the overall network and web server activity (For more information, see the section “Add an Expires or a Cache-Control Header” from http://developer.yahoo.com/performance/rules.html).

Notice that VS honors the content expiration (this is actually handled by the underlying System.NET component). However, VS still reports the cached file request, even though no call went out over the wire. This is expected behavior since the request was a part of the site. In order to see how many requests actually went over the wire, you need to use IIS logs or network traces.


Notes:

All 4 tests above were run for the same duration with the same number of users executing the same

test.

Although the numbers do not match exactly, they are close enough to show the behavior of the

tests. The discrepancy is due to a few things, including cool down of the test and the possible misalignment of the query I used to gather data from the IIS logs.

The IIS Log items for "200 OK" and "304 Not Modified" were gathered using LogParser and the following query:

SELECT sc-status, COUNT(*) AS Total
FROM *.log
WHERE to_timestamp(date, time) between
      timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss')
  and timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY sc-status


Goal based user behavior after the test finishes the warmup period

1. The user load starts at the value specified by the Initial User Count property of the Goal Based

Load Pattern.

2. At each sampling interval (which defaults to 5 seconds, but can be modified by the "Sample

Rate" property in the load test run settings), the performance counter defined in the goal based

load pattern is sampled. (If it can't be sampled for some reason, an error is logged and the user

load remains the same.)

3. The value sampled is compared with the "Low End" and "High End" properties of the "Target

Range for Performance Counter".

4. If the value is within the boundaries of the "Low End" and "High End", the user load remains the

same.

5. If the value is not within the boundaries of the "Low End" and "High End", the user load is

adjusted as follows:

The midpoint of the target range for the goal is divided by the sampled value for the goal performance counter to calculate an "adjustment factor".

For example, if the goal is defined as "% Processor Time" between 50 and 70, the midpoint

is 60. If the sampled value for % Processor Time is 40, then AdjustmentFactor = 60/40 =

1.5, or if the sampled value is 80, the AdjustmentFactor = 60/80 = 0.75.

The AdjustmentFactor is multiplied by the current user load to get the new user load.

However, if the difference between the new user load and the current user load is greater

than the "Maximum User Count Increase/Decrease" property (whichever applies), then the

user load is only adjusted by as much as the max increase/decrease property allows. My experience

has been that keeping these values fairly small is a good idea; otherwise the algorithm tends

to cause too much fluctuation (the perf counter keeps going above and below the target

range).

The new user load also cannot be larger than the value specified by the goal based pattern's Maximum User Count property or less than the Minimum User Count property.

Two more considerations based on special properties of the goal based load pattern:

o If the property "Lower Values Imply Higher Resource Use" is True (which you might use, for example, for a performance counter such as Memory\Available Mbytes), then

the user load is adjusted in the opposite direction: the user load is decreased when the sampled counter value is less than the Low End of the target range and increased when the sampled counter value is greater than the High End of the target range.

o If the property "Stop Adjusting User Count When Goal Achieved" is True, then once

the sampled goal performance counter is within the target range for 3 consecutive

sampling intervals, then the user load is no longer adjusted and remains constant

for the remainder of the load test.

Lastly, as is true for all of the user load patterns, in a test rig with multiple agents, the new

user load is distributed among the agents equally by default, or according to the "Agent

Weightings" if these are specified in the agent properties.
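The adjustment steps above can be sketched directly in code (an illustrative model of the documented algorithm; it assumes "Lower Values Imply Higher Resource Use" is False and ignores agent distribution):

```python
# Sketch of one goal-based sampling interval's user load adjustment.
def adjust_user_load(current_users, sampled, low, high,
                     max_step, min_users, max_users):
    if low <= sampled <= high:
        return current_users                 # inside target range: no change
    midpoint = (low + high) / 2.0
    factor = midpoint / sampled              # e.g. 60 / 40 = 1.5
    new_users = int(round(current_users * factor))
    # clamp the change to the max increase/decrease per interval
    if new_users > current_users + max_step:
        new_users = current_users + max_step
    elif new_users < current_users - max_step:
        new_users = current_users - max_step
    # and clamp to the pattern's absolute user count bounds
    return max(min_users, min(max_users, new_users))

# Goal: % Processor Time between 50 and 70 (midpoint 60), currently 100 users:
print(adjust_user_load(100, 40, 50, 70, max_step=25, min_users=1, max_users=500))  # -> 125
print(adjust_user_load(100, 80, 50, 70, max_step=25, min_users=1, max_users=500))  # -> 75
```

Note how the 40% sample would suggest 150 users (factor 1.5), but the max increase of 25 caps the new load at 125; this is the fluctuation-damping behavior the text recommends tuning with small step values.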


Threading models in Unit tests under load

When running unit tests in a load test, there is one thread for each virtual user that is currently running

a unit test. The load test engine doesn't know what's going on inside the unit test and needs to run each

on a separate thread to ensure that a thread will be available to start the next unit test without delay.

However, if you specify the Test Mix Based on User Pace feature (or specify a non-zero value for "Think

Time Between Test Iterations" (a property on each Scenario in the load test)), then the number of

concurrent virtual users is less than the total number of virtual users, and there is only one thread

needed in the thread pool for each concurrent virtual user.

There is an extra thread for each unit test execution thread that is used to monitor the execution of the

unit test, implement timing out of the test, etc. However, the stack size for this thread is smaller than

the default size so it should take up less memory.
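A rough back-of-the-envelope model of this (my own derivation, not from the product documentation): under user pacing, only the vusers actually inside a test iteration need a worker thread at any moment, so the pool can be far smaller than the total user load.

```python
# Rough sketch: estimate concurrent worker threads needed when users spend
# most of their time pacing rather than executing tests.
import math

def worker_threads_needed(total_users, avg_test_sec, pace_sec_between_tests):
    # fraction of time each vuser spends executing (vs. pacing/thinking)
    busy_fraction = avg_test_sec / (avg_test_sec + pace_sec_between_tests)
    return math.ceil(total_users * busy_fraction)

# 1000 vusers, 2 s tests, 18 s pacing: ~100 concurrent threads, not 1000
print(worker_threads_needed(1000, 2, 18))  # -> 100
```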

More information can be found at: http://blogs.msdn.com/billbar/pages/features-and-behavior-of-load-tests-containing-unit-tests-in-VS-2008.aspx


The difference between Load Test Errors and Error Details

There's a distinction between "Errors" and "Error Details" within Load Test results.

1. "Load Test Errors" refers to any type of error that occurs in the load test. The info saved is the

user/requestURI/error text information. By default the load test results will save only 1000

errors of a particular type. This value is configured through a config file.

2. "Load Test Error Details" refers to the additional detail we capture for errors on Web test

requests: mostly the request and response body. The default value is 100. This value is

configured in the Load Test GUI.

This is the display of the Errors table in the test results viewer.

Each of these is a separate type of error and gets its own quantity of “errors” (#1) and “error details” (#2). The number of “errors” is shown in the Count column. Clicking on one of the numbers will bring up the Load Test Errors dialog below. There is no count displayed for “error details”.

Each line here is one of the “errors” entries (#1).

Any “errors” entry (#1) that has an associated “error details” will have a link in one or both of the last columns. Click on these to get the details about that specific error instance.


How parameterization of HIDDEN Fields works in a webtest

For each extraction of hidden fields (using the built-in "Extract Hidden Fields" rule) in a webtest, any context items with the same name will be removed prior to extracting the new values. So if request 1 extracts 4 hidden values into a context called "Hidden1", and then request 2 extracts only 2 hidden values, also into a context called "Hidden1", the resulting collection for "Hidden1" will contain ONLY the two values extracted for request 2.
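The replace-then-extract behavior can be sketched in a few lines (illustrative only; the real rule operates on the web test context object):

```python
# Sketch: extracting hidden fields into a context name first removes any
# values already stored under that name, so the context holds only the
# most recent extraction.
def extract_hidden(context, name, values):
    context.pop(name, None)          # prior values under this name are dropped
    context[name] = list(values)

context = {}
extract_hidden(context, "Hidden1", ["v1", "v2", "v3", "v4"])  # request 1
extract_hidden(context, "Hidden1", ["v5", "v6"])              # request 2
print(context["Hidden1"])  # -> ['v5', 'v6'] (only request 2's values remain)
```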

"Hidden Field Buckets"

In the example above, Hidden1 and Hidden2 represent hidden field buckets. We call the number at the end the bucket number; e.g. $HIDDEN0 is bucket 0.

The easiest example to explain is a frames page with two frames. Each frame will have an independent

bucket, and requests can be interleaved across the frames. Other examples that require multiple

buckets are popup windows and certain AJAX calls (since web tests support correlation of viewstate in

ASP.NET AJAX responses).

Hidden field matching

The algorithm to determine that a given request matches a particular bucket uses the heuristic that the

hidden fields parsed out of the response will match form post fields on a subsequent request.

E.g. if the recorder parses out of a response

<INPUT type=hidden ID=Field1 value=v1>

<INPUT type=hidden ID=Field2 value=v2>

Then on a subsequent post we see Field1 and Field2 posted, then this request and response match and a

hidden field bucket will be created for them. The first available bucket number is assigned to the hidden

field bucket.

Once a bucket is "consumed" by a subsequent request via binding, that bucket is made available again.

So if the test has a single frame, it will always reuse bucket 0:

Page 1

o Extract bucket 0

Page 2

o Bind bucket 0 params

Page 3

o Extract bucket 0

Page 4

o Bind bucket 0 params

If a test has 2 frames that interleave requests, it will use two buckets:


Frame 1, Page 1

o Extract bucket 0

Frame 2, Page 1

o Extract bucket 1

Frame 2, Page 2

o Bind bucket 1 params

Frame 1, Page 2

o Bind bucket 0 params

Or if a test uses a popup window, or Viewstate, you would see a similar pattern as the frames page

where multiple buckets are used to keep the window state.
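The numbering behavior described above (the first available bucket is claimed by an extraction, and binding "consumes" the bucket, freeing its number) can be modeled as a tiny allocator. This is a toy illustration of the heuristic, not the real recorder code:

```python
# Toy model of hidden field bucket numbering.
class BucketAllocator:
    def __init__(self):
        self.in_use = set()

    def extract(self):
        n = 0
        while n in self.in_use:      # first available bucket number wins
            n += 1
        self.in_use.add(n)
        return n

    def bind(self, n):
        self.in_use.discard(n)       # consumed bucket becomes available again

a = BucketAllocator()
print(a.extract())   # Frame 1, Page 1 -> bucket 0
print(a.extract())   # Frame 2, Page 1 -> bucket 1 (bucket 0 still unconsumed)
a.bind(1)            # Frame 2, Page 2 binds bucket 1 params
a.bind(0)            # Frame 1, Page 2 binds bucket 0 params
print(a.extract())   # a single-frame test would now reuse bucket 0
```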

Why are some fields unbound?

Some hidden field values are modified in JavaScript, such as __EVENTARGUMENT. In that case, it won't work to simply extract the value from the hidden field in the response and play it back. If the recorder detects this is the case, it puts the actual value that was posted back as the form post parameter value rather than binding it to the hidden field.

A single page will have just one hidden field extraction rule applied. If there are multiple forms on a given page, there is still just one downstream post of form fields, resulting in one application of the hidden field extraction rule.


Testing execution order in Unit Tests

I think that most confusion comes from some users' expectation that MSTest executes like the NUnit framework. The two execute differently: NUnit instantiates a test class only once when executing all the tests contained in it, whereas MSTest instantiates each test method's class separately during the execution process, with each instantiation occurring on a separate thread. This design affects 3 specific things which often confuse users of MSTest:

1. ClassInitialize and ClassCleanup: Since ClassInitialize and ClassCleanUp are static, they are only

executed once even though several instances of a test class can be created by MSTest.

ClassInitialize executes in the instance of the test class corresponding to the first test method in

the test class. Similarly, MSTest executes ClassCleanUp in the instance of the test class

corresponding to the last test method in the test class.

2. Execution Interleaving: Since each instance of the test class is instantiated separately on a

different thread, there are no guarantees regarding the order of execution of unit tests in a

single class, or across classes. The execution of tests may be interleaved across classes, and

potentially even assemblies, depending on how you chose to execute your tests. The key thing

here is – all tests could be executed in any order, it is totally undefined.

3. TestContext Instances: TestContexts are different for each test method, with no sharing between test methods.

For example, if we have a Test Class:

[TestClass]
public class VSClass1
{
    private TestContext testContextInstance;

    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [ClassInitialize]
    public static void ClassSetup(TestContext a)
    {
        Console.WriteLine("Class Setup");
    }

    [TestInitialize]
    public void TestInit()
    {
        Console.WriteLine("Test Init");
    }

    [TestMethod]
    public void Test1()
    {
        Console.WriteLine("Test1");
    }

    [TestMethod]
    public void Test2()
    {
        Console.WriteLine("Test2");
    }

    [TestMethod]
    public void Test3()
    {
        Console.WriteLine("Test3");
    }

    [TestCleanup]
    public void TestCleanUp()
    {
        Console.WriteLine("TestCleanUp");
    }

    [ClassCleanup]
    public static void ClassCleanUp()
    {
        Console.WriteLine("ClassCleanUp");
    }
}

(This consists of 3 Test Methods, ClassInitialize, ClassCleanup, TestInitialize, TestCleanUp and an explicit

declaration of TestContext)

The execution order would be as follows:

Test1 [Thread 1]: new TestContext -> ClassInitialize -> TestInitialize -> TestMethod1 -> TestCleanUp
Test2 [Thread 2]: new TestContext -> TestInitialize -> TestMethod2 -> TestCleanUp
Test3 [Thread 3]: new TestContext -> TestInitialize -> TestMethod3 -> TestCleanUp -> ClassCleanUp

The output after running all the tests in the class would be:

Class Setup

Test Init

Test1

TestCleanUp

Test Init

Test2

TestCleanUp

Test Init

Test3

TestCleanUp

ClassCleanUp


How machines in the test rig communicate

The Visio diagrams below show which ports are used during setup and when the agent and controller run tests.

Controller-Agent Communications

And here are the connections used during agent setup:

Page 30: Visual Studio Performance Testing Quick Reference Guide 3_6

Visual Studio Performance Testing Quick Reference Guide Page 30

Controller-Agent Communications

How a Load Test Plugin is executed on a test rig

Exactly one instance of the load test plugin class (implementing ILoadTestPlugin) is created on

every agent for each load test run.

The load test plugin class is never instantiated on the controller.

None of the agents will start running the load test before the call to ILoadTestPlugin.Initialize is

completed on all agents.

Page 31: Visual Studio Performance Testing Quick Reference Guide 3_6

Visual Studio Performance Testing Quick Reference Guide Page 31

Sharing State across agents in a load test rig is not supported out of the box

The following is an excerpt from a discussion on a possible way to customize VS to handle sharing state

across a rig:

Question: The load test in our scenario is driving integration tests (implemented using the VS unit

testing framework) so I want the data to be available to the unit test while it is running. I am thinking of

writing a lightweight service that acts as the provider of shared state. I will use the

ILoadTestPlugin.Initialize to initialize / reset the data source (using a filter for agent ID so that it runs

only once) by calling the service, retrieve the data from the service in LoadTest.TestStarting event and

then make this data available to the unit test using the test context. This way, the duration of the test

run is not affected by the state retrieval process. However, I need to be careful in implementation of the

shared state provider so that it doesn't have a major impact on the test run results (because of

synchronisation / contention).

Answer: As you said, the service needs to be super-fast and simple. Maintaining a simple list of

name/value pairs would go a long way. The trickiest thing about the service is what locking to provide.

For example, for state variable keeping a count, we don't want agents setting the value, as they will step

on each other and lose increments. A better design is to have a first class Increment command that the

service handles. There are similar questions for integrity of string data, although that is probably not as

important as providing a simple counter. Another common pattern is maintaining lists of stuff. One user

is adding things to the list, the other user is consuming them. This is probably best implemented with a

database.
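The "first class Increment command" idea from the answer above can be sketched as a small in-process service (illustrative only; a real rig would expose this over the network, and the class name and API here are my own invention):

```python
# Sketch: rather than letting agents read and then set a counter (losing
# increments to races), the service owns an atomic Increment command
# guarded by a lock.
import threading

class SharedStateService:
    def __init__(self):
        self._lock = threading.Lock()
        self._values = {}                     # simple name/value pairs

    def increment(self, name, delta=1):
        with self._lock:                      # atomic read-modify-write
            self._values[name] = self._values.get(name, 0) + delta
            return self._values[name]

    def get(self, name):
        with self._lock:
            return self._values.get(name, 0)

svc = SharedStateService()
workers = [threading.Thread(
               target=lambda: [svc.increment("orders") for _ in range(1000)])
           for _ in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
print(svc.get("orders"))  # -> 4000 (no lost increments)
```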

Page 32: Visual Studio Performance Testing Quick Reference Guide 3_6

Visual Studio Performance Testing Quick Reference Guide Page 32

--NEW-- Order of execution of components in a webtest

1) Any execution output lines below in blue with asterisks come from a coded version of the web test to

show how the different points of the code will execute.

2) All other output execution lines come from the rules and plugins written for this example.

3) None of the built in rules have output statements, but the order of execution follows the same pattern as

the custom items.

INITIALIZATION

***** Entering WebTest Initialization Code
***** Exiting WebTest Initialization Code
***** Entering WebTest Initialization Code
***** Exiting WebTest Initialization Code
WebTest - PreWebTest
***** Entering Validation Initialization Code
***** Exiting Validation Initialization Code

NOTES: These items are accessible in coded web tests, but they still fire in declarative tests as well.

Entering a Transaction

***** Entering this.BeginTransaction("Transaction1")
WebTest - PreTransaction: Transaction1

NOTES:

1) Transactions do not fire any requests so they do not show any response info. Transactions do

track the overall time for all requests and work inside the transaction.


2) Pre Transaction events fire before all requests inside the transaction, so any

settings/configuration inside a transaction event will be overridden by any similar settings inside

any of the page or request events.

Entering a request with dependent requests

***** Entering request1
***** Entering yield return request1
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage1.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage1.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage1.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage1.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage1.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage1.aspx
Web Extraction Rule Number 2: http://localhost/HOL/ParamPage1.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage1.aspx
***** Entering request1 = null
***** Exiting request1

NOTES:

1) Notice that web request events do NOT fire for any of the dependent requests.

2) Notice that the custom extraction rules fire in the order they are listed in the code. This behavior

holds true for all extraction/validation rules. In this test, I do not have output for the built-in hidden

params, but in this request it comes first in the list, therefore it fires first in the actual test.


Entering the next request

WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx

NOTES:

1) Notice that the custom extraction rules fire in the order they are listed in the code. This order is

opposite of the previous request (both in code and order executed).

Exiting the first transaction

***** Entering this.EndTransaction("Transaction1")
WebTest - PostTransaction: Transaction1

NOTES:


1) This fires after all request events inside the transaction, even though the playback shows only the transaction beginning.

Entering a looped request

WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx


NOTES:

1) The normal series of events fires on each iteration that meets the loop condition. When the loop condition fails, there are NO MORE events fired as part of the embedded request(s).

Entering a redirect request

WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamRedirect.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamRedirect.aspx
WebTest - PrePage: http://localhost/HOL/ParamRedirect.aspx
WebTest - PreRequest: http://localhost/HOL/ParamRedirect.aspx
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage4.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage4.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage4.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage4.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage4.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostPage: http://localhost/HOL/ParamRedirect.aspx

NOTES:

1) When a redirect comes down the pipe, the webtest-request and web-request events fire before the redirect is sent.

2) However, extraction and validation rules do NOT fire until after the redirect. If you want to validate against a request BEFORE it redirects, you need to use a plugin and simulate the work of a validation rule.

3) Notice that the Pre and Post Page events surround the entire set of requests in the main level request EXCEPT for DataBinding events AND a request level PreRequest event. Page level includes the main request, redirect requests and dependent requests.
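The workaround in note 2 can be sketched as a request plugin. The following is a hypothetical example, not from the guide (the class name, the status check, and the expected redirect target are all assumptions): it hooks the PostRequest event, which the log above shows firing for the original response (Redirect:False) before the redirect is followed, so it can stand in for a validation rule that needs to see the pre-redirect response.

```csharp
using System.Net;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Hypothetical sketch: validate the response BEFORE its redirect is followed.
public class PreRedirectCheckPlugin : WebTestRequestPlugin
{
    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        // Only inspect the original response, not the followed redirect.
        if (!e.Request.IsRedirectFollow &&
            e.Response.StatusCode == HttpStatusCode.Redirect)
        {
            // Simulate a validation rule: fail the test if the redirect
            // target is not the page we expect (assumed URL).
            string location = e.Response.Headers["Location"];
            if (location == null || !location.Contains("ParamPage4.aspx"))
            {
                e.WebTest.Outcome = Outcome.Fail;
            }
        }
    }
}
```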

Entering Final Request

WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage2.aspx
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage2.aspx
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage2.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage2.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage2.aspx
Web Extraction Rule: http://localhost/HOL/ParamPage2.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage2.aspx


WebTest - PostWebTest

Code used in this test

This code uses an internal team tool called RTMonitor. The only thing RTMonitor does is write a message to an output window, which is where the outputs above were collected. To ensure that the proper order of execution was captured, I used breakpoints on every single RTMonitor line to allow the output queue to emit each message separately instead of in batches.

Load Test Plugins

[System.ComponentModel.DisplayName("BehaviorLoadTestRule")]
[System.ComponentModel.Description("TODO: Add a more detailed description of LoadTestRule1 Loadtest rule.")]
public class ExecutionOrderPluginClasses : ILoadTestPlugin
{
    LoadTest m_loadTest;

    public void Initialize(LoadTest loadTest)
    {
        m_loadTest = loadTest;

        // Here's the complete list of events that are fired. You should delete events you are not using.
        // To see a list of events, type "m_loadTest." on a new line in the constructor.
        m_loadTest.LoadTestStarting += new EventHandler(LoadTest_LoadTestStarting);
        m_loadTest.LoadTestFinished += new EventHandler(LoadTest_LoadTestFinished);
        m_loadTest.LoadTestAborted += new EventHandler<LoadTestAbortedEventArgs>(LoadTest_LoadTestAborted);
        m_loadTest.LoadTestWarmupComplete += new EventHandler(LoadTest_LoadTestWarmupComplete);
        m_loadTest.TestStarting += new EventHandler<TestStartingEventArgs>(LoadTest_TestStarting);
        m_loadTest.TestFinished += new EventHandler<TestFinishedEventArgs>(LoadTest_TestFinished);
        m_loadTest.TestSelected += new EventHandler<TestSelectedEventArgs>(LoadTest_TestSelected);
        m_loadTest.ThresholdExceeded += new EventHandler<ThresholdExceededEventArgs>(LoadTest_ThresholdExceeded);
        m_loadTest.Heartbeat += new EventHandler<HeartbeatEventArgs>(LoadTest_Heartbeat);
    }

    void LoadTest_LoadTestStarting(object sender, EventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestStarting");
    }

    void LoadTest_LoadTestFinished(object sender, EventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestFinished");
    }

    void LoadTest_LoadTestAborted(object sender, LoadTestAbortedEventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestAborted");
    }

    void LoadTest_LoadTestWarmupComplete(object sender, EventArgs e)
    {
        RTMonitor.Write("LoadTest_LoadTestWarmupComplete");
    }

    void LoadTest_TestFinished(object sender, TestFinishedEventArgs e)
    {
        String str = e.UserContext["AgentId"].ToString();
        RTMonitor.Write("LoadTest_TestFinished " + str);
    }

    void LoadTest_TestSelected(object sender, TestSelectedEventArgs e)
    { RTMonitor.Write("LoadTest_TestSelected"); }

    void LoadTest_TestStarting(object sender, TestStartingEventArgs e)
    { RTMonitor.Write("LoadTest_TestStarting"); }

    void LoadTest_ThresholdExceeded(object sender, ThresholdExceededEventArgs e)
    { RTMonitor.Write("LoadTest_ThresholdExceeded"); }

    void LoadTest_Heartbeat(object sender, HeartbeatEventArgs e)
    {
        RTMonitor.Write("LoadTest_Heartbeat");
    }
}

Validation and Extraction rules

[System.ComponentModel.DisplayName("BehaviorExtraction")]
[System.ComponentModel.Description("TODO: Add a more detailed description of ExtractionRule1 extraction rule.")]
public class ExtractionRule1 : ExtractionRule
{
    public override void Extract(object sender, ExtractionEventArgs e)
    {
        RTMonitor.Write(Color.DarkOrange, "Web Extraction Rule: {0}",
            e.Request.UrlWithQueryString.ToString());
    }
}

[System.ComponentModel.DisplayName("BehaviorExtractionNumber2")]
[System.ComponentModel.Description("TODO: Add a more detailed description of ExtractionRule2 extraction rule.")]
public class ExtractionRule2 : ExtractionRule
{
    public override void Extract(object sender, ExtractionEventArgs e)
    {
        RTMonitor.Write(Color.DarkOrange, "Web Extraction Rule Number 2: {0}",
            e.Request.UrlWithQueryString.ToString());
    }
}
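The two rules above only log; a real rule stores something in the web test context. The following sketch is hypothetical (the class name and the `<title>` target are mine, not the guide's), and it assumes the VS 2010 base class supplies the ContextParameterName property, as the coded web test later in this article suggests.

```csharp
using System.Text.RegularExpressions;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch of an extraction rule that actually extracts a value:
// pulls the page <title> into the context for later requests to bind.
public class TitleExtractionRule : ExtractionRule
{
    public override void Extract(object sender, ExtractionEventArgs e)
    {
        Match m = Regex.Match(e.Response.BodyString, @"<title>(.*?)</title>",
            RegexOptions.IgnoreCase | RegexOptions.Singleline);
        if (m.Success)
        {
            // Later requests can bind this as {{<ContextParameterName>}}.
            e.WebTest.Context[this.ContextParameterName] = m.Groups[1].Value;
            e.Success = true;
        }
        else
        {
            e.Success = false;
            e.Message = "No <title> element found in the response.";
        }
    }
}
```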

[System.ComponentModel.DisplayName("BehaviorValidation")]
[System.ComponentModel.Description("TODO: Add a more detailed description of 'ValidationRule1' validation rule.")]
public class ValidationRule1 : ValidationRule
{
    public override void Validate(object sender, ValidationEventArgs e)
    {
        RTMonitor.Write(Color.SlateGray, "Request level Validation Rule");
    }
}

[System.ComponentModel.DisplayName("BehaviorWebTestValidation")]
[System.ComponentModel.Description("TODO: Add a more detailed description of 'ValidationRule2' validation rule.")]
public class ValidationRule2 : ValidationRule
{
    public override void Validate(object sender, ValidationEventArgs e)
    {
        RTMonitor.Write(Color.SlateGray, "WebTest level Validation Rule");
    }
}
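For comparison, here is a validation rule that does real work rather than just logging the execution order. This is a hypothetical sketch (the class and property names are mine): it fails the request unless a required string appears in the response body.

```csharp
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: fail the request when the response body lacks a required string.
public class RequiredTextValidationRule : ValidationRule
{
    [DisplayName("Required Text")]
    [Description("Text that must appear in the response body.")]
    public string RequiredText { get; set; }

    public override void Validate(object sender, ValidationEventArgs e)
    {
        e.IsValid = e.Response.BodyString.Contains(RequiredText);
        if (!e.IsValid)
        {
            e.Message = "Response did not contain: " + RequiredText;
        }
    }
}
```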

WebTest Plugin

[System.ComponentModel.DisplayName("BehaviorWebTestPlugin")]
[System.ComponentModel.Description("TODO: Add a more detailed description of 'WebTestPlugin1' WebTest Plugin.")]
public class WebTestPlugin1 : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "WebTest - PreWebTest");
    }

    public override void PostWebTest(object sender, PostWebTestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "WebTest - PostWebTest");
    }

    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PreWebTestDataBinding: {0}",
            e.Request.UrlWithQueryString.ToString());
    }

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PreRequest: {0}",
            e.Request.UrlWithQueryString.ToString());
    }

    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PostRequest: Redirect:{1} {0}",
            e.Request.UrlWithQueryString.ToString(), e.Request.IsRedirectFollow.ToString());
    }

    public override void PrePage(object sender, PrePageEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PrePage: {0}",
            e.Request.UrlWithQueryString.ToString());
    }

    public override void PostPage(object sender, PostPageEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PostPage: {0}",
            e.Request.UrlWithQueryString.ToString());
    }

    public override void PreTransaction(object sender, PreTransactionEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PreTransaction: {0}", e.TransactionName);
    }

    public override void PostTransaction(object sender, PostTransactionEventArgs e)
    {
        RTMonitor.Write(Color.Red, "\tWebTest - PostTransaction: {0}", e.TransactionName);
    }
}

WebRequest Plugin

[System.ComponentModel.DisplayName("BehaviorWebTestRequestPlugin")]
[System.ComponentModel.Description("TODO: Add a more detailed description of WebTestRequestPlugin1 request plugin.")]
public class WebTestRequestPlugin1 : WebTestRequestPlugin
{
    public override void PreRequestDataBinding(object sender, PreRequestDataBindingEventArgs e)
    {
        RTMonitor.Write(Color.Black, "\t\tWebRequest - PreWebRequestDataBinding: {0}",
            e.Request.UrlWithQueryString.ToString());
    }

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        RTMonitor.Write(Color.Black, "\t\tWebRequest - PreWebRequest: {0}",
            e.Request.UrlWithQueryString.ToString());
    }

    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        RTMonitor.Write(Color.Black, "\t\tWebRequest - PostWebRequest: Redirect:{1} {0}",
            e.Request.UrlWithQueryString.ToString(), e.Request.IsRedirectFollow.ToString());
    }
}

Coded webtest code to get the initialization events

The bulk of the output was gathered from the declarative test, but this coded version was used to capture execution points that are not exposed as events.

public WebTest1Coded()
{
    RTMonitor.Write(Color.Blue, "***** Entering WebTest Initialization Code");
    this.PreAuthenticate = true;
    this.PreWebTest += new EventHandler<PreWebTestEventArgs>(this.testPlugin0.PreWebTest);
    this.PostWebTest += new EventHandler<PostWebTestEventArgs>(this.testPlugin0.PostWebTest);
    this.PreTransaction += new EventHandler<PreTransactionEventArgs>(this.testPlugin0.PreTransaction);
    this.PostTransaction += new EventHandler<PostTransactionEventArgs>(this.testPlugin0.PostTransaction);
    this.PrePage += new EventHandler<PrePageEventArgs>(this.testPlugin0.PrePage);
    this.PostPage += new EventHandler<PostPageEventArgs>(this.testPlugin0.PostPage);
    RTMonitor.Write(Color.Blue, "***** Exiting WebTest Initialization Code");
}

public override IEnumerator<WebTestRequest> GetRequestEnumerator()
{
    RTMonitor.Write(Color.Blue, "***** Entering Validation Initialization Code");
    if ((this.Context.ValidationLevel >= Microsoft.VisualStudio.TestTools.WebTesting.ValidationLevel.High))
    {
        ValidationRule2 validationRule1 = new ValidationRule2();
        this.ValidateResponse += new EventHandler<ValidationEventArgs>(validationRule1.Validate);
    }
    this.PreRequestDataBinding += new EventHandler<PreRequestDataBindingEventArgs>(this.testPlugin0.PreRequestDataBinding);
    this.PreRequest += new EventHandler<PreRequestEventArgs>(this.testPlugin0.PreRequest);
    this.PostRequest += new EventHandler<PostRequestEventArgs>(this.testPlugin0.PostRequest);
    RTMonitor.Write(Color.Blue, "***** Exiting Validation Initialization Code");

    RTMonitor.Write(Color.Blue, "***** Entering this.BeginTransaction(\"Transaction1\")");
    this.BeginTransaction("Transaction1");

    RTMonitor.Write(Color.Blue, "***** Entering request1");
    WebTestRequest request1 = new WebTestRequest("http://localhost/HOL/ParamPage1.aspx");
    if ((this.Context.ValidationLevel >= Microsoft.VisualStudio.TestTools.WebTesting.ValidationLevel.High))
    {
        ValidationRule1 validationRule2 = new ValidationRule1();
        request1.ValidateResponse += new EventHandler<ValidationEventArgs>(validationRule2.Validate);
    }
    ExtractHiddenFields extractionRule1 = new ExtractHiddenFields();
    extractionRule1.Required = true;
    extractionRule1.HtmlDecode = true;
    extractionRule1.ContextParameterName = "1";
    request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule1.Extract);
    ExtractionRule1 extractionRule2 = new ExtractionRule1();
    extractionRule2.ContextParameterName = "test1";
    request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule2.Extract);
    ExtractionRule2 extractionRule3 = new ExtractionRule2();
    extractionRule3.ContextParameterName = "test1a";
    request1.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule3.Extract);
    WebTestRequestPlugin1 requestPlugin1 = new WebTestRequestPlugin1();
    request1.PreRequestDataBinding += new EventHandler<PreRequestDataBindingEventArgs>(requestPlugin1.PreRequestDataBinding);
    request1.PreRequest += new EventHandler<PreRequestEventArgs>(requestPlugin1.PreRequest);
    request1.PostRequest += new EventHandler<PostRequestEventArgs>(requestPlugin1.PostRequest);
    RTMonitor.Write(Color.Blue, "***** Entering yield return request1");
    yield return request1;
    RTMonitor.Write(Color.Blue, "***** Entering request1 = null");
    request1 = null;
    RTMonitor.Write(Color.Blue, "***** Exiting request1");
    RTMonitor.Write(Color.Blue, "***** Entering this.EndTransaction(\"Transaction1\"");
    this.EndTransaction("Transaction1");

WebTest - PreWebTest
==========WebTest - PreWebTest Event, UserName.Name datasource IDIOT
WebTest - PreTransaction: Transaction1
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage1.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource IDIOT
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage1.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource IDIOT
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage1.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage1.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage1.aspx
==========WebTest - PreRequest Event, UserName.Name datasource SILLY
Web Extraction Rule: http://localhost/HOL/ParamPage1.aspx
Web Extraction Rule Number 2: http://localhost/HOL/ParamPage1.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage1.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage1.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage2.aspx?Parameter1={{DataSource1.UserNames%23csv.Name}}
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource SILLY
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage2.aspx?Parameter1={{DataSource1.UserNames%23csv.Name}}
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource SILLY
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PrePage: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PreRequest: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
==========WebTest - PreRequest Event, UserName.Name datasource GOOSE
Web Extraction Rule Number 2: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
Web Extraction Rule: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest level Validation Rule


Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PostPage: http://localhost/HOL/ParamPage2.aspx?Parameter1=GOOSE
WebTest - PostTransaction: Transaction1
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource GOOSE
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource GOOSE
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequest Event, UserName.Name datasource ID10T
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource ID10T
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource ID10T
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage3.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage3.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage3.aspx
==========WebTest - PreRequest Event, UserName.Name datasource IDGIT
Web Extraction Rule: http://localhost/HOL/ParamPage3.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage3.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage3.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamRedirect.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource IDGIT
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamRedirect.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource IDGIT
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamRedirect.aspx
WebTest - PrePage: http://localhost/HOL/ParamRedirect.aspx
WebTest - PreRequest: http://localhost/HOL/ParamRedirect.aspx
==========WebTest - PreRequest Event, UserName.Name datasource ID1OT
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage4.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource ID1OT
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage4.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource ID1OT
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage4.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage4.aspx
==========WebTest - PreRequest Event, UserName.Name datasource STUPID
Web Extraction Rule: http://localhost/HOL/ParamPage4.aspx
WebTest level Validation Rule


Request level Validation Rule
WebRequest - PostWebRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostRequest: Redirect:True http://localhost/HOL/ParamPage4.aspx
WebTest - PostPage: http://localhost/HOL/ParamRedirect.aspx
WebRequest - PreWebRequestDataBinding: http://localhost/HOL/ParamPage2.aspx
==========WebRequest - PreRequestDataBinding Event, UserName.Name datasource STUPID
WebTest - PreWebTestDataBinding: http://localhost/HOL/ParamPage2.aspx
==========WebTest - PreRequestDataBinding Event, UserName.Name datasource STUPID
----------Advanced Data Table Cursor
WebRequest - PreWebRequest: http://localhost/HOL/ParamPage2.aspx
WebTest - PrePage: http://localhost/HOL/ParamPage2.aspx
WebTest - PreRequest: http://localhost/HOL/ParamPage2.aspx
==========WebTest - PreRequest Event, UserName.Name datasource IDIOT
Web Extraction Rule: http://localhost/HOL/ParamPage2.aspx
WebTest level Validation Rule
Request level Validation Rule
WebRequest - PostWebRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostRequest: Redirect:False http://localhost/HOL/ParamPage2.aspx
WebTest - PostPage: http://localhost/HOL/ParamPage2.aspx
WebTest - PostWebTest

--NEW-- 401 Access Denied responses and "Requests per second" measurements

Visual Studio load test results include timing information for all requests, both successful and failed, including responses of type 401 and 404. The following data comes from a web test created specifically to demonstrate how VS calculates RPS. It also includes a SQL query you can use if you want to calculate the RPS for only certain types of requests.

List of pages executed during the test (100 iterations with all 3 pages called per iteration):

RequestId RequestUri

0 http://localhost/HOL/CreateUserAcct%20-%20Copy.aspx (401 Access Denied page)

1 http://localhost/HOL/CallWebService.aspx (200 OK page)

2 http://localhost/HOL/CreateUserAcct%20-%20Copy (404 not found page)

From Summary Page:

Requests Failed 200 Shows the 401 and 404 failures

Requests Cached Percentage 0 Shows that all pages ran every time

Avg. Response Time (sec) 0.032 Matches the “Overall” measurement

From Direct Database Query (Actual query used is below)

Overall 0.0319266666666667

Page 0 0.02092

Page 1 0.0498

Page 2 0.02506

W/O 401 0.03743

W/O 404 0.03536


DB Query Used:

SELECT 'Overall', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId < 3
UNION
SELECT 'Page 0', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId = 0
UNION
SELECT 'Page 1', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId = 1
UNION
SELECT 'Page 2', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND PageId = 2
UNION
SELECT 'W/O 401', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND (PageId = 1 OR PageId = 2)
UNION
SELECT 'W/O 404', AVG(ResponseTime)
FROM [LoadTest2010].[dbo].[LoadTestPageDetail]
WHERE LoadTestRunId = 48 AND (PageId = 1 OR PageId = 0)
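As a sanity check on those numbers: because every page ran the same number of times (100), each filtered average is just the mean of the included per-page averages. The snippet below reproduces the query results from the table above; it is a standalone illustration, not part of the original test project.

```csharp
using System;

class RpsAverageCheck
{
    static void Main()
    {
        // Per-page average response times reported by the database query.
        double page0 = 0.02092;   // 401 Access Denied page
        double page1 = 0.0498;    // 200 OK page
        double page2 = 0.02506;   // 404 Not Found page

        // Equal sample counts, so filtered averages are simple means.
        Console.WriteLine((page0 + page1 + page2) / 3);  // Overall ~0.03193
        Console.WriteLine((page1 + page2) / 2);          // W/O 401  0.03743
        Console.WriteLine((page0 + page1) / 2);          // W/O 404  0.03536
    }
}
```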

--UPDATED-- File Downloads, Download Size and Storage of files during Web Tests

The web test engine does not write responses to disk, so you don't need to specify a location for the file. It does read the entire response back to the client, but only stores the first 1.5 MB of the response in memory.

You can override that using the WebTestRequest.ResponseBodyCaptureLimit property in the request's section of a coded web test.
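In a coded web test, that override looks roughly like the following sketch (the URL and the 10 MB limit are placeholders, not values from the guide):

```csharp
// Inside GetRequestEnumerator() of a coded web test.
WebTestRequest request1 = new WebTestRequest("http://localhost/HOL/Download.aspx");
request1.ResponseBodyCaptureLimit = 10 * 1024 * 1024;  // keep up to 10 MB in memory
yield return request1;
request1 = null;
```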

For a declarative webtest, you can add the following code to a plugin:

public class IncreaseResponseSize : WebTestPlugin
{
    [DisplayName("Size to use - MB")]
    [Description("The maximum size to allow - defined in MB (e.g. 10 = 10 MB)")]
    [DefaultValue(5)]
    public int iSize { get; set; }

    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        if (!e.WebTest.Context.ContainsKey("$LoadTestUserContext"))
            e.WebTest.ResponseBodyCaptureLimit = iSize * 1024 * 1024;  // MB to bytes
    }
}


--NEW-- Info on how VS records and generates web tests

1. WHAT RECORDING TECHNIQUES DOES VS USE TO GET THE DATA TO BUILD THE WEB TEST?

There are two recorders that work together: one that hooks into IE and one that listens to WinINET. A third recorder merges the two recordings.

The IE recorder only gets top-level requests. It doesn't get things like AJAX requests or gifs, css files, etc. The WinINET recorder gets all that information.

When each recorder gets a request, it will examine the response for the dependents it expects. E.g., when the WinINET recorder gets a request that the browser did not, we check to see if it is one of those expected dependent requests. If so, the request is thrown away because we will find it again at run time.

However, if it is something like an AJAX request, it will not match a dependent request, so:

o in 2008 we automatically put these as top level requests
o in 2010 we added the option of trying to match these as dependent requests. What we do is look to see if this potential dependent request started after the top level page started but before the top level page finished.

NOTE: The AJAX request parser in 2010 currently has a bug that forces all AJAX requests to the top level.

2. WHAT CONSTITUTES THE PROMOTION OF A DYNAMIC PARAMETER TO A CONTEXT, VS. LEAVING IT HARD CODED, VS. ADDING IT TO HIDDEN PARAMS?

Generally speaking, there are scenarios where what looks like a dynamic parameter is NOT dynamic in the given recording. For example, on SQL Server Reporting Services there are parameters that expire and get recreated AFTER a timeout, so the recorder may not catch all of these as dynamic parameters. In such a scenario, we may just leave a parameter hard coded since it hasn't changed during the playback. There isn't anything a tool can do to detect the dynamic nature of a parameter in a guaranteed way unless the parameter has changed between the recorder log and the playback log.

--NEW-- IE9 and other browser emulation in VS2010

QUESTION:

Does anyone know if additional browsers are available for the load test mixer?

ANSWER:

This feature is commonly misunderstood by clients. They think VS actually uses these browsers during the test. Actually, all VSTS does is create a header with a string in it that identifies to the server what "browser" it is talking to. These strings are well-known. The server may or may not behave differently based on this information.

There is one other important part. VS spins up the same number of async connections for dependent requests that the browser does. So if you emulate IE7, you will see two separate TCP conversations when pulling down a web page with dependent requests, and if you emulate IE8, you will see up to six.

More info: One of the big gotchas that may happen in VS, but is uncommon with browser emulation (especially IE), is the fact that VS uses the standard built-in .NET WebHttp objects to control all traffic, where IE uses the native-mode WinINET. There are some subtle differences there. I have only hit one or two cases where it mattered, but I just wanted to mention this difference.
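Since the emulation boils down to a header value, a web test plugin can set that header directly if the mix does not offer the string you need. The following is a hypothetical sketch (the class name and the sample User-Agent value are assumptions, not from the guide):

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: force a specific User-Agent on every request, which is
// essentially what browser emulation does at the header level.
public class ForceUserAgentPlugin : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        e.Request.Headers.Add(new WebTestRequestHeader(
            "User-Agent",
            "Mozilla/5.0 (Windows NT 6.1; Trident/5.0)"));  // sample IE9-style string
    }
}
```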


Items new to VS 2010

"Find" feature now available in Webtest playback UI

In VS 2010, you can now directly search for values in the playback window of the UI. With the playback window active, press Ctrl-F to open the "Find" dialog box, then type in the phrase to search for. You can also choose whether to look in the request, the response, the headers, all text, etc. You can further refine the search by limiting it to the currently highlighted request.

You can also right-click on a form post or query string parameter in the request tab to start a search.


"Go To Web Test" feature now available in Webtest playback UI

In VS 2010, you can now highlight a specific value shown in the playback window, right-click, and choose "Go to web test". This opens the web test window itself and highlights the item whose value you chose. The feature works on the specific request currently highlighted, so if you have several requests with the same parameter name, you will be directed to the request that directly corresponds to the one you were looking at in the playback window.


Recorder Log Available

In VS 2010, as you record a new Web test the recorded requests are saved to a Web test log file. Any time you are in a new playback screen for this Web test, you can click the Recorded Result menu bar command to open the recorded requests and responses. (NOTE: if you upgrade a project from 2008, or if you manually delete the original playback file, the button will be grayed out.)

The recording will have the same name appended with "[Recorded]". This gives you the ability to see the requests the browser made and the responses during recording, and compare them to what the web test is sending and receiving. You can also search the recording for specific values that were recorded.


Add extraction rule directly from the playback UI

In the playback window, you can highlight any static value from a response that you wish to extract for use in future requests. Simply highlight the value, right-click, and choose Add Extraction Rule. It will automatically name the rule, name the parameter, and add the rule to the right request in the test. You will still have to go to the subsequent request(s) where you want to use the parameter and add the parameter to the request. If the value is found in the Web test, you will also be prompted to do a search and replace of the value with the context parameter binding.

Tip: if this value changes each time the test is run, the value from the result viewer will not be in the editor. So rather than adding the extraction rule from the test result, add it from the recorder log instead (since this will have the recorded value, which will also be in the Web test).


New "Reporting Name" property for web requests

Web requests now have a new property exposed called "Reporting Name". This property allows you to define any string to use in test results instead of the actual request URL. This is very handy for requests with very long URLs or tests where there are several requests to the exact same URL. In the following Web test, most requests are to the same URL, but the results are changed to show the "Reporting Name" values set.

A request without any reporting name defined.


Load Test Results Tables now differentiate between GET and POST requests

If the webtest in the previous section ("Reporting Name Property") is executed in a load test, there are

two features you can see in the results.

1) Any Reporting Names you used will show up in the results table.

2) Any requests with the same name but with different methods will be reported separately.

The call from above with a reporting name

The calls from above without a reporting name. Even though they are the same requests, some have a GET method and some have a POST method.


--UPDATED-- Virtual user visualization now available

NOTE: This feature is only available on tests where the "Timing Details Storage" property for the Run

Settings is set to "All Individual Details"

How to view activity visualization

In VS 2010, you can view a map of the virtual users' activity AFTER a test run completes by clicking on the

"Details" button in the results window.


What is shown in the visualization window

3 choices: 1) Test 2) Transaction 3) Page

View shows users in relation to each other (Y-axis) and durations of a single instance of each user’s measured activity (X-axis). For complete details on this, see the entry “New users versus One Time users”

Use the “Zoom to time” slider to control how much of the test details you wish to see.

Hover the mouse pointer over an instance to get a popup of the info about that instance.


More Information

Here are the table definitions from the LoadTest2010 Results Store:

For the LoadTestTestDetail table, the big differences are that you get the outcome of the tests, which

virtual user executed it, and the end time of the test.

[LoadTestRunId] [int] NOT NULL ,

[TestDetailId] [int] NOT NULL ,

[TimeStamp] [datetime] NOT NULL ,

[TestCaseId] [int] NOT NULL ,

[ElapsedTime] [float] NOT NULL,

[AgentId] [int] NOT NULL,

[BrowserId] [int],

[NetworkId] [int],

[Outcome] [tinyint],

[TestLogId] [int] NULL,

[UserId] [int] NULL,

[EndTime] [datetime] NULL,

[InMeasurementInterval] [bit] NULL

For the LoadTestPageDetail table, you now get the end time of the page as well as the outcome of the

page.

[LoadTestRunId] [int] NOT NULL ,

[PageDetailId] [int] NOT NULL ,

[TestDetailId] [int] NOT NULL ,

[TimeStamp] [datetime] NOT NULL ,

[PageId] [int] NOT NULL ,

[ResponseTime] [float] NOT NULL,

[ResponseTimeGoal] [float] NOT NULL,

[GoalExceeded] [bit] NOT NULL,

[EndTime] [datetime] NULL,

[Outcome] [tinyint] NULL,

[InMeasurementInterval] [bit] NULL



For the LoadTestTransactionDetail table, the big changes are that you get the response time of the transaction and the end time. Statistics for transactions such as Min, Max, Avg, Median, StdDev, 90%, 95% and 99% are calculated. These statistics are based on the ResponseTime column, not the ElapsedTime column. The difference between the two is that elapsed time includes think time whereas response time does not.

[LoadTestRunId] [int] NOT NULL ,

[TransactionDetailId] [int] NOT NULL ,

[TestDetailId] [int] NOT NULL ,

[TimeStamp] [datetime] NOT NULL ,

[TransactionId] [int] NOT NULL ,

[ElapsedTime] [float] NOT NULL,

[EndTime] [datetime] NULL,

[InMeasurementInterval] [bit] NULL,

[ResponseTime] [float] NULL
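Since the detail tables land in an ordinary SQL database, you can run your own analysis against them. Below is a minimal sketch in Python against an in-memory SQLite database, using a trimmed-down copy of the LoadTestTransactionDetail columns above; the sample rows and think-time figure are invented, and the real store is SQL Server, so treat this only as an illustration of the ResponseTime-versus-ElapsedTime point:

```python
import sqlite3
import statistics

# Illustrative sketch only: the column names mirror the
# LoadTestTransactionDetail definition above (trimmed), but the
# in-memory SQLite database and the sample rows are invented.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE LoadTestTransactionDetail (
        LoadTestRunId       INTEGER NOT NULL,
        TransactionDetailId INTEGER NOT NULL,
        TestDetailId        INTEGER NOT NULL,
        TransactionId       INTEGER NOT NULL,
        ElapsedTime         REAL    NOT NULL,
        ResponseTime        REAL
    )""")

# ElapsedTime includes think time; ResponseTime does not.
THINK_TIME = 2.0  # seconds, invented for the demo
response_times = [0.8, 1.1, 0.9, 4.2, 1.0]
rows = [(1, i, i, 1, rt + THINK_TIME, rt)
        for i, rt in enumerate(response_times)]
conn.executemany(
    "INSERT INTO LoadTestTransactionDetail VALUES (?, ?, ?, ?, ?, ?)", rows)

# Statistics such as Avg are based on ResponseTime, not ElapsedTime.
times = [r[0] for r in conn.execute(
    "SELECT ResponseTime FROM LoadTestTransactionDetail"
    " WHERE LoadTestRunId = 1 AND TransactionId = 1")]
print("avg response time:", round(statistics.mean(times), 2))  # 1.6
print("max response time:", max(times))                        # 4.2
```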

Another change in VS 2010 is that the default for whether or not to collect details has changed. In VS 2005 and VS 2008 the default was to not collect this detail data. In VS 2010, the default is to collect the detail data. This is controlled by the Timing Details Storage property on the Run Settings node in a load test. So you can still run your own analysis on this data, but there is also a new view in VS that you can use to get a look at the data. The view is the Virtual User Activity Chart. When a load test completes, there will be a new button enabled on the load test execution toolbar. It is the detail button below:

When you click on this button you will be brought to the Virtual User Activity Chart. It looks like the following:



Here is what you are looking at. Each horizontal row represents a virtual user. Each line in a horizontal

row represents a test, page or transaction. If you look at top left of this view, you will see a combo box

that shows which type of detail you are looking at. So in my case this is showing pages. Each color

represents a different page in the test. The length of the line represents the duration of the page. So

you can quickly tell which pages are running long.

If you look at the bottom of the chart, you will see a zoom bar. The zoom bar allows you to change the

range that you are looking at. The zoom bar overlays one of the graphs from the graph view. So

whichever graph is selected in the graph view, you will see that on the zoom bar. This makes it very

easy to correlate spikes in a graph with what tests/pages/transactions are occurring during that spike.

The legend on the left also has some filtering and highlight options. If you uncheck a page, then all

instances of that page are removed from the chart. If you click to Highlight Errors, then all pages that

failed will have their color changed to red. If you look at bottom part of the legend, you will see all the

errors that occurred during the test. You can choose to remove pages with certain errors or remove all

successful pages so you only see errors.

There is one other very useful feature of this view. You can hover over any line to get more information

about the detail and possibly drill into the tests that the detail belongs to. For example this is what it

looks like when you hover a detail:


You see information about user, scenario, test, URL, outcome, etc. For this detail, there is also a test log

link. If you click this, you will see the actual test that the page was a part of. For example, when I click

test log, I see the following:

You see the full set of details collected for the test in the usual web test playback view that you are used to. If it was a unit test, you would have seen the unit test viewer instead.


New Excel reporting features built into load test results

There are two new features for reporting through Excel built into the load test results window

1) Load Testing Run Comparison Report

http://blogs.msdn.com/slumley/archive/2009/11/07/VS-2010-feature-load-testing-run-comparison-report-in-excel.aspx

2) Load Test Trend Report

http://blogs.msdn.com/slumley/archive/2009/05/22/dev10-feature-load-test-excel-report-integration.aspx


New Load Test and Load Test Rig Licensing and configurations

This information was taken straight from a blog post by Ed Glas

(http://blogs.msdn.com/edglas/archive/2010/02/07/configuration-options-for-load-testing-with-visual-studio-2010.aspx)

Using Visual Studio Ultimate enables you to generate 250 virtual users of load. To go higher than 250

users, you need to purchase a Virtual User Pack, which gives you 1000 users. You can use the 1000 users

on any number of agents. Note that if you install the Virtual User Pack on the same machine as Visual

Studio Ultimate, you do not get 1250 users on the controller. The 250 virtual users you get with Ultimate

can only be used on "local" runs, not on a Test Controller. If you need to generate more than 1000 users, you

purchase additional Virtual User Packs, which aggregate or accumulate on the Test Controller. In other

words, installing 2 Virtual User Packs on one controller gives you 2000 Virtual Users, which can be run

on any number of agents.

Configuration 1: "Local" Load Generation

This is what you get when you install Visual Studio Ultimate, which is the ability to generate

load "locally" using the test host process on the same machine that VS is running on. In addition

to limiting load to 250 users, it is also limited to one core on the client CPU.

Note that purchasing Ultimate also gives you the ability to collect ASP.NET profiler traces by

using a Test Agent as a data collector on the Web server.


Configuration 2: Distributed Test Controller and Test Agents

This is a common configuration if you are scaling out your load agents. With this configuration,

the Test Controller and each Test Agent is on a separate machine.

The advantage of this configuration is the controller is easily shared by team members, and

overhead from the controller does not interfere with load generation or operation of the client.

Note the Test Controller must have one or more Virtual User Packs installed to enable load

testing. Load agents in this configuration always use all cores on the machine.


Configuration 3 A and B: Stacked Configuration

With configuration A, you install the Test Controller and Test Agent on the same machine as VS,

then configure the Test Controller with Virtual User Packs. This enables you to generate >250

virtual users from the client machine, and unlocks all cores in the processor. Configuration B

shows an alternative configuration, enabled if you configure the machine with Virtual User

Packs using the VSTestConfig command line.

Note that a Virtual User Pack can only be used on one machine at a time, and configuring it on a

machine ties it to that machine for 90 days. So you can't have the same Virtual User Pack

installed on both the VS client and a separate machine running the Test Controller. See the

Virtual User Pack license for details.


Configuration 4: Stacked Controller, Distributed Agents

In this configuration, the controller is running on the same machine as the Test client, with

distributed agents running as load generators. This configuration is recommended if you have a

solo performance tester. If your test controller and test agents will be shared by a team, we

recommend running the controller on a separate box. Note that test agents are tied to a single test

controller. You can't have two test controllers controlling the same agent.

If you are using Visual Studio 2008, these options should look familiar to you as the VS 2008

load agents and controller offered the same configuration options. The new twist with VS 2010 is

the Virtual User Packs, which offer you more flexibility in how you configure your load agents.

The Test Controller and Test Agent are "free" when you purchase Ultimate.


New test mix: "Sequential Test Mix"

It is not recommended to use ordered tests in a load test. In the load test results, you do not get the

pass/fail results, test timings or transaction timings for any of the inner tests. You just get a Pass/Fail

result and duration for the overall ordered test.

To address this issue, there is a new test mix type in VS2010 called Sequential Test Mix. Here is what it looks like in the load test wizard:

For this mix type, you set the order of tests that each virtual user will run through. You can mix web and

unit tests in the mix and you will get the individual test, page and transaction results. When a virtual

user completes the last test in the mix, it will cycle back to the first test in the mix and start over.


If you just want to control the order of web tests, you could also use a main web test that calls all of the

tests in order as "nested tests". This is called "Web Test Composition." For example, suppose I have

WebTest1 and WebTest2 and I want 1 to run before 2. I would create a third web test that has no

requests, but references tests 1 and 2. To create this kind of test, first record web tests 1 and 2. Then

add a third web test and just hit stop in the web test recorder. When you are back in the web test

editor, right click on the root node and select "Add Call to Web Test..."

This will launch a dialog; select WebTest1. Then do the same steps to add WebTest2. Now just run WebTest3 and you will execute both tests. Web test composition has been available since VS 2008.


Query String and FORM POST URLs get parameterized

When you choose to parameterize the web servers in a web test, you may see more web servers listed than your test actually calls. This is expected behavior: the parameter parser is finding web sites that reside inside query strings. Notice this in the .webtest file:

<QueryStringParameter Name="Source"

Value="http%3A%2F%2Flocalhost%3A17012%2Fdefault%2Easpx"

RecordedValue="http%3A%2F%2Flocalhost%3A17012%2Fdefault%2Easpx" CorrelationBinding=""

UrlEncode="False" UseToGroupResults="False" />
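A quick way to see why this value counts as a web server reference is to URL-decode it (Python shown purely for illustration):

```python
from urllib.parse import unquote

# The Source query string parameter's value in the fragment above is a
# URL-encoded URL; decoding it reveals the embedded web server reference.
encoded = "http%3A%2F%2Flocalhost%3A17012%2Fdefault%2Easpx"
print(unquote(encoded))  # http://localhost:17012/default.aspx
```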

Any Query String that has a URL gets added to the server list

Any Form Post parameter that has a URL gets added to the server list

NO added header value makes it into the list

If the form post or query parameter NAME is a URL (not the value, but the name of the

parameter), it does NOT get added.

This button will cause VS to detect URLs and create parameters for them. This web test has only ONE request, but VS detects four web servers.



New options on Load Test Scenarios

There are some new properties exposed for load test scenarios that make it easier to control how your

tests run.

Agents to Use

The agent names that are entered should be the names of agents that are connected to the controller to which the load test will be submitted. They should be the simple computer names of the agents (as seen in the "Computer Name" field in the Control Panel). Unfortunately, at this time, if you switch to submitting the load test to a different controller, you will need to change the value for "Agents to Use"; there is no way to parameterize this list to vary depending on the controller used. This list designates a subset of the agents that are connected to the controller, are in the Ready state when the load test starts (they may be running a different load test or other test run when the load test is queued, as long as they become Ready when the load test is taken out of the Pending state and starts running), and meet any agent selection criteria for the test run. The Scenario will run on all agents in the list that meet these criteria, and the user load for the Scenario will be distributed among these agents either evenly (the default) or according to any agent weightings specified in the Agent properties for the agents (from the "Administer Test Controllers" dialog in Visual Studio).

Delay Start Time

Amount of time to wait after the load test starts before starting any tests in this scenario.

Disable During Warmup

If true, the delay time does not begin until after warmup completes.


Loops and Conditionals

In Visual Studio 2008, if you wanted to conditionally execute some requests or you wanted to

loop through a series of requests for a given number of times, you had to convert a declarative

web test to a coded web test. In VS2010, these options are exposed directly in declarative

webtests.

The ability to add these is exposed by right-clicking on a request and selecting the option you want from the context menu:

The context menu showing the loop and condition insert options

Sample dialog box for setting the properties of a loop


What the entries look like in the declarative test

Loop results when the test is played back

What results look like if a conditional call fails

What the results look like if a conditional call succeeds.


--NEW--Load test distributions/Network emulation

You can refer to these resources to understand network emulation and how it works in context of load

test:

http://msdn.microsoft.com/en-us/library/dd997557.aspx

http://msdn.microsoft.com/en-us/library/ff817099.aspx

http://blogs.msdn.com/b/lkruger/archive/2009/06/08/introducing-true-network-emulation-in-visual-studio-2010.aspx

--NEW--SilverLight recording info

For Silverlight apps hosted in the browser, get the plugins from

http://www.codeplex.com/teamtestplugins . They make it easy to handle rich post data in Silverlight

http traffic, as well as other tasks.

--NEW--Unlimited agent licenses for certain MSDN subscribers

Visual Studio 2010 Load Test Feature Pack: MSDN subscribers with Visual Studio Ultimate are provided a

license key to generate UNLIMITED virtual users without having to purchase the Visual Studio Load Test Virtual

User Pack 2010.

http://msdn.microsoft.com/en-us/vstudio/ff520697


Configurations and Settings

How to Change the Location Where Agents Store Run Files

If you need to move the location that an agent uses to store the files downloaded to it for executing

tests, the following steps will take care of this. On each agent machine,

1. Open QTAgentService.exe.config.
2. Add <add key="WorkingDirectory" value="<location to use>"/> under the <appSettings> node.
3. Create the <location to use> folder.

How to set a proxy server for web tests

By default, there is no proxy set on a web test, so it doesn't matter what the Internet Explorer® ("IE")

proxy settings are. If your test sets a specific proxy server within the web test then the IE setting is still

not used. In coded web tests or web test plug-ins, you can set the proxy name using the WebProxy

property of the WebTest class. NOTE that this method is broken in Visual Studio Team Test ("VSTT")

2008 RTM, but is fixed in SP1 for VSTT 2008.

If you wish to use the machine's IE proxy settings then you can set the Proxy property to "default"

(without the quotes). In this case you should turn off Automatic Proxy Detection on each agent.

Automatic Proxy detection is very slow and can greatly impact the amount of load you can drive on an

agent.

How to configure Web Tests so Fiddler can capture playback info

In 2008

By default, web test playback ignores proxy servers set for localhost, so enabling a proxy for 127.0.0.1

(which is where Fiddler captures) will not result in any captured data. To make this work, either add a

plugin with the following code, or put the following code in the Class constructor for a coded web test:

this.Proxy = "http://localhost:8888";

WebProxy webProxy = (WebProxy)this.WebProxy;

webProxy.BypassProxyOnLocal = false;

In 2010

To get fiddler to work in VS 2010, simply open Fiddler, then start playing the web test. There is no need

to code for anything.



Controlling the amount of memory that the SQL Server Results machine consumes

The default behavior for SQL Server is to consume as much memory as it thinks it can. The workload on the machine may not allow SQL Server to correctly identify memory pressure and give back some memory. You can configure SQL Server with a max memory limit, which should be fine if all you are doing is inserting results.

The following shows how to set the limit to 512 MB. The right size will vary based on the machine, the testing, and how much memory you have.

sp_configure 'show advanced options', 1

RECONFIGURE

GO

sp_configure 'max server memory', 512

RECONFIGURE

GO

How to configure the timeouts for deployment of load tests to agents

The file to change is "Microsoft Visual Studio 9.0\Xml\Schemas\VSt.xsd". Look for the run config schema, then search for "timeout":

<xs:element name="Timeouts" minOccurs="0">

<xs:complexType>

<xs:attribute name="runTimeout" type="xs:int" use="optional"

default="0"/>

<xs:attribute name="testTimeout" type="xs:int"

use="optional" default="1800000"/>

<xs:attribute name="agentNotRespondingTimeout" type="xs:int"

use="optional" default="300000"/>

<xs:attribute name="deploymentTimeout" type="xs:int"

use="optional" default="300000"/>

<xs:attribute name="scriptTimeout" type="xs:int"

use="optional" default="300000"/>

</xs:complexType>

</xs:element>

Change the values as needed and note that the time is in milliseconds.


How to set the number of Load Test Errors and Error Details saved

Load Test Errors:

You can change the total number of errors stored for a run in the appropriate configuration file

(depending on whether this is for local runs or for test rig runs):

Version  Run Type  File Name                 Location
2008     Local     VSTestHost.exe.config     <Program Files>\Microsoft Visual Studio 9.0\Common7\IDE\
2008     Remote    QTController.exe.config   <Program Files>\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest\
2010     Local     DevEnv.exe.config         <Program Files>\Microsoft Visual Studio 10.0\Common7\IDE\
2010     Remote    QTController.exe.config   <Program Files>\Microsoft Visual Studio 10.0\Common7\IDE\

Add a key to the "appSettings" section of the file (add the "appSettings" section if needed) with the

name "LoadTestMaxErrorsPerType" and the desired value.

<appSettings>

<add key="LoadTestMaxErrorsPerType" value="5000"/>

</appSettings>

Load Test Error Details:


Multi-proc boxes used as agents should have .NET garbage collection set to server mode

In 2008

To enable Server GC, you need to modify either VSTestHost.exe.config or QTAgent.exe.config. If you are not using a Controller and Agent setup, modify VSTestHost.exe.config. If you are using a controller and agents, modify QTAgent.exe.config on each agent machine. The file locations are:

VSTestHost.exe.config - C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE

QTAgent.exe.config - C:\Program Files\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest

To enable server GC, add the <gcServer> line shown below to the runtime section:

<?xml version ="1.0"?>

<configuration>

<runtime>

<gcServer enabled="true" />

<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">

<probing

privatePath="PrivateAssemblies;PublicAssemblies"/>

</assemblyBinding>

</runtime>

</configuration>

In 2010

The agent service in VS 2010 is now set to Server GC by default. No need to take any action here.

Location of list of all agents available to a controller

To retrieve a list of agents assigned to a controller without using the VS IDE, look in:

In 2008

<install point>\Microsoft Visual Studio 9.0 Team Test Load

Agent\LoadTest\QTControllerConfig.xml

In 2010

<install point>\Microsoft Visual Studio

10.0\Common7\IDE\QTControllerConfig.xml



--NEW--Setting the Agents to run in 64 bit mode

By default, Visual Studio 2010 agents run in 32 bit mode. To set the agents to 64 bit you must configure

each testsettings file you plan to use for running test rig tests. To do this, follow these steps:

1) Open the test setting file for editing.

2) Go to the "Hosts" page and choose the correct option.

--NEW--Managing Test Controllers and Agents from VS *AND* from Lab Center

You can administer both the test agents and the test controller. If a test controller is registered with a team

project, you can configure and monitor it and any registered test agents using the Test Controller Manager in

the Lab Center for Microsoft Test Manager. Otherwise, to configure and monitor the test controller and any

registered agents, click Test in Microsoft Visual Studio 2010 and point to Manage Test Controllers

The full article can be found at: http://msdn.microsoft.com/en-us/library/dd695837.aspx


Networks, IP Switching, Test Startups

IP Address Switching anatomy (how it works)

Each agent is assigned a range of up to 256 IP addresses to use. At the start of a test run, the agent

service configures the IP addresses on the network card. When the test starts running, new connections

are round-robined through the pool of IP addresses.
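The round-robin behavior can be sketched in Python (the four-address pool here is invented; a real agent is limited to a configured range of up to 256 addresses):

```python
from itertools import cycle

# Sketch of the round-robin behaviour described above. A real agent is
# limited to a configured range of up to 256 addresses; this 4-address
# pool is invented for the example.
ip_pool = [f"10.99.3.{host}" for host in range(1, 5)]
next_ip = cycle(ip_pool).__next__

# Each new connection takes the next address in the pool, wrapping around.
connections = [next_ip() for _ in range(6)]
print(connections)
```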

The most common use for IP Switching is when load testing against a load balancer. Load balancers typically use the IP address to route requests to a particular Web server in the farm. So if you have 2

agents driving load to 3 Web servers, since all traffic is coming from two IPs (one on each agent), only

two of the web servers would get all the traffic. IP Switching provides a way to have traffic come from

multiple IPs on the same agent, enabling the load balancer to balance load across the farm.

VSTT currently limits the number of unique IP addresses to 256 per agent. In most testing situations, this

will be plenty of addresses. The main place where this limitation might impact you is if you are running a

large test where every single user must have a separate IP Address for some sort of session state. This is

pretty unusual.

In VS 2008, there is no way to make a given virtual user always use the same IP. That is, with IP switching turned on, a given user will use multiple IPs out of the IP pool, and may use different IPs on subsequent iterations.

In VS 2010, the Web test engine tries to ensure that the same user will always use the same IP address,

but there is no guarantee that it will be the case.

The biggest problem with assigning unique IP Addresses to every user is that currently the IP switching

configuration limits you to a range of 256 IP addresses per agent, which would mean you would also be

limited to 256 virtual users per agent. One solution is to use VMs to get multiple load test agents on a

single physical machine.

Gotcha: IP Address Switching is ONLY for WEB TESTS

The IP Switching feature will NOT work with Unit Tests

Gotcha: IP Addresses used for switching are not permanent

When you choose to use multiple IP addresses from each agent machine during load testing (known as

IP address switching or spoofing), most testing tools require you to add those IP addresses to the NIC of

the machine, and they are always available and always show up on the machines. VS allows you to set a

range of IP addresses directly in the test project. Then VS dynamically adds the addresses to the agent(s)

when the test run starts, and removes them when the test run stops. If you need to perform IP

switching, a controller/agent setup is required.


How to Setup IP Switching

There are 2 parts to setting up IP Switching. First, you must configure the Test Rig Agents to use IP

Switching. Then you must tell the Load Test itself that it should take advantage of that. Here are the

steps and the pitfalls involved:

Setting up the agents

1. Open up the Test Rig Administration dialog (Test -> Administer Test Controller)

2. Highlight each of the agents and bring up the Properties for the agent

3. Fill out all of the appropriate information (as outlined in the picture below)

Where to configure Agent Properties


Make sure you pick the correct adapter here. Use the Network Connections properties built into Windows along with the IPCONFIG command to see which NIC is assigned to what subnet (see below).

The base address is 3 octets and should be representative of the subnet you are on. If you are using a class B subnet, you still need a third octet for the base.

The output from the IPCONFIG command in a CMD window.

C:\Documents and Settings>ipconfig

Windows IP Configuration

Ethernet adapter Secondary:

Connection-specific DNS Suffix . :

IP Address. . . . . . . . . . . . : 10.69.200.3

Subnet Mask . . . . . . . . . . . : 255.255.0.0

Default Gateway . . . . . . . . . : 10.69.0.1

Ethernet adapter Primary:

Connection-specific DNS Suffix . :

IP Address. . . . . . . . . . . . : 10.99.3.3

Subnet Mask . . . . . . . . . . . : 255.255.0.0

Getting the proper IP Address info for spoofing

The information as shown in the Network Connections dialog box in Windows. You may need to hover the mouse over the NIC to see the entire name of the NIC.


Setting up The Load Test

Once the test rig is setup, you can configure which Load Test will actually use IP Switching by setting the

correct property for the Load Test:

Where to enable IP Switching for the Load Test Itself (after configuring the agents to use it)


Startup: Slowness Restarting a Test Rig with Agents Marked as "Offline"

If you have agent machines that are either disabled (powered off, service stopped, etc) or that no longer

exist, but you only mark them as "Offline" in the "Administer Test Controllers" dialog, restarting the rig

will take a long time. The controller will attempt to contact all agents listed in the dialog regardless of

their status, and it will take approximately one minute or more for each missing machine.

Startup: Multiple Network Cards can cause tests in a rig to not start

Problem: When running tests against a controller and test agents, the tests enter the Pending state but then nothing else happens.

Visual Studio Client Resolution: The problem is that you have two network adapters on the client machine. The following entries in the controller log confirm that this is the problem:

[I, 2972, 11, 2008/06/26 13:02:59.780] QTController.exe: ControllerExecution: Calling back to client for deployment settings.
[E, 2972, 11, 2008/06/26 13:06:51.155] QTController.exe: StateMachine(RunState): Exception while handling state Deploying: System.Net.Sockets.SocketException: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond 65.52.230.25:15533

This is exactly the type of error message we see when controller communication with Visual Studio fails because the client has multiple network cards. To configure your Visual Studio installation to communicate with the controller, try this in regedit:

1. Find the key: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools
2. Add a new key under the above key named "ListenPortRange".
3. In the key "ListenPortRange", add a new string value with the name "BindTo" and the IPv4 address of the client (65.52.230.25 in this case) as the BindTo value.
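The same registry change can be captured in a .reg file for reuse across client machines (a sketch; substitute your own client's IPv4 address for the example value):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\ListenPortRange]
"BindTo"="65.52.230.25"
```

Double-clicking the file (or importing it with regedit) creates the ListenPortRange key and the BindTo value in one step.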

Test Rig Resolution:

Read the following support article for the steps to resolve this issue on a test rig:

http://support.microsoft.com/kb/944496


Startup: Slow startup can be caused by _NT_SYMBOL_PATH environment variable

If you have the environment variable _NT_SYMBOL_PATH defined on your systems, your tests may stay in the "pending" state for a long time. This happens whenever the symbol path defines a symbol server that is external to your environment and you do not have a local cache of symbols available. To work around this, do one of the following:

1. Remove _NT_SYMBOL_PATH from the environment where you start devenv.exe.
2. Change _NT_SYMBOL_PATH by putting a cache location in front of the symbol store location.

For more information about symbol paths and symbol servers, go to:

http://msdn.microsoft.com/en-us/library/ms681416(VS.85).aspx
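For example, assuming a symbol server share of \\symbols\symbols (the share name is illustrative), the second workaround would produce a variable value like this, with a local cache directory placed ahead of the server:

```
_NT_SYMBOL_PATH=cache*C:\symcache;srv*\\symbols\symbols
```

With the cache in front, symbols are fetched over the network only once and served from C:\symcache on later runs.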

Startup: tests on a Rig with Agents on a Slow Link

The load test does not actually start on any agents until deployment of all files has occurred to all agents. (This means that the slow startup of a load test on a rig with many agents can be caused by slow deployment to just one or more agents.)

A common root cause is the _NT_SYMBOL_PATH variable defined in the environment, pointing to a somewhat slow symbol server (like \\symbols\symbols).

Try one of these workarounds:

- Undefine _NT_SYMBOL_PATH in the environment where you start devenv.exe.
- Change _NT_SYMBOL_PATH by putting a cache in front, such as cache*c:\symcache. This will make the first run just as slow, but all subsequent runs fast.

--NEW-- Startup: tests on a Rig with several Agents

Since performance counters are fired up serially, adding many agents to a test rig can cause the test to be very slow to start. Currently, you can either:

- Build custom countersets that use fewer counters (OK solution)
- Remove the Agent countersets from the load test (fastest solution)

"Not Bound" Exception when using IP Switching is not really an error

The error below may appear several times when running a load test that uses IP Switching. In most cases, it can be ignored.

00:51:35 AGENT02 <none> <none> <none> Exception LoadTestException 151 Web test requests were not bound to either the correct IP address for IP switching, or the correct port number for network emulation, or both.

New to 2010


The one situation where the presence of this error may indicate a real issue with the test is when the application relies on a given iteration always arriving on the same IP address for purposes of maintaining a session (such as a load balancer like Microsoft ISA Server with the IP Sticky setting turned on).


How to configure the timeout for deployment of load tests to agents

You might encounter timeouts when deploying load tests to agents when the deployment contains many or large files. In that case, you can increase the timeout for deployment. The default value is 300 seconds.

In 2010

You have to change the .testsettings file that corresponds to your active test settings in Visual Studio, because the deployment timeout setting is not exposed via the Visual Studio UI. Check via the menu Test | Select Active Test Settings (Visual Studio 2010) which file is active. You can find the file in the Solution Items folder of your solution. Open it in the XML editor by right-clicking it, choosing "Open With…" and selecting "XML (Text) Editor".

The TestSettings element will have an Execution element. Add a child element called Timeouts, if not already present, to the Execution element. Give it a deploymentTimeout attribute with the desired timeout value in milliseconds. For example:

<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="Controller" id="330da597-4a41-4ae7-8b95-60c32ab793fb"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  (…)
  <Execution location="Remote">
    <Timeouts deploymentTimeout="600000" />

IntelliSense should help you out when adding/editing this.

In 2008

In 2008 you have to change the .testrunconfig file that corresponds to your active test run configuration. Check via the menu Test | Select Active Test Run Configuration which file is active. You can find the file in the Solution Items folder of your solution. Add a child element Timeouts under the TestRunConfiguration element if no such element is already present, and give it a deploymentTimeout attribute with the desired timeout value in milliseconds. For example:

<?xml version="1.0" encoding="UTF-8"?>
<TestRunConfiguration name="Controller" id="af5824b3-56fa-4534-a3f8-6e763a56869a"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2006">
  <Timeouts deploymentTimeout="600000"/>

IntelliSense should help you out when adding/editing this.

Changed in 2010


Customizing and extending the available network emulation settings

Visual Studio 2010 added the feature of handling network emulation within the test harness. This functionality is based on a toolkit that was unofficially released as NEWT (http://blog.mrpol.nl/2010/01/14/network-emulator-toolkit/).

The default profiles within Visual Studio are somewhat limited, but they can be enhanced by making additional emulation files or by modifying the existing files.

The location of the emulation files is:

The sample on the next page shows some of the items that can be set and changed. If you create a new file, save it as a "*.NETWORK" file in the above directory. The name you assign the profile in the XML is what will be displayed inside Visual Studio.

If you already have custom profiles you created with NEWT, just make sure to add the element

<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">

before the <Emulation> tag and to close it after the </Emulation> tag.
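As a minimal sketch, a NEWT-created profile wrapped for Visual Studio ends up shaped like this (the profile name is a placeholder, and the comment stands in for your existing emulation content):

```xml
<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Emulation>
    <!-- your existing NEWT <VirtualChannel> content goes here, unchanged -->
  </Emulation>
</NetworkEmulationProfile>
```

Only the wrapper element and its xmlns attribute are new; the inner NEWT markup does not need to change.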

New to 2010


<NetworkEmulationProfile name="NAME_OF_PROFILE_HERE"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Emulation>
    <VirtualChannel name="VirtualChannel 1" DispatchType="packet">
      <FilterList>
        <Filter name="FILTER_1" not="0"/>
      </FilterList>
      <VirtualLink name="LINK_1" instances="1">
        <LinkRule dir="upstream"/>
        <LinkRule dir="downstream"/>
      </VirtualLink>
    </VirtualChannel>
    <VirtualChannel name="VirtualChannel 2" DispatchType="packet">
      <FilterList>
        <Filter name="FILTER_3" not="0">
          <IpVersion>IPv4</IpVersion>
          <Local>
            <IpAddress>255.255.255.255</IpAddress>
            <IpMask>0.0.0.0</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Local>
          <Remote>
            <IpAddress>127.0.0.1</IpAddress>
            <IpMask>255.255.255.255</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Remote>
        </Filter>
        <Filter name="FILTER_2" not="0">
          <IpVersion>IPv6</IpVersion>
          <Local>
            <IpAddress>FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF</IpAddress>
            <IpMask>0000:0000:0000:0000:0000:0000:0000:0000</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Local>
          <Remote>
            <IpAddress>0000:0000:0000:0000:0000:0000:0000:0001</IpAddress>
            <IpMask>FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF</IpMask>
            <PortBegin>0</PortBegin>
            <PortEnd>65535</PortEnd>
          </Remote>
        </Filter>
      </FilterList>
      <VirtualLink name="LINK_2" instances="1">
        <LinkRule dir="upstream">
          <Bandwidth>
            <Speed unit="kbps">100000</Speed>
            <QueueManagement>
              <NormalQueue>
                <Size>100</Size>
                <QueueMode>packet</QueueMode>
                <DropType>DropTail</DropType>
              </NormalQueue>
            </QueueManagement>
          </Bandwidth>
        </LinkRule>
        <LinkRule dir="downstream"/>
      </VirtualLink>
    </VirtualChannel>
  </Emulation>
</NetworkEmulationProfile>


--NEW-- Socket Exception: "An operation on a socket could not be performed because…"

PROBLEM:

I am running load tests using VS 2008. I am seeing thousands of socket exceptions:

Exception: SocketException
Last Message: An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full

RESOLUTION:

While this issue can be caused by a couple of things, one place to look for issues is described in the following link: http://blogs.msdn.com/b/sql_protocols/archive/2009/03/09/understanding-the-error-an-operation-on-a-socket-could-not-be-performed-because-the-system-lacked-sufficient-buffer-space-or-because-a-queue-was-full.aspx

"UNDERSTANDING THE ERROR "AN OPERATION ON A SOCKET COULD NOT BE PERFORMED BECAUSE THE SYSTEM LACKED SUFFICIENT BUFFER SPACE OR BECAUSE A QUEUE WAS FULL.""

--NEW-- Failed scenarios when client is in a non-trusted domain

PROBLEM:

What scenarios will work and what will fail when a VS client is in a non-trusted domain?

RESOLUTION:

<NOTE: This information is taken directly from work done by a member of the testing product team,

therefore the information is strictly provided "AS IS">

I have tried to capture what works and what needs workarounds for test and lab management capabilities in such a no-trust multi-domain topology, along with my recommendation.

Topology and assumptions:

- Corp domain: TFS server (AT and DT), build controller, and test controller are always in corp
- Dev domain: Visual Studio (VS) and Microsoft Test Manager (MTM) are always in dev
- Trust: No trust between corp and dev
- DNS: DNS servers are able to resolve names in both domains and there is a line of sight
- Firewalls: The required ports are opened on both sides as mentioned in the 'Network Ports and Protocols' section of this topic: http://msdn.microsoft.com/en-us/library/ms252473.aspx

The above setup is needed to use all TFS capabilities minus lab management. Lab management introduces the following new components into the system:

1. System Center Virtual Machine Manager server (VMM)
2. Hyper-V hosts managed by VMM
3. Virtual machines (VMs) that run on the hosts (these are part of the environment)

It doesn't matter where you put the VMs, whether on dev, corp, or some other domain or workgroup. All scenarios work fine. Based on where you place the VMM server and Hyper-V hosts, some scenarios will not work or will need workarounds. I will describe these two topologies (as T1 and T2) and which scenarios work where.

T1

- VMM server: corp domain
- Hyper-V hosts: dev domain

T2

- VMM server: corp domain
- Hyper-V hosts: corp domain (note: you can join VMs to any domain even if you put the hosts on corp)

Summary

As you can see below, having all components (TFS, VS, MTM, VMM, hosts) in the same domain is the best case. If you split the components across two un-trusted domains, the capabilities will still work (with some workarounds, called out below) if you set up the components as in T2. Some capabilities will not work in T1. So, if the customer wants to use lab management in a multi-domain setup, the recommendation is to use T2, and be aware of the workarounds/additional steps that need to be done. I have called them out below.

Test and Lab Management Capabilities

Sl No | Visual Studio 2010 Capability/Feature                              | T1  | T2  | All components in same domain

1. Testing on local machine
   i.   Running manual tests locally from MTM                              | √   | √   | √
   ii.  Running automated tests locally from VS                            | √   | √   | √
2. Testing on virtual environments
   i.   Running manual tests from MTM with remote data collection          | X   | O*  | √
   ii.  Running automated tests on environments                            | X   | √   | √
3. Testing on physical environments
   i.   Running manual tests from MTM with remote data collection          | O*  | O*  | √
   ii.  Running automated tests on environments                            | O** | O** | √
4. Build deployment
   i.   Automated build-deploy-test workflow                               | X   | √   | √
5. Environment creation and management
   i.   Create environment from VM templates                               | √   | √   | √
   ii.  Start/stop/snapshot environment                                    | √   | √   | √
   iii. Connect to environment using Environment Viewer                    | √   | √   | √
   iv.  Clone environments using network isolation                         | X   | √   | √
6. Admin operations
   i.   Add hosts to the VMM server                                        | O^  | √   | √

√: Supported out of the box (OOB); O: Not supported OOB, but possible with workarounds; X: Not supported

Notes on the workarounds:

* Microsoft Test Manager (in the dev domain) needs to talk to the test controller (in the corp domain) in this case. For this authentication to work, you need to use cached credentials or shadow accounts (same user name/password on both machines) on the client machine to talk to the test controller.

** Some extra steps are needed for test agent/controller communication to work across domains. Review the 'Requirements for Workgroups and Multiple Domains' section here: http://msdn.microsoft.com/en-us/library/dd648127.aspx

^ Adding a host in an untrusted domain involves extra steps, including manually installing the VMM agent.


Performance Counters and Data

Customizing the Available Microsoft System Monitor counter sets

The counter set templates for VS are located in the following directory (assuming a typical install):

In 2008

C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\Templates\LoadTest\CounterSets

In 2010

(x86) C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\Templates\LoadTest\CounterSets
(x64) C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\Templates\LoadTest\CounterSets

These files are standard XML files and can be modified to allow for quick and easy re-use of custom sets. It is recommended that you copy the counter set you wish to enhance and add the name CUSTOM to it so you will always remember that it is a custom counter set. Or you can create your own totally independent counter set. The following shows the layout of the file:


<?xml version="1.0" encoding="utf-8"?>
<CounterSet Name="Custom" CounterSetType="Custom Set">
  <CounterCategories>
    <CounterCategory Name="Memory">
      <Counters>
        <Counter Name="% Committed Bytes In Use" />
        <Counter Name="Available Mbytes" />
      </Counters>
    </CounterCategory>
    <CounterCategory Name="Processor">
      <Counters>
        <Counter Name="% Processor Time">
          <ThresholdRules>
            <ThresholdRule Classname="Microsoft.VisualStudio.TestTools.WebStress.Rules.ThresholdRuleCompareConstant, Microsoft.VisualStudio.QualityTools.LoadTest">
              <RuleParameters>
                <RuleParameter Name="AlertIfOver" Value="True" />
                <RuleParameter Name="WarningThreshold" Value="80" />
                <RuleParameter Name="CriticalThreshold" Value="95" />
              </RuleParameters>
            </ThresholdRule>
          </ThresholdRules>
        </Counter>
      </Counters>
      <Instances>
        <Instance Name="*" />
      </Instances>
    </CounterCategory>
    <CounterCategory Name="PhysicalDisk">
      <Counters>
        <Counter Name="% Disk Read Time" Range="100" />
        <Counter Name="% Idle Time" Range="100" HigherIsBetter="true">
          <ThresholdRules>
            <ThresholdRule Classname="Microsoft.VisualStudio.TestTools.WebStress.Rules.ThresholdRuleCompareConstant, Microsoft.VisualStudio.QualityTools.LoadTest">
              <RuleParameters>
                <RuleParameter Name="AlertIfOver" Value="False" />
                <RuleParameter Name="WarningThreshold" Value="40" />
                <RuleParameter Name="CriticalThreshold" Value="20" />
              </RuleParameters>
            </ThresholdRule>
          </ThresholdRules>
        </Counter>
        <Counter Name="Avg. Disk Bytes/Read" RangeGroup="DiskBytesRate" />
        <Counter Name="Avg. Disk Bytes/Transfer" RangeGroup="DiskBytesRate" />
        <Counter Name="Avg. Disk Bytes/Write" RangeGroup="DiskBytesRate" />
        <Counter Name="Avg. Disk Queue Length" RangeGroup="Disk Queue Length" />
        <Counter Name="Split IO/Sec" RangeGroup="Disk Transfers sec" />
      </Counters>
      <Instances>
        <Instance Name="*" />
      </Instances>
    </CounterCategory>
  </CounterCategories>
</CounterSet>

Notes:

- The ThresholdRule Classname value all needs to be on one line. Make sure you format it properly when putting it in the final file.
- New to 2010: HigherIsBetter is used for highlighting better or worse results in the Excel reports.
- New to 2010: Range specifies the graph range.
- New to 2010: RangeGroup uses a common range for all counters in that range group.


Performance Counter Considerations on Rigs with slow links

Having a slow WAN between the controller and agents may definitely cause some timeouts or delays in performance counter collection. Each performance counter category is read in a separate operation: that's one method call at the level of the .NET classes that we call, and I don't know if each call results in just one or more than one network read.

There are some timeout settings for performance counter collection that you can change by editing the QTController.exe.config file (or the VSTestHost.exe.config file when running locally on VS 2008, or devenv.exe.config for 2010) and adding these lines:

<appSettings>
  <add key="LoadTestCounterCategoryReadTimeout" value="9000"/>
  <add key="LoadTestCounterCategoryExistsTimeout" value="30000"/>
</appSettings>

The values are in ms, so 9000 is 9 seconds. If you make this change, also change the load test sample rate to be larger than this: at least 10, or preferably 15, seconds. And yes, with many agents located far from the controller, it is recommended to delete most of the categories in the Agent counter set (perhaps just leave Processor and Memory).

The .NET API that is used to read the performance counters is PerformanceCounterCategory.ReadCategory(), so the entire category is read even if the counter set definition only includes one counter and one instance. This is a limitation at the OS level in the way performance counters are read.

The defaults in VS are:

LoadTestCounterCategoryReadTimeout: 2000 ms (2 seconds)
LoadTestCounterCategoryExistsTimeout: 10000 ms (10 seconds)


Increase the performance counter sampling interval for longer tests

Choose an appropriate value for the "Sample Rate" property in the Load Test Run Settings based on the length of your load test. A smaller sample rate, such as the default value of five seconds, requires more space in the load test results database. For longer load tests, increasing the sample rate reduces the amount of data collected.

Here are some guidelines for sample rates:

Load Test Duration    Recommended Sample Rate
< 1 Hour              5 seconds
1 - 8 Hours           15 seconds
8 - 24 Hours          30 seconds
> 24 Hours            60 seconds

Changing the default counters shown in the graphs during testing

If you want to change the default set of counters that show up in the graphs when you start a test, you can go into each of the .counterset XML files (same directory as above) and set or add to the DefaultCounter entries in the following section (at the bottom of the files):

<DefaultCountersForAutomaticGraphs>
  <DefaultCounter CategoryName="Memory" CounterName="Available MBytes"/>
</DefaultCountersForAutomaticGraphs>

Possible method for fixing "missing perfmon counters" issues

On the controller machine for your rig, map a drive to each of the machines you will be collecting performance counters from within the load test. Then, before you kick off a test, open each drive you mapped and verify that you have connectivity. Leave the window open during the test.
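For example, from a command prompt on the controller (a sketch; SERVER01 is a placeholder for one of your monitored machines, and the administrative C$ share assumes you have admin rights on the target):

```
net use X: \\SERVER01\C$ /persistent:no
```

Repeat with a different drive letter for each monitored machine; the mapped drives double as a quick connectivity and credentials check before the run.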


How and where Performance data gets collected

There are two types of data collected by VS during a test run: real perfmon counters and pseudo perfmon counters. All real perfmon counters are collected directly by the VS controller machine.

In the Load Test editor, all of the performance counter categories that start with "LoadTest:" (see the LoadTest counter set in the load test editor) are data that is collected on the agents by the load test runtime engine. These are not real Perfmon counters in the sense that if you try to look at them with Perfmon you won't see them, though we make them look like Perfmon counters for consistency in the load test results database and display. The agents send some of this data (see below) in messages to the controller every 5 seconds, and the controller rolls up the agent data (e.g. Requests/sec across the entire rig rather than per agent). The controller returns the rolled-up results to Visual Studio for display during the run and also stores them in the load test results database.

[Requests Per Second Counters] The VS RPS value does not count cached requests, even though VS is sending an HTTP GET with If-Modified-Since headers.

What data is sent every 5 seconds? We do everything possible to limit how much data is sent back in that message. What we do send back is the average, min, and max values for all of the pseudo performance counters in the categories that start with "LoadTest:" that you see under the "Overall", "Scenarios" and "Errors" nodes in the load test analyzer tree (nothing under the "Machines" node). Note that the biggest factor in the size of these result messages is the number of performance counter instances, which for Web tests is mostly determined by the number of unique URLs reported on during the load test. We also send back errors in these 5-second messages, but details about the failed requests are not sent until the end of the test, so tests with lots of errors will have bigger messages. Lastly, we only send back metadata such as the category names and counter names once and use numeric identifiers in subsequent messages, so the messages at the start of the load test may be slightly larger than later messages.

One thing you could do to reduce the size of the messages is to reduce the level of reporting on dependent requests. You could do this by setting the "RecordResult" property of the WebTestRequest object to false. This eliminates the page- and request-level reporting for that request, but you could add a transaction around that single request, and the transaction time would closely match the page time for that request.
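In a coded web test, that combination might look like the following sketch (inside a class deriving from WebTest; the URL and transaction name are placeholders):

```csharp
public override IEnumerator<WebTestRequest> GetRequestEnumerator()
{
    // Suppress page- and request-level reporting for this request,
    // shrinking the 5-second result messages sent to the controller...
    WebTestRequest request = new WebTestRequest("http://contoso/home.aspx");
    request.RecordResult = false;

    // ...but wrap it in a transaction so a comparable timing
    // (close to the page time) is still recorded for it.
    this.BeginTransaction("HomePage");
    yield return request;
    this.EndTransaction("HomePage");
}
```

This keeps one named timing per page you care about while dropping the per-URL counter instances for that request.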


--NEW-- How to remove Controller and Agent countersets

You cannot directly remove Controller and Agent counters in the "Manage Countersets" dialog, nor can you delete them from the mappings in the "Run Settings." In order to remove them, you need to delete the actual countersets:

When you choose this, you will get the following dialog:


--NEW-- Controller and Agent countersets show up in test even after removing them

If you delete the Controller and Agent countersets, they will not be in the load test. HOWEVER, as soon as you manage countersets and change any other counters, VS will put the Controller and Agent counters back in the test:

1. Below, there are no Controller/Agent counters.
2. We add a single counterset to a machine already defined.
3. After the addition, the counters are back in the countersets AND the counterset mappings.


Data and Results

Custom Data Binding in UNIT Tests

The first thing to do is create a custom class that does the data initialization (as described in the first part of this post: http://blogs.msdn.com/slumley/pages/custom-data-binding-in-web-tests.aspx). Next, instantiate the class inside your unit test as follows:

[TestClass]
public class VSClass1
{
    private TestContext testContextInstance;

    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [ClassInitialize]
    public static void ClassSetup(TestContext a)
    {
        string m_ConnectionString = @"Provider=SQLOLEDB.1;Data Source=dbserver;Integrated Security=SSPI;Initial Catalog=Northwind";
        CustomDs.Instance.Initialize(m_ConnectionString);
    }

    [TestMethod]
    public void Test1()
    {
        Dictionary<string, string> dictionary = CustomDs.Instance.GetNextRow();
        //......Add the rest of your code here.
    }
}

Verifying saved results when a test hangs in the "In Progress" state after the test has finished

If you run a test and either the test duration or the number of iterations needed for completion of the test has been reached, but the test stays in the "In Progress" state for a long time, you can check whether all of the results have been written to the load test results repository by running this SQL query against the LoadTest database:

select LoadTestName, LoadTestRunId, StartTime, EndTime
from LoadTestRun
where LoadTestRunId = (select max(LoadTestRunId) from LoadTestRun);

If the EndTime has a non-NULL value, then the controller is done writing results to the load test results database and it should be safe to restart the rig (killing anything if needed).

This doesn't necessarily mean that all results from all agents (if the agents got hung) were successfully written to the load test database, but it does mean that there's no point in waiting before killing the agents/tests.


The metrics during and after a test differ from the results seen.

Scenario 1:

When you run load tests and look at the numbers you get while the tests are running, the values you see may not be the same values that you get when you load the completed test results at a later point. This behavior is not unexpected, based on warmup and cooldown settings.

Comparison of a test with and without warmup. Notice the total number of tests run is different, but the recorded times are close enough to be valid for reporting.

Scenario 2:

When you compare the summary page results to the detailed results values, there can be a difference in what is reported. This is due to the implementation of collecting the timing details, which are currently flushed when a test iteration ends. For iterations that are in progress with in-flight requests, we give the iteration 10 seconds (configurable via cooldown) to complete any in-flight requests. If they do not complete, the transactions in those iterations are not counted in the details, but are counted in the summary page.


(The "200 OK" and "304 Not Modified" counts shown below come from the IIS logs.)

How new users and return users affect caching numbers

Comparing VS Results to IIS Results for 100% new vs. 100% return

This section shows how VS handles caching and how to interpret the numbers shown for total requests and cached requests.

TOR 09 - Caching - ReturnUsers
  HTM: 268   HTML: 263   GIF: 83   BMP: 32719
  200 OK: 3871   304 Not Modified: 29462
  VS Requests: 33,333
  VS Requests Cached: 84,507

TOR 10 - Caching - NewUsers
  HTM: 276   HTML: 271   GIF: 276   BMP: 90243
  200 OK: 46639   304 Not Modified: 44427
  VS Requests: 89,384
  VS Requests Cached: 43,758

Comparing the same tests using the Content Expiration setting

TOR 12 - Caching - ReturnUsers - Content Expiration
  HTM: 270   HTML: 264   GIF: 85   BMP: 3330
  200 OK: 3874   304 Not Modified: 75
  VS Requests: 3,949
  VS Requests Cached: 84,842

TOR 11 - Caching - NewUsers - Content Expiration
  HTM: 268   HTML: 262   GIF: 268   BMP: 44622
  200 OK: 45286   304 Not Modified: 134
  VS Requests: 44,742
  VS Requests Cached: 42,090

Comparing New Users to Return Users (with respect to caching):

New users are simulated by "clearing" the cache at the start of each new iteration, whereas the cache is carried from iteration to iteration for return users. This results in many more requests being cached with return users.

NOTE: The total number of requests made by VS is the sum of the two VS values. In other words, "Total Requests" in the IDE does not include cached requests.

Looking at the impact of "content expiration" on the overall network and web server activity (for more information, see the section "Add an Expires or a Cache-Control Header" from http://developer.yahoo.com/performance/rules.html):

Notice that VS honors the content expiration (this is actually handled by the underlying System.Net component). However, VS still reports the cached file request, even though no call went out on the wire. This is expected behavior, since the request was a part of the site. In order to see how many requests went on the wire, you need to use IIS logs or network traces.
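As a quick sanity check on the NOTE above, the two values reported in the IDE reconstruct the full request volume simulated by VS. Using the TOR 09 numbers from the table (simple arithmetic, no assumptions beyond the note itself):

```python
# "Total Requests" in the IDE excludes cached requests, so the full
# request volume simulated by VS is the sum of the two reported values.
vs_requests = 33333         # TOR 09: VS Requests
vs_requests_cached = 84507  # TOR 09: VS Requests Cached

total_simulated = vs_requests + vs_requests_cached
print(total_simulated)  # → 117840
```

Only the "VS Requests" portion corresponds to traffic that actually went out on the wire.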


Notes:

All 4 tests above were run for the same duration with the same number of users executing the same

test.

Although the numbers do not match exactly, they are close enough to show the behavior of the

tests. The discrepancy is due to a few things, including cool down of the test and the possible mis-

alignment of the query I used to gather data from the IIS logs.

The IIS Log items for "200 –OK" and "304-Not Modified" were gathered using LogParser and the

following query:

SELECT sc-status, COUNT(*) AS Total
FROM *.log
WHERE to_timestamp(date, time) BETWEEN
      timestamp('2010-02-12 02:13:22', 'yyyy-MM-dd hh:mm:ss') AND
      timestamp('2010-02-12 02:18:22', 'yyyy-MM-dd hh:mm:ss')
GROUP BY sc-status
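The same count can be reproduced without LogParser; this is a minimal sketch against a W3C-format IIS log, where the sample log content and time window below are invented for illustration:

```python
from collections import Counter
from datetime import datetime

# Column names for data rows come from the #Fields directive, just as
# LogParser resolves them. The log lines here are made up.
sample_log = """#Fields: date time cs-uri-stem sc-status
2010-02-12 02:14:01 /default.aspx 200
2010-02-12 02:15:30 /logo.gif 304
2010-02-12 02:30:00 /default.aspx 200
"""

start = datetime(2010, 2, 12, 2, 13, 22)
end = datetime(2010, 2, 12, 2, 18, 22)

fields, counts = [], Counter()
for line in sample_log.splitlines():
    if line.startswith("#Fields:"):
        fields = line.split()[1:]          # remember the column layout
    elif line and not line.startswith("#"):
        row = dict(zip(fields, line.split()))
        ts = datetime.strptime(row["date"] + " " + row["time"],
                               "%Y-%m-%d %H:%M:%S")
        if start <= ts <= end:             # same BETWEEN filter as the query
            counts[row["sc-status"]] += 1

print(counts)  # Counter({'200': 1, '304': 1}) — the 02:30 entry falls outside the window
```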

Data sources for data-driven tests are read only once

When a data-driven test initializes, the data is read ahead of time and retrieved only once. Therefore there is no need to optimize the connection to the data source.


Consider including Timing Details to collect percentile data

There is a property on the Run Settings in the Load Test Editor named "Timing Details Storage". If Timing Details Storage is enabled, the time to execute each individual test, transaction, and page during the load test is stored in the load test results repository. This allows 90th and 95th percentile data to be shown in the load test analyzer in the Tests, Transactions, and Pages tables. VS 2010 adds 99th percentile and standard deviation statistics, and in VS 2010 this setting is on by default. Consider turning it off for very large load tests: with a many-agent test it can take up to half the duration of the load test to process all the timing details. In other words, a 12 hour load test running on 30 agents could take 6 hours to collect and crunch all the data. In VS 2010, the details data is also used to populate the virtual user activity chart.

The amount of space required in the load test results repository to store the Timing Details data may be very large, especially for longer running load tests. The time to store this data in the repository at the end of the load test is also longer, because the data is kept on the load test agents until the load test has finished executing, at which time it is written to the repository. For these reasons, Timing Details is disabled by default in VS 2008. However, if sufficient disk space is available in the load test results repository, you may wish to enable Timing Details to get the percentile data.

Note that there are two choices for enabling Timing Details in the Run Settings properties: "StatisticsOnly" and "AllIndividualDetails". With either option, all of the individual tests, pages, and transactions are timed, and percentile data is calculated from the individual timing data. The difference is that with the StatisticsOnly option, once the percentile data has been calculated, the individual timing data is deleted from the repository. This reduces the amount of space required in the repository when using Timing Details. However, advanced users may want to process the timing detail data in other ways using SQL tools, in which case the AllIndividualDetails option should be used so that the timing detail data remains available for that processing.


Consider enabling SQL Tracing through the Load Test instead of separately

There is a set of properties on the Run Settings in the Load Test Editor that allow the SQL tracing feature of Microsoft SQL Server to be enabled for the duration of the load test. If enabled, this allows SQL trace data to be displayed in the load test analyzer on the "SQL Trace" table available in the Tables dropdown. This is a fairly easy-to-use alternative to starting a separate SQL Profiler session while the load test is running to diagnose SQL performance problems. To enable this feature, the user running the load test (or the controller user in the case of a load test run on a rig) must have the SQL privileges needed to perform SQL tracing, and a directory (usually a share) where the trace file will be written must be specified. At the completion of the load test, the trace file data is imported into the load test repository and associated with the load test that was run, so it can be viewed at any later time using the load test analyzer.

How to collect SQL counters from a non-default SQL instance

If you want to collect performance counters from a SQL Server instance while running a load test, you can do this easily by checking the SQL counter set in the "Manage Counter Sets" dialog in the VS load test editor. Doing this includes the default counter set for SQL Server in your load test. The performance counter category names specified in this counter set begin with "SQLServer:", for example "SQLServer:Locks".

However, if you are trying to monitor a SQL Server instance that is not the default instance, the performance counter categories for that instance have different names. For example, if your SQL Server instance is named "INST_A", the Locks category is named "MSSQL$INST_A:Locks". To change the load test to collect these performance counters, the easiest approach is to open the .loadtest file with the XML editor or a text editor and replace all instances of "SQLServer:" with "MSSQL$INST_A:" (correcting the replacement string for your instance name).
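The find-and-replace can be scripted; this is a hedged sketch where "INST_A" and the XML fragment are placeholders standing in for your instance name and your actual .loadtest content:

```python
# Swap the default "SQLServer:" category prefix for a named instance's
# "MSSQL$<instance>:" prefix, as described above. The fragment below is an
# invented stand-in for the real .loadtest XML.
loadtest_xml = '<CounterCategory Name="SQLServer:Locks" />'

instance = "INST_A"  # placeholder — substitute your instance name
patched = loadtest_xml.replace("SQLServer:", f"MSSQL${instance}:")

print(patched)  # <CounterCategory Name="MSSQL$INST_A:Locks" />
```

In practice you would read the whole .loadtest file, apply the same `replace`, and write it back.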

How 90% and 95% response times are calculated

Within the load test results summary page, the percentile values mean that:

90% of the total transactions were completed in less than <time> seconds
95% of the total transactions were completed in less than <time> seconds

The calculation of the percentile data for transactions is based not on the sampled data that is shown in the graph, but on the individual timing details data that is stored in the LoadTestTransactionDetail table. The calculation is done using a SQL stored procedure that orders the data by the slowest transaction times, uses the SQL "TOP 10 PERCENT" clause to find the slowest 10% of transactions, then uses the MIN() function on that set of rows to get the value for the 90th percentile time. The stored procedure in the LoadTest database that does this is "Prc_UpdateTransactionPercentiles".
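The stored-procedure logic can be sketched in a few lines; the sample times below are invented:

```python
# Mirror of the described calculation: order by slowest transaction times
# (ORDER BY ... DESC), take the slowest (100 - pct)% (TOP n PERCENT), then
# MIN() of that set is the percentile value.
times = [0.8, 1.1, 0.9, 2.5, 1.0, 1.3, 0.7, 3.0, 1.2, 0.6]  # seconds

def percentile(samples, pct):
    slowest = sorted(samples, reverse=True)                  # slowest first
    top_n = max(1, int(len(slowest) * (100 - pct) / 100))    # TOP (100-pct) PERCENT
    return min(slowest[:top_n])                              # MIN() over that set

print(percentile(times, 90))  # 3.0 — the slowest 10% of 10 samples is one row
```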


Transaction Avg. Response Time vs. Request Avg. Response Time

For each HTTP request (including each dependent request) there is a request response time, and these are all averaged to get the "Avg. Response Time" that appears on the default graph and on the Requests table in the load test analyzer. There is also the "Avg. Page Time" (seen on the Pages table; it can be graphed, but is not by default), which is the average time to download the request that is in the web test plus the time to download all dependents (dependents may be downloaded in parallel). For transactions there are two counters: "Avg. Response Time" and "Avg. Transaction Time". The former is the average of the sum of all of the page times (without the think times), and the latter is the same but includes the think times.

For more descriptions see this online doc page: http://msdn.microsoft.com/en-us/library/ms404656.aspx.
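The relationship between the two transaction counters can be illustrated with invented numbers for one transaction containing two pages:

```python
# Sketch of the counters described above (all values invented).
page_times = [0.5, 0.25]   # seconds to download each page plus its dependents
think_times = [3.0, 5.0]   # think time after each page

# "Avg. Response Time" for the transaction: sum of page times, no think times.
avg_response_time = sum(page_times)
# "Avg. Transaction Time": the same, but with think times included.
avg_transaction_time = sum(page_times) + sum(think_times)

print(avg_response_time)     # 0.75
print(avg_transaction_time)  # 8.75
```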

Considerations for the location of the Load Test Results Store

When the Visual Studio Team Test Controller is installed, the Load Test Results Store is set up to use an instance of SQL Express that is installed on the controller computer. SQL Express is limited to using a maximum of 4 GB of disk space. If you are going to run many load tests and want to keep the results for a while, you should consider configuring the Load Test Results Store to use an instance of the full SQL Server product if available. See the Visual Studio Team Test documentation for instructions on setting up the database to be used as the Load Test Results Store.

--UPDATED-- Set the recovery model for the database to simple

VS 2008 – By default, the recovery model in SQL Server for the Load Test Results Store is set to "full". You should change this to simple.

VS 2010 – There was a change in the way the recovery model was configured in the loadtestresultsrepository.sql command that ships with VS 2010, but the change does not take effect due to a different command further down in the script. This issue is known and will be resolved in a future version.

To change either version, open SQL Management Studio and connect to the server that has the LoadTest/LoadTest2010 database. Right click the LoadTest/LoadTest2010 database in "Object Explorer" and choose "Properties". Go to the "Options" page and change the "Recovery Model" drop down to Simple.

VS 2010 SP1 – The DB recovery model is set to simple. There is no need to do anything.


How to clean up results data from runs that did not complete

If a load test run aborts abnormally and does not show data in a consistent manner (or does not show up in the list of runs as either completed or aborted), you can use the following query on the SQL repository server to clean up the database:

update LoadTestRun set Outcome='Aborted' where Outcome='InProgress'

The Outcome field does not receive a final value until the test either completes or is manually aborted, and any test results in the DB cannot be accessed through the GUI until the Outcome field has one of the two values 'Completed' or 'Aborted'.

InstanceName fields in the results database are appended with (002), (003), etc.

Question: In the LoadTest databases, the Instance Names are sometimes appended with "(002)", etc. For example, I have a transaction called "Filter Render Request" and in the load test database I have two transactions. Also, I have a URL pointing to RenderWebPartContent and I have several entries. Can someone give me a quick explanation?

Answer: To make a long story short, it is a unique identifier that is used mostly internally to distinguish between cases where you have the same test name in two different scenarios in the load test, or the same page name (simple file name) in different folders in two different requests.

Layout for VS Load Test Results Store

For VS 2008: http://blogs.msdn.com/billbar/articles/529874.aspx

For VS 2010: http://blogs.msdn.com/slumley/archive/2010/02/12/description-of-tables-and-columns-in-vs-2010-load-test-database.aspx

Changed in 2010


How to view Test Results from the GUI

http://blogs.msdn.com/slumley/pages/managing-load-test-results.aspx

SQL Server Reporting Services Reports available for download

http://blogs.msdn.com/slumley/pages/load-test-reports.aspx

How to move results data to another system

VS 2008 introduces a GUI results manager. The manager works on the Load Test Results Store that is currently specified in the "Administer Test Controller" dialog box, or on the local repository. To open the results manager, you must have a load test opened and set as the active window. Then click on the icon shown below:

How to launch the "Manage Load Test Results" dialog box

Once in the manager, choose a controller name from the drop down list (or <local> if you want the results from the local database) and the manager will populate with the tests it finds. You can select whatever test results you wish to move, then choose "export" to move them into a file (compressed, with an extension of .ltrar). That file can be moved to another machine and then imported into a new results store.


Load Test Results without SQL are NOT stored

It is possible to configure a load test to run without a SQL load test repository. This is configured by changing the Storage Type property on the Run Settings node in the load test editor. There are two options for this property: Database and None. I strongly recommend that you always use Database. If you run a load test with the storage property set to None, the results are only kept in memory, so as soon as you close the Load Test Execution UI, the results are gone.

You might wonder why there is still an entry in the Test Results window and what effect importing/exporting the test result would have. For most result types, all of the data needed to display the result can be exported into a TRX file. This is not true for load tests. The only thing a TRX file stores for a load test is the connection string to the database with the results and the run ID of the run to load. So if you do not run the load test with storage type set to Database, exporting the TRX file is useless: it will contain no usable data for later analysis. ALWAYS use a database when running load tests.

Unable to EXPORT from Load Test Repository

If you create a custom LoadTest results repository (using a name other than "LoadTest"), the import/export functionality in VSTT 2008 will not work. This is a known issue and is fixed in VS 2010. The error will look like:

[V, 3160, 6, 2009/11/30 16:06:15.808] devenv.exe: StateMachine(AgentState): calling state handler for Online
[V, 3160, 7, 2009/11/30 16:06:15.813] devenv.exe: ControllerObject: RunQueueThread waiting for runs...
[I, 3160, 4, 2009/11/30 16:06:15.889] devenv.exe: WebLoadTestAdapter: Opened connection to results repository
[I, 3160, 4, 2009/11/30 16:06:16.426] devenv.exe: WebLoadTestAdapter: Closed connection to results repository
[V, 3160, 4, 2009/11/30 16:06:36.201] devenv.exe: WebLoadTestAdapter: LoadTestExporterImporter running: bcp "select * from LoadTest.dbo.LoadTestRun where LoadTestRunId in (278)" queryout "C:\Users\sellak\AppData\Local\Temp\3\LoadTestResults.7408358f-3580-4a38-9781-409d90c52a22\LoadTestRun.dat" -S AXPERFORMANCE -T -N
[V, 3160, 4, 2009/11/30 16:06:36.367] devenv.exe: WebLoadTestAdapter: bcp output: SQLState = 42S02, NativeError = 208
Error = [Microsoft][SQL Native Client][SQL Server]Invalid object name 'LoadTest.dbo.LoadTestRun'.

Notice that the call to LoadTest.dbo.LoadTestRun is hardcoded, which is what causes the feature to break. In general, we recommend you use the LoadTest database name (or, in the case of 2010, the LoadTest2010 database name).


Web Test TRX file and the NaN (Not a Number) Page Time entry

In VS 2008, if a Web test trx file is opened in an XML editor, you may notice a NaN page time for some of the responses:

<Response url="http://teamtestweb1/storecsvs/"
    contentType="text/html; charset=utf-8"
    statusLine="HTTP/1.1 200 OK"
    pageTime="NaN"
    time="0.006"
    statusCodeString="200 OK"
    contentLength="12609">

When/why does this happen? This only happens to non top-level requests, i.e. redirects and dependents.

At the end of Web test execution, all results (objects and their members) are serialized to a trx file, including the pageTime. NaN is the result of calling .ToString() on a float or double value that has not been initialized, meaning the pageTime was not known at the time this entry was written to the trx.

The following is a screenshot of the Web test result file opened in the Playback window. It shows how this property is set in the code. The highlighted entry is the top-level page. It is redirected, and the redirected-to page has some dependent requests. The 'Total Time' for the top-level page, i.e. the page time, is the time to send all requests and receive all responses (including the redirects and dependents) from the Web server. It is only calculated and populated for the primary request, not for the 'redirected to' page or the dependents. This is why you see a NaN page time in the XML file.
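If you post-process trx files yourself, the NaN entries parse cleanly as floating-point NaN and can be filtered out; a small sketch, with an XML fragment modeled on the example above:

```python
import math
import xml.etree.ElementTree as ET

# Read pageTime values out of trx Response elements, skipping the "NaN"
# entries that appear on redirects and dependent requests.
trx_fragment = """<Responses>
  <Response url="http://teamtestweb1/storecsvs/" pageTime="NaN" time="0.006" />
  <Response url="http://teamtestweb1/storecsvs/Default.aspx" pageTime="0.250" time="0.200" />
</Responses>"""

page_times = []
for resp in ET.fromstring(trx_fragment).iter("Response"):
    value = float(resp.get("pageTime"))  # float("NaN") parses to a NaN value
    if not math.isnan(value):            # only primary requests carry a real page time
        page_times.append(value)

print(page_times)  # [0.25]
```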


Proper understanding of TRX files and Test Results directory

TRX files are the result files created when you run a unit or web test in Visual Studio. There are two pieces here: the first describes how TRX files are constructed in VS 2008, and the second shows how things have changed for VS 2010.

In 2008

In VS 2008, if you run a Web test outside a load test, the entire Web test result is serialized to the trx file, so each request and response in the test is serialized. If the test runs multiple iterations, the trx file can get very large.

We added optimizations to control the amount of data stored in the TRX for request/response bodies by storing only one copy of each unique response body (in multi-iteration runs you may end up with multiple identical responses). Also, the request and response bodies are compressed to dramatically reduce the amount of space they require in the TRX.

There is a test context snapshot stored before every request (including dependent requests). Sometimes you'll find a really large VIEWSTATE in a test context that can make these snapshots very large. The request/response headers and the test context snapshots are not compressed, and duplicates are not eliminated, so they have the potential to become bloated.

In 2010

In VS 2010, there is one major change in how the WebTestResultDetails class is persisted upon test completion. Instead of writing the WebTestResultDetails class to a trx file, VS serializes the object to a *.webtestResult file. The relative path of this file is added as an element to the trx file ('relative' meaning relative to the path of the corresponding trx file).

The file only exists on the machine that you run the Web test from, i.e. the VS / mstest machine.

For a local run, the file goes to \TestResults\prefix_Timestamp\In\TestExecuId.
For a remote run, the file goes to \TestResults\prefix_Timestamp\In\Agent\TestExecuId.

When you open a Web test trx file from the Test Results window, VS reads the value of WebTestResultFilePath from the trx file, and then loads the .webtestResult from TrxDirectory\WebTestResultFilePath into the Web Test Result window.

Note about Data Collectors and TRX files

If you have data collector(s) turned on for a unit/Web test, the collector data (e.g. event log) goes to \TestResults\prefix_Timestamp\In\TestExecuId\Agent. For a load test, collector data goes to \TestResults\prefix_Timestamp\In\Agent.

Changed in 2010


Understanding the Response Size reported in web test runs

If you look at the size of a response shown for a single pass of a web test (within the test results window), it may differ from the size reported by tools such as Fiddler or Netmon. This is because VS measures the size of the response after it has been uncompressed, while Fiddler and Netmon look at the size of the response on the wire.

This behavior has been changed in SP1; HOWEVER, there are a couple of gotchas to be aware of:

The compressed size will only be reported in VS if the response is NOT using "chunked encoding".

The test results window will not indicate whether the reported size is the compressed or the uncompressed size.

VS has a receive buffer that defaults to 1,500,000 bytes, and it throws away anything over that. The number reported is what is saved in the buffer, not the number of bytes received. You can increase the size of this buffer by altering the ResponseBodyCaptureLimit at the start of your test. This needs to be done in code and cannot be modified in a declarative test.


Errors and Known Issues

--UPDATED-- CSV files created in VS will not work as data sources

PROBLEM:

If you create a CSV file in VS, it saves the file with a hidden byte prefix indicating the encoding type. When you select the file as a data source, the first column name will be prefixed with two unusual characters. The problem is the prefix bytes on the front of the file, which cannot be seen unless the file is viewed in hex format.

RESOLUTION:

The encoding choice for saving the CSV file is on the actual "Save" button when doing a "Save As". Choose US-ASCII. (NOTE: If you get a warning about problems with TFS Source Control, select OK.)
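If re-saving the file is not an option, the hidden prefix (a Unicode byte-order mark) can also be stripped programmatically; a sketch, where the file content below is simulated bytes standing in for your real data source:

```python
import codecs

# Simulated CSV content with a UTF-8 BOM on the front, as VS would save it.
raw = codecs.BOM_UTF8 + b"UserName,Password\r\nuser1,pass1\r\n"

# Strip whichever BOM (UTF-8 or UTF-16) is present, if any.
for bom in (codecs.BOM_UTF8, codecs.BOM_UTF16_LE, codecs.BOM_UTF16_BE):
    if raw.startswith(bom):
        raw = raw[len(bom):]
        break

print(raw.split(b",")[0])  # b'UserName' — no hidden prefix bytes on the first column
```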


Incorrect SQL field type can cause errors in web tests

If you create a SQL table to hold test parameters and you use the default SQL column type nchar(50), you will get failed requests, and the context parameters in the "request" tab of the test results will not show the bad parameters. The nchar field pads all entries to the specified length with hidden characters, but the "request" view in the test results does not show them. To see the extra characters, click the "View Raw Data" checkbox and look through the data until you see the hidden characters. This will indicate that the wrong SQL field type is being used.
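A minimal sketch of why the fixed-width column breaks the test, assuming (as nchar does) that values come back padded to the declared width:

```python
# What an nchar(50) column effectively returns for the value "user1":
stored = "user1".ljust(50)  # padded to 50 characters

print(stored == "user1")           # False — hidden trailing padding breaks the match
print(stored.rstrip() == "user1")  # True — the padding is only visible in "View Raw Data"
```

Using nvarchar(50) instead avoids the padding entirely.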

Leading zeroes dropped from datasource values bound to a CSV file

If you have a datasource containing values that start with the number 0, and the datasource is a CSV file, VSTT will strip the leading zero(es) from the values when using them. The same behavior does NOT occur for data values in a SQL datasource.

--UPDATED-- Recorded Think Times and paused web test recordings

When you are recording a web test, VS uses the time between steps as you record to generate the ThinkTime values after each request. When you add a comment, the recorder switches from RECORD mode to PAUSE mode; however, the timer used to calculate think times does not pause, so you end up with think times that include the time you spent typing the comment. The same is true if you manually pause the recording for any other reason. To fix this, go through the test after recording is complete and adjust the think times manually.

This issue no longer occurs in 2010.

Changed in 2010


After opening a webtest with the VS XML Editor, it will not open in declarative mode

In the VS IDE, you can right click a webtest file and choose "Open in XML Editor". Once you do that and then close the window, the next time you double click the webtest, the file should open in the default declarative view. However, in VS 2010 there is a known issue that causes the webtest to always be opened in XML mode.

To work around this issue:

1) Open the test project file (e.g. .csproj).
2) Look for the web test that is opened as XML.
3) Delete the line '<SubType>Designer</SubType>'.
4) Save the test project.

Example of the section needing to be changed:

---------------------------------------------------------------------

<None Include="WebTest1.webtest">

<CopyToOutputDirectory>Always</CopyToOutputDirectory>

<SubType>Designer</SubType>

</None>

---------------------------------------------------------------------
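The steps above can also be scripted; this is a hedged sketch that filters the offending line out of the project-file text, using the example fragment above as stand-in input:

```python
# Drop the <SubType>Designer</SubType> line from a .webtest item. In practice,
# read your real .csproj text in place of this fragment and write the result back.
csproj = """<None Include="WebTest1.webtest">
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  <SubType>Designer</SubType>
</None>"""

fixed = "\n".join(
    line for line in csproj.splitlines()
    if "<SubType>Designer</SubType>" not in line
)

print("<SubType>" in fixed)  # False
```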

Calls to HTTPS://Urs.Microsoft.Com show up in your script

If you record a script using IE7 and you have phishing protection enabled, you can get extra calls to Urs.Microsoft.Com. These calls are made by IE as part of its phishing filter (for more information, see http://download.microsoft.com/download/2/8/e/28e60dcc-123c-4b27-b397-1f6b2b6cb420/Part1_MM.pdf). You may either remove these calls, or disable phishing in IE before you record. To disable phishing, go to TOOLS -> PHISHING FILTER -> TURN OFF AUTOMATIC WEBSITE CHECKING.

Possible DESKTOP HEAP errors when driving command line unit tests

When you run a large number of unit tests that call command line apps, and they are run on a test rig (this does not happen when running tests locally), several of the tests may fail due to running out of desktop heap. You need to increase the amount of heap allocated to services and decrease the amount allocated to the interactive user. See the following post for in-depth information, and consider changing the registry value as shown below:

http://blogs.msdn.com/ntdebugging/archive/2007/01/04/desktop-heap-overview.aspx

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\SubSystems

OLD SETTING: "Windows SharedSection=1024,3072,512"
NEW SETTING: "Windows SharedSection=1024,1024,2460"

Applies only to 2010


Goal based load tests in VS 2008 do not work after applying SP1

There is a QFE available that fixes the following bugs with Goal Based Load Patterns that were introduced in VS 2008 SP1:

If you defined a goal based load pattern using a performance counter from any of the "LoadTest:*" categories, an error would occur and the user load would not be adjusted according to the goal.

If you defined a goal based load pattern using a "single instance" performance counter (for example Memory\Available Mbytes), an error would occur and the user load would not be adjusted according to the goal.

If the Machine Name property entered for the goal based performance counter did not exactly match the casing of the computer name, an error would occur and the user load would not be adjusted according to the goal.

The hotfix can be obtained from: http://support.microsoft.com/kb/957451

This is no longer an issue in VS 2010

Using Named Transactions in a Goal-Based Load Profile can cause errors

When keeping track of transactions in a test, VS appends a number in parentheses to the transaction name. This differentiates transactions that use the same name in different tests and keeps the collected metrics separate. Because of this, using a specific transaction name in a goal based load profile will most likely give you the error: "A LoadTestPlugin attempted to set the 'MinTargetValue' property of the load profile for Scenario Scenario1 of Load Test LoadTest1; this not allowed after the LoadTestLoadProfile has been assigned to the LoadProfile property of the LoadTestScenario." This is due to the naming convention described above. There is no way to use wildcards in the fields, so you would have to know the exact appended value.

Also, even if you know the value, you may see the error near the beginning of the run: the transaction may not have executed yet, so the instance to check may not yet exist.

Changed in 2010


Debugging Errors in Load Tests

http://blogs.msdn.com/slumley/pages/debugging-errors-in-load-test.aspx

Debugging OutOfMemory Exceptions in Load Tests

http://blogs.msdn.com/billbar/pages/diagnosing-outofmemoryexceptions-that-occur-when-running-load-tests.aspx

Memory leak on load test when using HTTPS

Problem: There is a memory leak in VS 2008 when running load tests that contain both HTTP and HTTPS requests.

Resolution: We've analyzed this memory leak and determined that it is a bug in the System.Net.HttpWebRequest class (used to issue Web test requests) that occurs when the Web test targets HTTPS Web sites. A workaround is to set the Load Test to use the "Connection Pool" connection model. This problem is fixed in VS 2010.


"Not Trusted" error when starting a load test

When you start a load test, you may get the following error:

"The location of the file or directory xxx is not trusted"

This can occur if you have signed code in your test harness and you make changes to some of the code without re-signing it. You can try either of the options below to resolve it:

OPTION 1:

1. In the .NET Framework 2.0 Configuration, go to Runtime Security Policy | Machine | All_Code
2. Right click All_Code, select "New...", and select any name for your new group. Click Next
3. Select URL as your condition
4. Type \\machine_name\shared_folder\assembly.dll or \\machine_name\shared_folder\* and click Next
5. Make sure permission is set to FullTrust
6. Click Next, and Finish
7. Close all your Visual Studio IDEs, restart, and try again

OPTION 2:

caspol -machine -addgroup 1 -url file:<location XXX>/* FullTrust -name FileW

This issue can also occur if you have a downloaded zip file (or other file) that is flagged as "Blocked" in its properties. You need to unblock it before use: right click the file and go to its properties.


Detail Logging may cause "Out of disk space" error

When you use the new VS 2010 feature "Save Log on Test Failure", you may get an "Out of disk space" error. Depending on the "Maximum Test Logs" setting and the size of the data for each iteration, the logs being saved can be very large (for instance, with a webtest that uploads and/or downloads large files).

Error details and stack traces no longer available in VS 2010

When a particular request encountered an error in VS 2008 while running a load test (with "Timing Details Storage" set to "All Individual Details"), you could go to the details of the error and see the information specific to that request. This option is no longer in VS 2010. It has been replaced by the new detailed logging feature that logs the entire Web test or unit test result for a failed virtual user iteration.

VS does not appear to be using more than one processor

If you are running a load test on a multi-processor machine but notice that only one processor is being used, this is because you are running the test as "<Local – no controller>". VS will only use multiple processors on an Agent/Controller setup. This is by design, due to licensing considerations. To take advantage of multi-proc systems, use an agent and controller setup. It is possible to set up the controller and agent on the same machine as VS.

While the limitation still exists in the 2010 product, you can unlock all processors by installing a vUser license on the local machine. See "New Load Test and Load Test Rig Licensing and configurations" for more information.

Changes made to Web Test Plugins may not show up properly

If you have a plugin that is part of the same project as a declarative web test, and you make changes in the plugin, you may not always see those changes reflected in the test run. For instance, if you have a plugin that writes a certain string out to an event log, and you change the string in the plugin, you still see the old string value in the event log. This is a known issue and may be fixed in VSTT 2008 SP1 (it is not in the beta release of SP1). For the bug to appear, the following conditions must be met:

You must be running on a Controller/Agent test rig
Your web test must be declarative (the bug does not occur with coded web tests)
You must have a "Test Results" folder in the root of your solution folder

If you are experiencing the bug, you can work around it by:

Generating a coded web test
Renaming or deleting the "Test Results" folder
Changing the test project's location for the "Test Results" folder

New to 2010

New to 2010


--UPDATED-- Socket errors or "Service Unavailable" errors when running a load test

When running a load test, you might receive several errors similar to:

Exception SocketException Only one usage of

each socket address (protocol/network address/port) is normally

permitted

HttpError 503 - ServiceUnavailable 503 -

ServiceUnavailable

These are often due to exhaustion of available connection ports either on the VS machine(s) or on the

machines under test. To see if this could be happening, open a CMD window on your VS machine(s) and

on the machine(s) under test, and run the following command:

"netstat –anp tcp"

If you see lots of connections in a TIME_WAIT state, then you can be suffering from port exhaustion.

More Information: TCP establishes connections based on the following items:

Client port + Client IP = Client Socket

Server Port + Server IP = Server Socket

Client Socket + Server Socket = connection

The TIME_WAIT state is a throwback from the old days (well, more accurately, the default of 4 minutes is the throwback). When a connection is closed, the side that initiated the close holds its socket in a TIME_WAIT state so that any stray packets still in flight from the old connection die off before the same address/port pair can be reused. The default of 4 minutes dates from a time when networks were very slow and delayed packets could linger for a long time.

To get around this issue, you need to make more connections available and/or decrease the amount of

time that a connection is kept in TIME_WAIT. In the machine's registry, open the following key and

either add or modify the values for the two keys shown:

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters

]

"TcpTimedWaitDelay"=dword:0000001e (30 seconds)

"MaxUserPort"=dword:0000fffe (65,535 ports)

If you are experiencing the issue on one of the VSTT load machines, you may also need to change the

load test connection model to "Connection Pooling" and increase the pool size considerably.
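If you prefer to script the registry change, the same values can be applied from an elevated command prompt. This is only a sketch mirroring the entries above (a reboot is typically required before the new values take effect):

```
:: Run from an elevated command prompt; reboot afterwards.
reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v TcpTimedWaitDelay /t REG_DWORD /d 30 /f
reg add HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters /v MaxUserPort /t REG_DWORD /d 65534 /f
```

Note that 0x1e is 30 decimal (seconds) and 0xfffe is 65,534 decimal (the maximum allowed for MaxUserPort).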


Error "Failed to load results from the load test results store"

"Unable to cast object of type 'System.DBNull' to type 'System.Byte[]'" error when trying to retrieve load

test results from the DB inside VS.

This error will occur if you get a NULL value in the LoadTest column of the LoadTestRun table. To fix it,

go to the table and delete the row that has the NULL value. The occurrence of this issue should be

extremely rare.

Hidden Field extraction rules do not handle some fields

Some responses may contain hidden fields whose formats have an extra character that causes the build

in "Extract Hidden Fields" rule to not find the value. Consider the following entry in a response:

<input type="hidden" name="_ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-

8224954c1d9a}" id="_ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-8224954c1d9a}"

value="1" />

A default Hidden extraction rule was added to the request. When the rule fired, the result was:

$HIDDEN2._ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-8224954c1d9a

It should have been

$HIDDEN2._ListSchemaVersion_{9fcdfcc2-6d4f-4a22-a379-8224954c1d9a}

This is not a bug, but just a side effect of how VS processes context parameters.
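As a workaround, a web test plugin can duplicate any captured context parameter whose name was truncated at the closing brace. This is only a sketch, under the assumption that the only damage is the missing trailing "}":

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: restore the trailing "}" on hidden-field context parameters
// captured by the built-in extraction rule.
public class FixTruncatedHiddenFields : WebTestPlugin
{
    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        var context = e.WebTest.Context;
        // Copy the keys first, since we modify the collection in the loop.
        foreach (string key in new List<string>(context.Keys))
        {
            if (key.Contains("{") && !key.EndsWith("}"))
            {
                // Add a copy under the name the form post will look up.
                context[key + "}"] = context[key];
            }
        }
    }
}
```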

Test results iteration count may be higher than the max test iterations set

When a test run that defines a specific number of test iterations is complete, you may see more tests

run than the iterations set in the run properties. This is rare and is caused by the load test process

crashing and restarting. This issue exists in VS 2008 and VS 2010. The reason for this is that the Restart file

we use to handle restarting a load test after QTAgent dies was never updated to include info about the

tests completed, so it will always run the initial number of test iterations after restart.

Resolution:

Find out what is causing QTAgent to crash and fix that issue.


In flight test iterations may not get reported

When a load test stops, there will be "in-flight" tests and requests. The load test engine gives all in-flight requests 10 seconds to stop. If a request doesn't finish in that time, it is killed and its request detail and test detail are not recorded.

A way to control this is to specify a Cool-Down period of 10 minutes in the Load Test's run

settings. Assuming that the requests in your Web test have the default request timeout of 5 minutes,

all in-flight requests at the time the load test completes should either finish or be timed out in

5 minutes and then the in-flight tests should be displayed in the User Details Test chart.

Completion of Unit Test causes spawned CMD processes to terminate

If you spawn a process from within a web test, and then that process spawns a separate CMD window

(using the "START" command), the second CMD window should be totally independent of the test. If this

method is used for Unit tests or for Windows applications, it will work as expected. However, web tests

will kill the spawned process. Here is an excerpt from an email thread with the product team:

Here's what I've discovered. There is an option in VSTT that allows you to keep VSTestHost alive after a

test run completes: go to "Tools", "Options", "Test Tools", "Test Execution" and see the check box "Keep

test execution engine running between test runs". This is on by default, and I'm guessing it is on for

you. When you run just a unit test in a test run, this option works and VSTestHost does not get killed

when the test run completes, so neither does its child processes. However, when you run a Web test, this

option seems to be ignored and VSTestHost is killed by a call to Process.Kill() which I believe does kill the

child processes of VSTestHost as well (if you uncheck this option, you'll see that running the unit test has

the same behavior). I'm not sure why VSTestHost goes away even when this option is set when a Web

test is run – this may have been intentional. Here's a workaround that seems to work instead:

create a unit test that sleeps for 10 seconds (or whatever time is needed)

create an ordered test that includes your coded Web test first then the unit test that sleeps

run the ordered test rather than the coded Web test

NOTE: an example of this scenario is firing off a batch file that starts a NETCAP.EXE window to gather

trace data during the test run. This NETCAP process must run asynchronously so it will not block the web

test. It must also complete by itself or the resultant trace file will not get written.

Web tests should not be starting other processes, or performing any blocking operations as they will

cause problems with the load test engine. For the netcap example, a better solution is to write this as a

VS2010 data collector.


Bug with LoadProfile.Copy() method when used in custom goal based load tests

If you create a custom goal based load test plugin and use the LoadProfile method Copy(), you will get

an error saying: "A LoadTestPlugin attempted to set the 'MinTargetValue' property of the load profile for

Scenario Scenario1 of Load Test LoadTest1; this not allowed after the LoadTestLoadProfile has been

assigned to the LoadProfile property of the LoadTestScenario." This is due to a regression in hotfix

957451. There is currently no fix for this, however there is a workaround. You need to create your own

copy method and use it to populate the custom LoadProfile. Make sure that you do NOT set the

"ScenarioName" value since this is where the bug lies. Here is some sample code:

The Copy() method usage that will fail:

LoadTestGoalBasedLoadProfile newGoalProfile =
    _scenario.LoadProfile.Copy() as LoadTestGoalBasedLoadProfile;

The custom method to replace the Copy() method:

private LoadTestGoalBasedLoadProfile ProfileCopy(LoadTestGoalBasedLoadProfile _profile)

{

LoadTestGoalBasedLoadProfile _goalLoadProfile = new

LoadTestGoalBasedLoadProfile();

_goalLoadProfile.CategoryName = _profile.CategoryName;

_goalLoadProfile.CounterName = _profile.CounterName;

_goalLoadProfile.InstanceName = _profile.InstanceName;

_goalLoadProfile.InitialUserCount = _profile.InitialUserCount;

_goalLoadProfile.MinUserCount = _profile.MinUserCount;

_goalLoadProfile.MaxUserCount = _profile.MaxUserCount;

_goalLoadProfile.MaxUserCountIncrease = _profile.MaxUserCountIncrease;

_goalLoadProfile.MaxUserCountDecrease = _profile.MaxUserCountDecrease;

_goalLoadProfile.MinTargetValue = _profile.MinTargetValue;

_goalLoadProfile.MaxTargetValue = _profile.MaxTargetValue;

return _goalLoadProfile;

}

Using the method in your Heartbeat event handler:

void _loadTest_Heartbeat(object sender, HeartbeatEventArgs e)

{

// Make a private instance of the profile to edit

LoadTestGoalBasedLoadProfile _goalLoadProfile =

ProfileCopy((LoadTestGoalBasedLoadProfile)_scenario.LoadProfile);

// Do your modifications to the private copy of the profile here

// [some code]

// Assign your private profile back to the test profile

_scenario.LoadProfile = _goalLoadProfile;

}

}


Errors in dependent requests in a Load Test do not show up in the details test log

The new Detailed Test Logging feature will not allow you to see the details of errors that occur inside dependent requests during a load test (like AJAX or JSON requests).

The problem is that even though a test whose dependent request fails is flagged as failed and the log for that iteration is stored, the log does not contain any details for dependent requests. Therefore you do not get any details about why the failure occurred.

To work around this issue, you need to make sure any dependent requests that are having

problems get moved back up to main requests, at least during a test debugging phase.

Web Test execution shows the failure

Load Test execution shows that there is a failure


The errors table in the results shows the exception count and allows you to drill into the details. The picture below shows you how to display the full details log for this failed iteration.

Here you see the details log. It shows that there is a failure, but the request details do not show where the error occurred, nor can you get any details about the error.


WCF service load test gets time-outs after 10 requests

If you encounter time-outs when running a load test against a WCF service that uses message-level

security, this could be caused by the WCF service running out of security sessions. The maximum

number of simultaneous security sessions is a WCF configuration setting with a default value of 10. Any

additional requests to the service that would lead to more security sessions will be queued.

If you want the service to support more than 10 simultaneous clients, you will need to change it in the WCF

configuration setting. Another reason you might run out of security sessions is when the client isn't

properly closing those sessions after it is done with the service.

A WCF security session is established by a security handshake between client and service in which

asymmetric encryption is used to establish a symmetric encryption key for additional requests in the

same session. The initial asymmetric encryption is more computationally expensive than the symmetric

encryption that is used for subsequent requests. A client must explicitly close the security session to

release server resources or they will only be released by the server after a time-out in the order of

minutes.

If the client only needs to call the web service once, the message exchange with the symmetric key is

unnecessary and you can save a roundtrip by disabling security sessions. Set the

'establishSecurityContext' to false in the app.config of the client. This can also serve as a workaround for

clients that do not properly close the session, but do keep in mind that this will skew your performance

results. So only use this workaround while you fix the client.

For more details on secure sessions and the 'establishSecurityContext' property see

http://msdn.microsoft.com/en-us/library/ms731107.aspx
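A minimal sketch of the client-side app.config change follows; the binding, endpoint address, and contract names are placeholders, and only the 'establishSecurityContext' attribute is the point here:

```xml
<!-- app.config (client) - sketch; names and addresses are placeholders -->
<system.serviceModel>
  <bindings>
    <wsHttpBinding>
      <binding name="NoSecureSession">
        <security mode="Message">
          <!-- Disable the security session / symmetric-key handshake -->
          <message establishSecurityContext="false" />
        </security>
      </binding>
    </wsHttpBinding>
  </bindings>
  <client>
    <endpoint address="http://example.com/MyService"
              binding="wsHttpBinding"
              bindingConfiguration="NoSecureSession"
              contract="IMyService" />
  </client>
</system.serviceModel>
```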

Loadtestitemresults.dat size runs into GBs

During a load test, the load agents write to a file called LoadTestItemResults.dat. If you are planning

to execute a long-running load test, be sure that the LoadTestItemResults.dat file will be on

a drive with enough disk space, because it can grow to many GBs.

The LoadTestItemResults.dat file is created by the QTAgent or QTAgent32 process. You should add the

key WorkingDirectory to QTAgent.exe.config and/or QTAgent32.exe.config to point to the right drive.

For example, add <add key="WorkingDirectory" value="D:\Logs"/> to the appSettings section.

For Visual Studio 2010, see http://blogs.msdn.com/lkruger/archive/2009/06/08/visual-studio-team-test-

load-agent-goes-64-bit.aspx for more information about when QTAgent.exe or QTAgent32.exe is used.
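In context, the QTAgent.exe.config (or QTAgent32.exe.config) change looks like this sketch; the D:\Logs path is just an example, and any existing sections of the file should be left intact:

```xml
<!-- QTAgent.exe.config - sketch -->
<configuration>
  <appSettings>
    <!-- Redirect LoadTestItemResults.dat to a drive with free space -->
    <add key="WorkingDirectory" value="D:\Logs"/>
  </appSettings>
</configuration>
```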


Content-Length=0 Header not sent resulting in HTTP 411 Length Required Error

You may run into an issue where the web request is failing with an HTTP 411 Length Required

response. This is in a Post Request with no body. This will not always occur as some web servers may

ignore the missing header. However, RFC 2616 specifies that even with a content length of

zero, the header should still be sent (http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html).

Visual Studio uses its own header collection class to allow for a single collection per request. This makes

the code more efficient. The internal method used to build this collection first removes all headers that

are restricted by the System.Net.HttpWebRequest class (http://msdn.microsoft.com/en-

us/library/system.net.httpwebrequest.headers.aspx) and then adds back the appropriate headers.

However, the internal code does not add a Content-Length header when the length is zero. Also, VS does not

allow you to directly set any headers that are controlled by the system (such as content-type and

content-length).

To work around this issue, add a dummy body to your request. Here is an example:

WebTestRequest request1 = new WebTestRequest("http://localhost/");

request1.Method = "POST";

request1.Encoding = System.Text.Encoding.GetEncoding("utf-8");

request1.Headers.Add(new WebTestRequestHeader("Content-Length", "0"));

StringHttpBody request1Body = new StringHttpBody();

request1Body.ContentType = "";

request1Body.InsertByteOrderMark = false;

request1Body.BodyString = "";

request1.Body = request1Body;

yield return request1;

request1 = null;


Error that test could not run because the network emulation is required

You may receive the following error when trying to start a load test:

Warning 5/25/2010 4:58:53 PM Could not run load test 'LoadTest1' on agent 'PRITAMB1':

Network emulation is required for load test 'LoadTest1' but the driver is not installed on agent

PRITAMB1.

Network emulation is required for load test 'LoadTest1' but the driver is not installed on agent

PRITAMB1. PRITAMB1

This is most likely caused by the fact that the network emulation drivers did not successfully install

during the VS setup. There are two methods you can try to resolve this issue:

1) Re-configure agent using the GUI inside Visual Studio

2) Follow these steps:

a. Launch VS with administrator privilege and create a default C# test project.
b. Open Local.testsettings in Solution Explorer.
c. Select "Data and Diagnostics", enable "Network Emulation" and press "Configure".
d. In the Network Emulation Detail dialog, select a network profile other than "LAN", like "3G", then press OK.
e. The Network Emulation driver will be installed after a short network disconnection.

NOTE: If you just install VS and not the remote agent, the Network Emulation driver is not installed. You

must run the command "VSTestConfig NETWORKEMULATION /install" from an elevated VS Command

Prompt. This will install the driver so that you can use it from VS.

Error/Crash in "Open and Manage Load Test Results" dialog

If you enter a controller name in the "Manage Load Test Results" dialog that does NOT have a trailing

'>', Visual Studio will show an error "Failed to load results from the load test results store. Invalid URL:

The hostname could not be parsed" and the DEVENV.EXE process might crash. This is a known issue in

2010 RTM and will possibly be fixed in a future version or service pack.


Applies only to 2010


Calls to CaptchaGenerator.aspx fail during playback

Any calls to a page that uses image security to verify if a human or a machine is interacting will fail. This

is due to the fact that Captcha (and other image generator security products) use image files to show

the text string that should be passed back. Visual Studio has no way to parse the image for the correct

value. The only ways to avoid this are to:

1) Turn off the security at the server

2) If the security program has a "generic" key that will always work, pass that key back to the

server.

Request failure with improperly encoded query strings calling SharePoint 2010

When testing a site built on SharePoint 2010, requests may fail. When running this in a 2010 Web Test,

the query string is not encoded at all and the request fails:

1. POST to /global/pages/search.aspx
   a. Response – HTTP 302 with location header: /global/pages/Search.aspx?k=ALL(Developer OR Support Engineer)AND(prname="Engineering" OR prname="ITOperations")AND(lvl=59 OR lvl=60 OR lvl=61 OR lvl=62)
2. GET to /global/pages/Search.aspx?k=ALL(Developer OR SUPPORT ENGINEER)AND(PRNAME="ENGINEERING" OR PRNAME="ITOPERATIONS")AND(LVL=59 OR LVL=60 OR LVL=61 OR LVL=62) HTTP/1.1
   a. Response – HTTP 400 Bad Request
   b. Fiddler only shows the request as /global/pages/Search.aspx?k=ALL(Developer
   c. VS is set to follow redirects on the initial POST, so this request was automatic

Resolution:

Visual Studio now has a property on requests called EncodeRedirectedUrl. Set this to true and it should

work as expected. This is not available in the UI, so you either need a plugin or a coded test to set it.
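For example, a web test plugin along these lines can turn the property on for every request. This is only a sketch; verify the property name against your VS version:

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch: enable URL encoding of redirect locations for every request.
public class EncodeRedirectedUrlPlugin : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        e.Request.EncodeRedirectedUrl = true;
    }
}
```

Attach the plugin to the web test (right-click the test and choose Add Web Test Plug-in), or set the property directly on each request in a coded test.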

Network Emulation does not work in any mode other than LAN

Let's say you have two NIC cards and two IP addresses assigned on the Agent machine. One is used to

communicate with the controller (intranet) and the other to communicate with an external web site (extranet).

The problem is that when Network Emulation is enabled (DSL, etc.), the load test code picks the wrong IP

address and sets it as the source IP address, so all requests fail with a "501 Network

Unreachable" socket exception.

Applies only to 2010



As you may already know, for Network Emulation, the load test has to specify a port number from a port

range that is set for the network type to be emulated. Unfortunately, it also has to specify a source

IP address in the .Net call (HTTPRequest.ServicePoint.BindIPEndPointDelegate), and it assumes the first

IP address that is returned by System.Net.Dns.GetHostAddresses is the correct one. In this case, we are

getting the intranet IP address first and ending up binding HTTP requests to it.

The solution that worked is to enable IP Switching, and specify an IP address range that consists of one

IP address that is equal to the one that works. (To set this, open Test -> Manage Test Controllers in VS,

click Properties for the Agent machine, and fill in the appropriate fields.)

This will enable the load test to use the correct IP address when communicating with the web site.

Error that Browser Extensions are disabled when recording a web test

You might see the following error when trying to record a web test:

To fix this, go to "Tools" -> "Internet Options" and set the following:


Error: Request failed: No connection could be made because the target machine

actively refused it

Proxy and Forefront (antivirus) are creating issues here. There are lots of 504 gateway errors and

errors caused by Forefront, e.g.:

The number of HTTP requests per minute exceeded the configured limit. Contact your Forefront TMG

administrator

MaxConnection value in App.Config is not honored when running a load test

If you have a unit test that reads an App.Config file, and you set a maxconnection value in that config,

Visual Studio will ignore that value and default to a connection max of 100. Here is what happens:

<?xml version="1.0" encoding="utf-8" ?>

<configuration>

<system.net>

<connectionManagement>

<add address="*" maxconnection="1500"/>

</connectionManagement>

</system.net>

</configuration>

Here is my sample test that writes the value of maximum connections to a file –

[TestMethod]

public void TestMethod1()

{

File.WriteAllText("c:\\out.txt", "The current connection limit is "

+ System.Net.ServicePointManager.DefaultConnectionLimit.ToString());

}

When run in a single user test, I see the following output –

The current connection limit is 1500

When run in a load test with 1 iteration, I see the following output –

The current connection limit is 100

The load test code does set the DefaultConnectionLimit to 100; otherwise it defaults to something very

low, so the load test code is overriding the config setting. If you write a line of code anywhere in your

unit test (like the TestInitialize or ClassInitialize method) to set the DefaultConnectionLimit explicitly,

that should override the load test setting, since the load test sets its value before running any unit test code.
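A sketch of the explicit override, placed in a ClassInitialize method so it runs after the load test engine has already applied its default of 100 (the value 1500 here simply mirrors the app.config example above):

```csharp
using System.IO;
using System.Net;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ConnectionLimitTests
{
    [ClassInitialize]
    public static void Init(TestContext context)
    {
        // Explicitly set the limit; this overrides the load test
        // engine's default of 100 because it runs afterwards.
        ServicePointManager.DefaultConnectionLimit = 1500;
    }

    [TestMethod]
    public void TestMethod1()
    {
        File.WriteAllText(@"c:\out.txt",
            "The current connection limit is "
            + ServicePointManager.DefaultConnectionLimit);
    }
}
```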

--NEW-- Cannot change the "Content-Type" header value in a webtest

Although the "Content-Type" header is exposed for certain requests, and even if you use a plugin or rule

to set the "Content-Type" header, Visual Studio will overwrite the value with its internal value. There is

currently no workaround for this.


--NEW-- VS does not expose a method for removing specific cookies from requests.

The cookie collection that Visual Studio sends with requests is dynamically built based on the cookies

sent back from the server. There is a method that allows you to add a new cookie to the collection, but

there is no exposed way to delete an existing cookie.

Further, the collection itself is marked as read-only, so if you create a new cookie collection using

System.Net and then assign it to the request's cookie collection, the request will fail to execute.

--NEW-- File Upload" feature in VS does not allow you to use a stream to send file

If you want to use a stream to upload a file as a form post item, Visual Studio 2010 provides a form post

item to easily do this. You can also create a plugin to do the same thing. However, neither method

provides a way to get the file contents from a stream. The ONLY method for getting the file is by reading

a static file on the drive. This is not a limitation of the testing functionality of Visual Studio, but a

limitation in the "File" class in the .NET Framework.

--NEW--Unit tests that consume assemblies requiring MTA will fail with default test

settings

If you create a unit test that consumes an assembly that runs in an MTA, the test will most likely hang,

or fail with an error similar to:

Test method SimpleLoadTest.SyncFullJobTest.Test_SyncFullJobTest_WithNoError threw

exception: System.InvalidCastException: Unable to cast COM object of type

'System.__ComObject' to interface type 'ISyncKnowledge2'. This operation failed

because the QueryInterface call on the COM component for the interface with IID

'{ED0ADDC0-XXXX-XXXX-XXXX-45661D2XXXXX}' failed due to the following error: No such

interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE))..

A good example of this is using the SQL SERVER SYNC FRAMEWORK in a unit test. This is due to the fact

that Visual Studio testing defaults to STA for test execution. To fix this, you must manually edit the

".testsettings" file (".testrunconfig" in 2008) being used to execute the test(s). Add the line

<ExecutionThread apartmentState="1" />

to the execution settings section of the file. Full details are at http://msdn.microsoft.com/en-

us/library/ms404663.aspx
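As a sketch of the 2008-style file, assuming the element sits directly under the root (element placement may vary slightly between file formats; see the linked article for the exact location):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<TestRunConfiguration name="Local Test Run"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2006">
  <!-- Run tests in an MTA instead of the default STA -->
  <ExecutionThread apartmentState="1" />
</TestRunConfiguration>
```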

--NEW--MSTest tests that consume assemblies requiring MTA will fail with default

test settings

This is the same issue as the previous article, and more information can be found at this link.

http://blogs.msdn.com/b/ploeh/archive/2007/10/21/runningmstestinanmta.aspx


--NEW--Load Test Agent Error "Failed to open the Visual Studio v10.0 registry key"

PROBLEM:

Got a machine with VS Ultimate 2010 and VS 2010 Controller installed. All appear to be good. I go to the

next machine and try to install the agent. The install program runs fine and completes. Then it gives me

the configuration wizard. No matter what I choose, the “Apply Settings” step fails with:

Further, the service does not show up in the list of installed services.

RESOLUTION:

When installing the agent software, there are 2 components:

1) Agent software
2) VS Performance Tools (for development debugging/profiling/etc.)

The error will occur if you only install the Agent software and do not install the Performance tools. The

default install DOES install both items.

--NEW--"Agent to use" property of a Load Test not acting as expected

PROBLEM:

I have one controller which has 4 agents assigned to it. I want to run 2 load tests on this controller at the

same time. I tried using the “Agents To Use” property of scenario in load test to control each load test to

use two separate agents. But for some reason, the second load test run goes into the queue and only

starts after the first load test run is complete, even though 2 agents are shown as “Ready”.

RESOLUTION:

Can you try filtering agents for a test run by using agent selection rule in test setting? You can specify

attribute (name / value pair) for agents. Please check here for details: http://msdn.microsoft.com/en-

us/library/dd695837.aspx

Then you can use attributes in test setting to filter agents for your test run. Please check here for details:

http://msdn.microsoft.com/en-us/library/ff426235.aspx (Step 16). You will have to use 2 different test

settings for your 2 load test runs so that each run happens on a different set of agents. This will definitely

work.

NOTE: This issue is under investigation by the product team.


--NEW--Fiddler 2 not seeing application traffic

PROBLEM:

I’ve used Fiddler2 to record the HTTP traffic of many rich clients. It has never failed me – until

recently. My prior customer has an application that they said used HTTP. Fiddler would NOT see the

traffic. NetMon confirmed port 80, HTTP traffic. Other tools (that wouldn’t save to VSTS web tests)

would see it too. What’s going on? I looked at the Fiddler FAQ and tried the suggestions without any

success.

RESOLUTION:

It was pretty clear that the application was avoiding the proxy but how? The answer was in the

application’s .CONFIG file. There was an entry that was set that bypassed the proxy. If Fiddler isn’t

seeing the traffic, you’ve confirmed that traffic exists, and none of the items mentioned in the Fiddler

FAQ appear to be the cause, check all of the configuration files for the application (there can be several)

for an entry bypassing the default proxy.

--NEW--BUG: Microsoft.VisualStudio.TestTools.WebStress.LoadTestResultsCollector

PROBLEM:

I’m running a goal based load test. The goal is set to use a custom performance counter which records a

number of operations/sec from the internals of my customer's code. I set up the load test using the

correct category name, counter name, a blank for the instance name and the appropriate machine

name. The issue is related to the counter being only defined on a remote machine. When I ran the load

test I received the following error…

LoadTestGoalCounterNotFoundException

A goal based load pattern can not be applied because it specifies a

performance counter (UKGC08\THGWebTestHarness: Availability

Searches\Operations Executed / sec\) that could not be read: Category does not

exist.

The specific counter my customer is using does not exist on the client machine (where Visual Studio is

installed) and nor does it exist on the Controller. My workaround for the present is to install the

counters on the client machine and the controller, which gets me past the exception so I can test their

code without issue.

This is a known issue and is being investigated by the product team.


--NEW--ASP.NET Profiling sometimes does not report after a run.

PROBLEM:

I created a simple web performance test and then a load test using the Asp.Net MVC app Tailspin Toys. I

occasionally get the View Profiler Report icon to show up. I have my testing settings set to “local”. I have

edited the “local” test settings to enable Asp.Net Profiler with Tier Interaction. Is there a reason why I

need to run the load test with Asp.Net Profiling (with Tier Interaction) twice, back to back, before the icon

lights up on the load test results page?

RESOLUTION:

For some reason (still being investigated) the ASP.Net profiler shutdown message is not getting recorded

and the data collector seems to be getting into an unstable state. Typically on the IIS machine, in the

Event Viewer Application log you’ll see an entry from the profiler for initialization and shutdown.

However, there are times when the shutdown message is never received after the end of the load test.

It does however log a shutdown message AFTER an IIS reset.

Each time the profiler gets into an odd state, restarting the web server helps.

--NEW--System.InvalidCastException: Unable to cast COM object

PROBLEM:

While running single user load tests, we would see a failure rate of about 75%. This occurred while

running a Unit test that called code where worker threads were being created in the customer code. The

failures showed the following error:

Test method SimpleLoadTest.SyncFullJobTest.Test_SyncFullJobTest_WithNoError threw exception: System.InvalidCastException: Unable to cast COM object of type 'System.__ComObject' to interface type 'ISyncKnowledge2'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{ED0ADDC0-3B4B-46A1-9A45-45661D2114C8}' failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE)).

RESOLUTION:

The resolution for the failures is in http://blogs.msdn.com/b/ploeh/archive/2007/10/21/runningmstestinanmta.aspx. The issue is that the customer code needs to run in an MTA configuration, but Visual Studio defaults to STA. The article shows how to modify a .testrunconfig file to allow the unit test to run in MTA.
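As a sketch of the change that article describes (element and attribute names from memory, so verify against your own generated file), the edit amounts to adding an ExecutionThread element to the .testrunconfig:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative fragment of a .testrunconfig file; other elements
     generated by Visual Studio are omitted here. -->
<TestRunConfiguration name="Local Test Run"
    xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2006">
  <!-- Forces unit tests to run in a multithreaded apartment (MTA)
       instead of the default STA -->
  <ExecutionThread apartmentState="MTA" />
</TestRunConfiguration>
```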


--NEW-- Assembly could not be loaded and will be ignored.

"Visual Studio Project Sample Loading Error: Assembly could not be loaded and will be ignored. Could not load file or assembly or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)"

This can happen with a file extracted from a zip that came from another computer. To fix this, unblock the file in the file's Properties dialog.

MORE INFORMATION

http://blogs.msdn.com/b/brada/archive/2009/12/11/visual-studio-project-sample-loading-error-assembly-could-not-be-loaded-and-will-be-ignored-could-not-load-file-or-assembly-or-one-of-its-dependencies-operation-is-not-supported-exception-from-hresult-0x80131515.aspx

--NEW--Issue with webtest login when getting 307 Temporary Redirect

PROBLEM:

I recorded a simple login webtest. The very first request hits the main login page. The site requires users to authenticate against a proxy server before hitting the page. The proxy uses NTLM.

When I recorded the test, the first call went straight to the main page, with the typical 401 and NTLM renegotiation happening behind the scenes. This is also how IE handles the connection (verified through Wireshark).

When I playback the test, the first request gets a 307 Temporary Redirect, and this translates into about 5 more 307 Temporary Redirect calls as the webtest gets authenticated.


The problem is that the redirects can take up to 30-40 seconds for some reason. This causes the first request to take over a minute to complete. It is successful, but this timeframe is too long for a valid test.

RESOLUTION:

Force a pre-authentication to the proxy server in the webtest:

http://blogs.msdn.com/b/rogeorge/archive/2009/06/23/how-to-authenticate-to-a-proxy-server-within-a-visual-studio-webtest-using-visual-studio-2008-sp1.aspx

--NEW--Data bound validation rule fails when set at the TEST level

If you create a validation rule at the WebTest level and then bind a context parameter to it, the validation rule will not resolve the context parameter and will therefore fail. If, however, you move the exact same rule to the request level, it will work.


--NEW--"Could not read result repository"

PROBLEM:

If you have your test's Storage Type set to None, but Timing Details Storage set to anything other than None, you'll get the following error message.

RESOLUTION:

Either change the Storage Type setting or the Timing Details Storage setting.

--NEW--Page response time counters disappear after test is completed

PROBLEM:

As my test is progressing, I can see the page response time being updated in the graph as well as in the table. Yet once the test is done, the counters disappear, the values disappear, and under Tables > Transaction I cannot see anything.

During The Run


After The Run

RESOLUTION:

This is actually a known issue with the Dev10 (VS 2010) RTM build and is fixed in SP1.

As a workaround, in case you don't want to move to SP1, you can reopen the result from the database using "Open and manage results" in the load test editor.

--NEW--WebTestContext.Clear() not clearing cookies

The WebTestContext.Clear() method clears all context information, but will not clear the cookie container. For more information, see the following blog post: http://social.msdn.microsoft.com/Forums/en/vstswebtest/thread/7dfefda3-14ea-4529-b541-79ef09ee130a
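As a hedged sketch of the workaround discussed in that thread (assuming WebTestContext.CookieContainer is settable in your version of the API), a WebTest plugin can discard cookies explicitly by replacing the container:

```csharp
using System.Net;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Sketch only: resets both context parameters and cookies before each iteration.
public class ResetStatePlugin : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        // Clear() removes context parameters but leaves the cookie container intact
        e.WebTest.Context.Clear();

        // Replacing the container is what actually discards the cookies
        e.WebTest.Context.CookieContainer = new CookieContainer();
    }
}
```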

--NEW--SQL Tracing error "Could not stop SQL tracing"

PROBLEM:

I've enabled SQL tracing in my load test; after the test completes, the trace is available, but I am seeing this error:


A trace file was in fact successfully created, with a file size of 102400 KB. On opening it in SQL Profiler I could see the trace. What I think happened is that the maximum file size was reached and tracing stopped.

It appears that a maximum file size may be set, together with a rollover option, when creating a new trace. I assume Visual Studio sets the maximum file size to 102400 KB with no rollover enabled. So I would like to know if there is any way of configuring the maximum file size and rollover for the SQL trace integrated into load tests. Otherwise, the SQL trace integration is not going to work for real-world testing.

RESOLUTION:

Currently there is no resolution. It has been added to the feature request list for the product team, and there is currently no estimate of when it will be addressed.

--NEW--LoadTestCounterCategoryNotFoundException

PROBLEM:

Hi, thanks for the replies. I've checked the list you sent and things seem to be OK; however, I'm unable to access the perf counters from the controller. For example, if I try to add perf counters on that Win7 box I get:


RESOLUTION:

Hi, I found the answer in this article: http://blogs.msdn.com/b/edglas/archive/2008/11/19/reading-performance-counters-on-vista-and-server-2008-machines.aspx

Basically, my issue was that the Remote Registry service is set to Manual in Win7 by default but set to Automatic in WinXP, so I enabled the service and things started working (my test agent is a local admin, so none of the other security settings were needed in my case).


In Depth Blog Posts on Debugging and Troubleshooting

Web Test Authoring and Debugging Techniques

This blog post is taken directly from Ed Glas' blog site and is replicated here in full. Some of the information in here also resides in the "New to Visual Studio 2010" section.

With each release of VS we have made major strides in Web test authoring and debugging. With VS 2008, we added a number of features to address the most common challenges with Web test authoring, the most important being a low-level HTTP recorder and an automatic correlation tool. This covered the most prevalent challenges outlined in Web Test Authoring and Debugging Techniques. Again with VS 2010 we have made major strides in Web test authoring and debugging:

1) More http recordings "just work"

2) New tools to help you debug and fix the ones that don't, and

3) New extensibility points for the recorder, editor, and results viewer enable you, us, and our community to release rich functionality "out of band" to handle custom applications and rich data types.

A New Name, But Under the Covers Still the Same

In this release we renamed "Web Test" to "Web Performance Test" to highlight the primary scenario for

Web tests, which is using them as scripts in a load test to model user actions. Load tests are used to

drive load against a server, and then measure server response times and server response errors.

Because we want to generate high loads with a relatively low amount of hardware, we chose to drive

Web performance tests at the protocol layer rather than instantiating a browser. While Web

performance tests can be used as functional tests, this is not their primary focus (see my post Are Web

Tests Functional Tests?). You will see that I still refer to "Web Performance Tests" as "Web Tests" for

short.

If you really want to test the user experience from the browser, use a Coded UI test to drive the

browser.

In order to be successful working with Web Performance Tests, it is important you understand the

fundamentals about how they work.

Web Performance Tests Work at the HTTP Layer

The most common source of confusion is that users do not realize Web Performance Tests work at the

HTTP layer. The tool adds to that misconception. After all, you record in IE, and when running a Web test

you can select which browser to use, and then the result viewer shows the results in a browser window.

So that means the tests run through the browser, right? NO! The Web test engine works at the HTTP

layer, and does not instantiate a browser. What does that mean? In the diagram below, you can see

there are no browsers running when the engine is sending and receiving requests:

Applies only to 2010


What Does This Mean for You?

This design has fundamental and far-reaching impact if you are working with Web tests. It is critical that

you understand this if you are going to be successful authoring and debugging Web tests. This escapes

even customers who have worked extensively with Web tests, and is a major source of confusion. The

Web test engine:

1) Sends and receives data at the HTTP layer.

2) Does NOT run a browser.

3) Does NOT run JavaScript.

4) Does NOT host ActiveX controls or plugins.

Ok, so Web tests work at the HTTP layer. What about requests sent and received by JavaScript and/or browser plugins? The best example of JavaScript generating HTTP traffic is AJAX calls. The most common examples of browser plugins are Silverlight and Flash. The Web test recorder will record HTTP traffic from AJAX calls and from most (but not all) browser plugins.

Web Tests Can Succeed Even Though It Appears They Failed

A common source of confusion comes from the browser preview in the Web test results viewer. This browser control does not run JavaScript or host plugins, which is by design, both because the engine does not do this either and for security reasons. A common technique in pages requiring JavaScript is to sense this and put up an alternate page when the browser is not running JavaScript, such as "JavaScript required on this page":

This page looks like it failed, when in fact it succeeded! Looking closely at the response, and subsequent requests, it is clear the operation succeeded. As stated above, the reason the browser control is displaying this message is that JavaScript has been disabled in this control.

Another variant of this is plugins, such as this page that is using Silverlight:


Again, it looks like the page failed, when in fact at the HTTP layer it succeeded.

A Common Challenge: Dynamic Parameters

One of the major challenges with working at the HTTP layer is "dynamic parameters". A dynamic parameter is a parameter whose value changes each time the test is run. The most common case is a session parameter, such as a login session ID. Each time a user logs in to a site, he is given a new login session ID. In order to simulate this user's actions, the test cannot simply replay the recorded session ID; it must replay the new session ID. Web tests handle most patterns of dynamic parameters automatically, but there are still some patterns they do not automatically handle.

Huge Strides Forward with VS 2010

With ever more complicated applications being built on HTTP, it is getting harder and harder to develop

scripts at the HTTP layer.

With VS 2010, we again have made tremendous strides across the tool, in recording, editing, and debugging, to help you be successful doing this. Some of the high-level features are:

1) Finding and fixing dynamic parameters

2) Enabling an extensibility point in the recorder such that record/playback "just works" for any

app (effectively enabling you to automate #1).

3) Enabling extensibility points for editing and viewing results of rich data types


We have made a number of other improvements as well, most notably:

Editor Improvements

1) Support for looping and branching in Web tests

2) Request details editor

3) Create meaningful reports using the reporting name on each page

4) Goal validation rule

Recorder Improvements

1) Record/playback of file upload "just works"

2) Record all dependents by default

3) New recorder options in Tools Options

4) Improvements in hidden field and dynamic parameter correlation

Debugging a Web Test to Find and Fix Dynamic Parameters

The VS recorder automatically handles most classes of dynamic parameters: cookies, query string and

form post parameter values, and hidden fields. In VS 2010 we have made incremental improvements on

each of these. Yet there are still a few dynamic parameter patterns that cannot be deterministically

detected and fixed up.

Our goal with this release was to build tooling around the flow for debugging a web test, mostly to help

find and fix dynamic parameters. This flow is described in Sean's seminal post, How to Debug a Web

Test. The flow is this:

1) Record a Web test and play it back. Playback fails.

2) Look at the form post and query string parameters in a failed request and determine if any look

to be dynamic

3) Go to the web test to determine if they are bound to an extracted value

4) If not, search through the log to find where the parameter is set. Or better yet, search through

the recording log to find the unique value in order to find where it is getting set.

5) Add an extraction rule and bind the parameter value to the extracted value.

In VS 2010, you'll find commands in Web test playback and the editor that seamlessly support this flow:

1) A new recorder log that enables you to see the http traffic that was generated from IE. This is a

huge new feature critical for debugging tests. You can jump from a request, request parameter,

or response in playback to the same context in the recording log to compare them.

2) Search in playback and search and replace in the Web test editor. These features are super-

important for quickly finding and fixing dynamic parameters.

3) Jump from a request in playback to that same request in the editor. This greatly increases the

efficiency of the workflow.

4) Create an extraction rule directly from playback, automatically setting the correct parameters

on the extraction rule. Again, this increases efficiency.


Together, these features really grease the debugging and fix up workflow, and will make you much more

efficient when working with web tests.

A quick overview of the flow:

From the Web test results viewer, select a parameter from a failed request that looks like a dynamic parameter. Right-click the parameter on the Request tab to jump to the editor to confirm it was not bound.

In the editor, you can see this value is not bound to a context parameter:

Now go back to the results viewer. At this point, you want to find the dynamic values in the response of

one of the previous requests, as the dynamic parameter value had to have come from a response body

(since that's how http and browsers work). To do this, you want to go to the recorder log. The reason

you want to do this from the recorder log is that the recording will have the original recorded value in it.

To do this, click on the recorder log icon (we really should have put this on the context menu too!).

This will take you to the same request with the same parameter selected. Now right-click on the parameter and do a quick find to find the parameter value in a previous response. Again, you want to do this from the recording log: since the parameter is dynamic, the value will be in the recording log but not in the playback log.


Search up in the response bodies to find the value. Note that if the dynamic string was constructed in JavaScript, you may need to search for only the dynamic part of the value:

Once you find it, right click to add an extraction rule:

Once the extraction rule is added, you also need to bind the parameter values. Choose yes to the

message box to launch search and replace from the Web test editor.


You can see that we have added tooling to make finding and fixing dynamic parameters much easier in

VS 2010!!!

Engineering the Solution

To engineer this solution, we made several important design changes to Web tests and Web test results.

1) First, we changed the persistence mechanism for Web test results to store results in a separate log file rather than in the trx.

2) We created a full public API for the Web test result.

3) We stamp request ids in each http request (enables jumping between playback and the editor).

4) The recorder generates a Web test result file and saves it as part of the recording.

About the Web Performance Test Recorder Log

The Recorder Log is a file stored in the same directory the Web test is recorded into. You can get to the recorder log from the Web test results viewer as shown above, or you can open it from VS: browse to the Web test folder and look for *.webtestresult files in your project folder. The name of the recorded result file is stored in the RecordedResultFile attribute in the Web test XML file. This file is not added to the project by default; if you wish to share it with team members, consider adding it to the solution so it will get checked in to source control.
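For illustration only (the exact attribute set and values in a .webtest file vary; only the RecordedResultFile attribute named in the text is the point here), the pointer to the recorder log looks something like this in the Web test XML:

```xml
<!-- Hypothetical fragment of a .webtest file -->
<WebTest Name="Login"
         RecordedResultFile="Login[1].webtestresult"
         Enabled="True">
  <!-- requests, rules, and other elements omitted -->
</WebTest>
```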

The recorder log is persisted in the same file format as a Web test result. There is a full API over this data

(see the WebTestResult and WebTestResultDetails classes).

Adding Your Own Recorder Plugins to Make Record/Playback "Just Work"

Once you have found and fixed up the dynamic parameters in a test, consider writing a recorder plugin

to do this for you automatically each time you record a new test for this web site.

Recorder plugins are a new, super-powerful capability of the VS 2010 recorder. Recorder plugins are an extensibility hook that gives you full access to the recorded result and the recorded Web test, and lets you move seamlessly from a recorded request to the corresponding request in the Web test. This enables you to make any modifications you see fit to the generated Web test. This is in effect a "catch-all", the ultimate power and productivity tool in your hands to save time fixing up Web tests.


I really can't emphasize enough what a powerful solution this is. If you will be scripting a web site for any given period of time, and it requires you to fix up the recordings, it will be worthwhile to invest in building a recorder plugin for it.

Recorder plugins can be used for any number of reasons: fixing up dynamic parameters (adding

extraction rules and bindings), automatically adding validation rules, automatically adding data sources

and doing data bindings, filtering out recorded dependents, etc.

Recorder plugins are pretty straightforward to code up and install. Recorder plugins derive from the WebTestRecorderPlugin class. Once you have implemented a plugin, just drop the assembly into either of these directories, and then restart VS:

%ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\WebTestPlugins

%USERPROFILE%\My Documents\Visual Studio 10\WebTestPlugins


Here's sample recorder plugin code that adds an extraction rule and binds query string parameters to

the extracted value.

using System;
using System.Collections.Generic;
using System.Text;
using System.ComponentModel;
using System.Diagnostics;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

namespace RecorderPlugins
{
    [DisplayName("Correlate ReportSession")]
    [Description("Adds an extraction rule for ReportSession and binds it to query string parameters that use ReportSession")]
    public class CorrelateSessionId : WebTestRecorderPlugin
    {
        public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
        {
            // Loop through the responses in the recording, looking for the session Id.
            bool foundId = false;
            foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
            {
                WebTestResultPage recordedWebTestResultPage = unit as WebTestResultPage;
                if (recordedWebTestResultPage == null)
                {
                    continue;
                }

                // If we haven't found the session Id yet, look for it in this response.
                if (!foundId)
                {
                    // Look for the "ReportSession" string in the response body of a recorded request.
                    int indexOfReportSession = recordedWebTestResultPage.RequestResult.Response.BodyString.IndexOf("ReportSession");
                    if (indexOfReportSession > -1)
                    {
                        // Find the corresponding page in the test; this is the page
                        // we want to add an extraction rule to.
                        WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(recordedWebTestResultPage.DeclarativeWebTestItemId) as WebTestRequest;
                        Debug.Assert(requestInWebTest != null);
                        if (requestInWebTest != null)
                        {
                            foundId = true;
                            string startsWith = "?ReportSession=";
                            string endsWith = "&";
                            string contextParamName = "ReportSession";
                            AddExtractTextRule(requestInWebTest, startsWith, endsWith, contextParamName);
                            e.RecordedWebTestModified = true;
                        }
                    }
                }
                else
                {
                    // Once we have extracted the session id, bind any session id
                    // parameters to the context parameter.
                    // This call gets the corresponding request in the Web test.
                    WebTestRequest requestInWebTest = e.RecordedWebTest.GetItem(recordedWebTestResultPage.DeclarativeWebTestItemId) as WebTestRequest;
                    Debug.Assert(requestInWebTest != null);
                    if (requestInWebTest != null)
                    {
                        BindQueryStringParameter(requestInWebTest, "ReportSession", "ReportSession");
                    }
                }
            }
        }

        /// <summary>
        /// Adds an ExtractText rule to the request.
        /// </summary>
        private static void AddExtractTextRule(WebTestRequest request, string startsWith, string endsWith, string contextParameterName)
        {
            // Add an extraction rule to the corresponding request in the declarative Web test.
            ExtractionRuleReference ruleReference = new ExtractionRuleReference();
            ruleReference.Type = typeof(ExtractText);
            ruleReference.ContextParameterName = contextParameterName;
            ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", endsWith));
            ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", startsWith));
            ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "True"));
            ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "False"));
            request.ExtractionRuleReferences.Add(ruleReference);
        }

        public static void BindQueryStringParameter(WebTestRequest requestInWebTest, string queryStringParameterName, string contextParameterName)
        {
            // Bind the matching query string parameter to the extracted context parameter.
            foreach (QueryStringParameter param in requestInWebTest.QueryStringParameters)
            {
                if (param.Name.Equals(queryStringParameterName))
                {
                    param.Value = "{{" + contextParameterName + "}}";
                }
            }
        }
    }
}

More Recorder Enhancements in VS 2010

In addition to lighting up these powerful new scenarios, the VS 2010 recorder does what it did in VS 2008, only better.

No More Empty Recorder Pane

With VS 2008, there were several cases for which the recorder would not record requests. Most of these

involved the IE 7 and IE 8 process model, where these browsers start new processes when crossing

security contexts (thus the need to run VS as Admin). These problems have been fixed in VS 2010, as the

recorder now records across IE process boundaries.

More Apps "Just Work"

There were a few cases for which hidden field correlation and dynamic parameter detection did not

work with VS 2008. You let us know about those cases, and we have improved the hidden field

correlation and dynamic parameter detection tools to handle them in VS 2010. These were mostly

around dynamic parameters in AJAX requests.

And binary post bodies, which were not always handled correctly in VS 2008, are now handled correctly.

The recorder now also automatically handles file uploads so they "just work". Files that are uploaded are automatically added to the project, and the upload file name is dynamically generated, enabling you to upload the same file under different names automatically.


New Tools Options for the Recorder

You asked for more control over the recorder, you got it with new recorder options in Tools Options:

Web Test Editor Enhancements in VS 2010

One of our goals with VS 2010 was to enable you to stay in "declarative" Web tests for more use cases

without having to move to a coded Web test. One reason you had to go to code with VS 2005 or VS 2008

was to do looping or conditional execution in your Web test.

Looping and Branching

The VS 2010 declarative editor now supports powerful new looping and branching constructs. Looping

and branching are based on conditional rules, which follow the exact same extensibility model as

validation rules and extraction rules. There are many rules "in the box":


You can see above, there are a lot of flexible rules already built in.

A few scenarios this enables:

1) Conditional logins. In a load test, if you want to simulate a user logging in once and then doing

many operations in the test, this can be accomplished easily in a conditional rule. Session IDs are

typically handled by cookies, and you can easily set up a rule to only go to the login pages if the

login has not happened yet.

2) Variability in your scripts. If you want users to occasionally skip steps in a script, or randomly

repeat some steps, this is easily achieved with the probability rule which will only execute some

requests based on the probability you specify.

3) Loop until some operation succeeds. If an operation is expected to fail for some users, but will

succeed on retry, and you need to model the retry, you can do this by looping while the

operation is not successful. To do this, use an extraction rule to indicate whether or not the

action was successful, then use the Context Parameter Exists to loop until it is successful.

You can debug your loops and conditions using the results viewer, which shows the results of

conditional evaluations.


A word of caution: do not use loops to do many, many iterations within a given test. This will "gum up" the load test engine, since its function is to control virtual user execution. Also, an entire Web test result is stored in memory, including all the loops, so running a Web test with many loops will run your machine out of memory. You can still run such tests in a load test to avoid this, but for the reason stated above we recommend against it.


More Editor Features

I already talked about search and replace in the editor above. There is also a super-handy new Request

Details editor that enables you to quickly see and edit the think times, reporting name, and goal for each

page. You should use this view each time you record a new test.

Use the Reporting Name and Response Time Goal to really light up your Excel load test reports, as both are propagated to the reports.

Setting the response time goal will also help you to find slow requests in a load test, as by default there

is a new Response Time Goal validation rule added to the test. This rule will fail pages that exceed the

goal by the threshold you specify (by default the tolerance is 0). This rule will cause slow requests to fail,

and enable you to collect logs on the failures, which may help you determine why the page is slow.

New Extensibility Points to Handle Rich Data

One area we did not address in VS 2010 is better editor and result viewer handling of rich data types. If

you have AJAX code sending and receiving Web services, REST, or JSON requests, you know how difficult

these are to work with. Like other releases, our mantra was if we couldn't get it in the box, we wanted

to expose extensibility points to enable us and the community to add tooling to help in this area.

To this end, we have enabled two extensibility points that will enable us to address this out of band:

1) Web test editor request body editor plugins.

2) New tabs and menu items in Web test result viewer.


We plan to release new editor and playback plugins around the time we RTM, so keep an eye on

codeplex.com\teamtestsplugins for new releases.

Web Test Editor Request Body Plugins

Web Test request body plugins provide a way to plug a custom editor into VS for editing form post

bodies. These plugins implement either IStringHttpBodyEditorPlugin or IBinaryHttpBodyEditorPlugin,

and enable you to customize the edit pane for different post body content.

The IStringHttpBodyEditorPlugin interface is super-simple:

public interface IStringHttpBodyEditorPlugin
{
    object CreateEditor(string contentType, string initialValue);
    string GetNewValue();
    bool SupportsContentType(string contentType);
}

Basically, SupportsContentType allows you to specify which content types your editor supports. When the editor encounters a particular content type, it scans the list of editor plugins for the first one that supports that type, then hosts that plugin's editor control. The CreateEditor call is used to instantiate an instance of the control and provide the initial value to be edited, and GetNewValue is how the plugin returns the result of the editing session. IBinaryHttpBodyEditorPlugin is the same, except that it gets and puts byte arrays.

public interface IBinaryHttpBodyEditorPlugin
{
    object CreateEditor(string contentType, byte[] initialValue);
    byte[] GetNewValue();
    bool SupportsContentType(string contentType);
}
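The dispatch just described — scan the registered plugins, pick the first one whose SupportsContentType returns true, host its editor, then read the result back — can be modeled with a short sketch. Python is used here purely for illustration (the real plugins are .NET classes), and the JsonBodyEditorPlugin class is hypothetical:

```python
class JsonBodyEditorPlugin:
    """Hypothetical plugin modeling IStringHttpBodyEditorPlugin."""

    def supports_content_type(self, content_type):
        # Mirrors SupportsContentType: claim only the types we can edit.
        return "json" in content_type.lower()

    def create_editor(self, content_type, initial_value):
        # Mirrors CreateEditor: stash the value being edited and return
        # the "editor control" (here, just this object).
        self.value = initial_value
        return self

    def get_new_value(self):
        # Mirrors GetNewValue: return the result of the editing session.
        return self.value


def pick_plugin(plugins, content_type):
    """Model of the editor's scan: first plugin that supports the type wins."""
    for plugin in plugins:
        if plugin.supports_content_type(content_type):
            return plugin
    return None  # no plugin claimed the type: fall back to plain text


plugins = [JsonBodyEditorPlugin()]
plugin = pick_plugin(plugins, "application/json")
editor = plugin.create_editor("application/json", '{"id": 1}')
print(plugin.get_new_value())  # the (unchanged) edit result
```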

We are working on creating new editors for the most common formats now, and will ship "out of band"

to codeplex.com\teamtestplugins around RTM. Here's a screen shot of the editor handling msbin1 data

in a rich way (I rubbed out some URLs of this public facing site):


Web test editor plugins must be deployed to %ProgramFiles%\Microsoft Visual Studio

10.0\Common7\IDE\PrivateAssemblies\WebTestPlugins.

Web Test Result Viewer Plugins

The Web Test Result Viewer also supports design-time plugins. There are many scenarios for these plugins; here are some examples:

1) The coolest comes from dynaTrace.

2) Tools that automatically analyze the result to point out potential performance problems (see

blogs.msdn.com\mtaute for an example).

3) Custom viewers for rich request and response data.

This third scenario is the one I want to delve into more in this section. Just as you want a rich editing

experience for working with Web services, REST, or JSON requests, you want a rich way to view this data

in the result viewer as well. The Web test result viewer plugins provide the perfect extensibility point for

this.


Result viewer plugins are a bit more involved to code up and install than the editor plugins.

Like the request body editors, we are working on out-of-band plugins for the Web test result viewer.

Here is a screen shot of the result view plugin for binary data:

Notice the tree views in the bottom panes, showing binary data as a tree.

Conclusion

Your takeaway after reading this blog post should be - "Wow, VS 2010 is fantastic and will save me tons

of time creating and maintaining Web tests, I have to have it!"

By working with you directly on the forums and through our blogs, we saw the types of problems you

hit when developing scripts. We also listened to your feedback and folded it back into the tool. In

places we didn't have time to address, we've added extensibility points to enable us to deliver features

to you out of band, and for you to create your own solutions.

Now you can say: "I'm a performance tester, and Visual Studio 2010 was my idea!"


Troubleshooting Network Emulation

This information was taken from nkamkolkar's blog site and is replicated in full here.

Recently we had a customer support issue troubleshooting the Network Emulation driver in VS2010 Ultimate during load testing. I thought a post on how we troubleshot and isolated the problem would be helpful, so here it is. In this post, I discuss the problem and its symptoms, explain how Network Emulation works in 2010, and suggest specific steps to isolate and narrow down the problem.

Scope

This applies to Visual Studio 2010 Ultimate

Customer Scenario

The troubleshooting in this document applies when you are using the Network Emulation capability newly available in VS 2010 Ultimate: while creating a new Load Test, in the "Edit Network Mix" screen of the wizard you select any network type other than LAN.


What is Network Emulation?

Microsoft Visual Studio 2010 uses software-based true network emulation for all test types. This

includes load tests. True network emulation simulates network conditions by direct manipulation of the

network packets. The true network emulator can emulate the behavior of both wired and wireless

networks by using a reliable physical link, such as Ethernet. The following network attributes are

incorporated into true network emulation:

Round-trip time across the network (latency)

The amount of available bandwidth

Queuing behavior

Packet loss

Reordering of packets

Error propagation

True network emulation also provides flexibility in filtering network packets based on IP addresses or

protocols such as TCP, UDP, and ICMP. This can be used by network-based developers and testers to

emulate a desired test environment, assess performance, predict the effect of change, or make

decisions about technology optimization. When compared to hardware test beds, true network

emulation is a much cheaper and more flexible solution.
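As a rough worked example of why latency and bandwidth emulation matter, the time to fetch a response is, to first order, the round-trip latency plus the payload size divided by the bandwidth. The numbers below are illustrative only, not the exact profile values Visual Studio uses:

```python
def transfer_time_seconds(payload_bytes, bandwidth_bps, latency_ms):
    """Rough first-order model: one round trip plus serialization delay."""
    return latency_ms / 1000.0 + (payload_bytes * 8) / bandwidth_bps

# A 100 KB response over two illustrative profiles.
payload = 100 * 1024
lan = transfer_time_seconds(payload, 100_000_000, 1)   # fast link, ~1 ms RTT
modem = transfer_time_seconds(payload, 56_000, 200)    # 56 kbit/s, ~200 ms RTT
print(f"LAN: {lan:.3f} s, 56K modem: {modem:.1f} s")
```

The same page that is effectively instant on a LAN takes on the order of fifteen seconds over an emulated modem link, which is exactly the behavior difference network emulation is designed to expose.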

How Network Emulation Works in VS2010

Network emulation in VS 2010 Ultimate uses a network device driver that was designed and built by the

Microsoft Research Labs and is productized in Visual Studio 2010. The technology has been around since

2005 and is widely used within Microsoft across many server product teams.

To use Network Emulation, you will need to install the Visual Studio 2010 Ultimate SKU. Network

Emulation is configured as part of adding a new Load Test in Visual Studio, following the wizard

screens (see above). Once you have set up network emulation following instructions at

http://msdn.microsoft.com/en-us/library/dd997557.aspx, you will run your load tests. When the load test

starts, it allocates a range of available ports for each of the Network profiles (DSL, 56K Modem, etc.) that

you have selected in your network mix. This port range is available to the Network Emulation Driver

that is enabled at run time (by default the network emulation driver is disabled).

During load testing, when the load generator sends a request to the application under test it specifies a

port from the port range. When the network emulation driver sees a port from the selected port range,

it is able to associate this port with the network profile that this request should follow. This enables the

driver to throttle the load in software to ensure it meets the network profile you have selected.
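The port-to-profile association described above can be sketched as follows. This is a conceptual model in Python, not the driver's actual implementation, and the port ranges are made up:

```python
# At test start, each selected profile is allocated a range of local ports.
PROFILE_PORT_RANGES = {
    "DSL":       range(51000, 51100),
    "56K Modem": range(51100, 51200),
}

def profile_for_port(source_port):
    """Model of the driver's lookup: map a request's source port back to
    the network profile whose shaping rules should apply to it."""
    for profile, ports in PROFILE_PORT_RANGES.items():
        if source_port in ports:
            return profile
    return None  # not in any emulated range: pass through unshaped

print(profile_for_port(51150))  # a port from the "56K Modem" range
print(profile_for_port(80))     # not in any emulated range
```

This is also why the IPSEC issue described later breaks emulation: if the port numbers are encrypted, this lookup cannot see them.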


How do you know Network Emulation is not working?

Often one of the symptoms you'll see is that the load test records socket exceptions in the log, such as the

one below:

"The requested address is not valid in its context xx.xx.xx.xxx:80"

NOTE: There may be other conditions that may be causing such socket exceptions as well. The load test may continue to work, but the socket exceptions get logged. The next section will help you isolate and troubleshoot where the problem lies.

How to troubleshoot Network Emulation

To troubleshoot and isolate problems effectively, you must ensure that you have done the basic tests.

1. Ensure that you have full network connectivity across all the machines that are participating in your load test.

2. Ensure you have configured Network Emulation correctly by following the instructions and making sure admin rights are available for the agent.

3. Ensure that any/all firewalls are disabled (at least for troubleshooting) so that a firewall is NOT blocking specific ports or traffic on the lab network.

   a. Run tcpview (available here) to ensure that socket connections are actually visible during run time (check for "red" highlights). You may also run your favorite port monitoring tool (portmon is another example).

4. Ensure that there is no virus software on the load generator machine that could be obstructing this software.

5. To isolate whether the problem is with the Network Emulation Driver or the Load Test Components you should:

   a. Eliminate the network emulation driver as a cause:

      - Run the load test with network emulation configured correctly (even though you may be getting socket exceptions).

      - Ping another host to see whether the output shows a network slowdown and/or higher latency. Check whether the delay value matches the selected network profile. If the latency values match the profile you have selected, then the network driver is working well.

      - From the agent machine where you are running the load test, attempt a connection to an outside host (like your favorite web page). This verifies that, while the load test is running and the network driver is enabled, external or lab connectivity is NOT a problem. This eliminates the network emulation driver as a problem area.

   b. Eliminate the Load Test Components as a cause:

      - Download and run this sample test program (available as is, not Microsoft supported) on the same machine as the load generator (agent machine). This sample program simulates the exact set of socket connection calls used in the load testing components. If this test program also displays socket exceptions (like in the image below), then this eliminates the Load Testing product as the cause of the socket exceptions and indicates the problem lies in the environment, machine, network or something external to the tooling. Please debug the external problem first before trying to run the load test again.


If this sample program is working correctly, you will see the output below, which confirms that the problem likely lies in the load test program and that the environment is not the likely cause. Please contact support or post your query or situation in the forums for further help in this case.
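The sample program itself is not reproduced here, but its core check — can a socket be bound to a specific local address and used at all — can be approximated with a short sketch. Python is used for illustration (the real components are .NET), and the port range probed is made up:

```python
import socket

def can_bind(local_ip, port):
    """Try to bind a TCP socket to a specific local address and port.
    A failure here (the Python analogue of 'the requested address is not
    valid in its context') points at the machine or network configuration
    rather than at the load test components."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((local_ip, port))
        return True
    except OSError:
        return False
    finally:
        s.close()

# Probe a few ports from a made-up emulation port range on the loopback address.
for port in range(51000, 51005):
    print(port, "ok" if can_bind("127.0.0.1", port) else "FAILED")
```

If binding fails even for loopback, or for the machine's own IP address, the environment is misconfigured and the load test components are not the cause.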


Known Issues

There is a known issue with the Broadcom network cards where packets are dropped under heavy loads.

If you run into this, we recommend trying another network card until Broadcom addresses this problem.

Also, if IPSEC is enabled, the ports in the network packet are encrypted and as such the network

emulation driver will not be able to determine that the packets are from the designated port range as

set by the load test engine (described above in "How Network Emulation Works in VS2010"). You must

disable IPSEC for network emulation to work.

Additional Resources:

http://msdn.microsoft.com/en-us/library/dd505008(VS.100).aspx

http://blogs.msdn.com/b/lkruger/archive/2009/06/08/introducing-true-network-emulation-in-visual-

studio-2010.aspx

Troubleshooting Guide for Visual Studio Test Controller and Agent

From: http://social.msdn.microsoft.com/Forums/en-US/VStest/thread/df043823-ffcf-46a4-9e47-

1c4b8854ca13

This guide is to help troubleshoot connection issues between Visual Studio Test Controller and Agent as

well as remote test execution issues. It gives an overview of main connection points used by Test

Controller and Agent and walks through general troubleshooting steps. In the end it provides a list of

common errors we have seen and ways to fix them, and a description of tools that can be useful for

troubleshooting as well as how to obtain diagnostics information for test execution components.

We would like to use this guide as a running document; please reply to this post to add your comments.

2. Remote Test Execution: connection points

The following diagram illustrates the main connection points between Test Controller, Agent and Client. It

outlines which ports are used for incoming and outgoing connections as well as security restrictions

used on these ports.


The technology used to connect remote test execution components is .NET Remoting over TCP ports. For incoming connections, by default, Test Controller uses TCP port 6901 and Test Agent uses TCP port 6910. The Client also needs to accept incoming connections in order to get test results from the Controller, and, by default, it uses a random port for that. For information on how to configure incoming ports, refer to the Tools section in the Appendix. For outgoing connections, random TCP ports are used. For all incoming connections, Test Controller authenticates the calling party and checks that it belongs to a specific security group.

All connectivity issues can be divided into two main groups: network issues and security/permission issues.

2.1. Network/Firewall issues (mainly implied by .Net Remoting technology):

Controller :

Listens on TCP port 6901 (can be configurable to use different port).

Needs to be able to make outgoing connection to Agents and to the Client.

Needs incoming "File and Printer sharing" connection open.

Agent:

Listens on TCP port 6910 (can be configurable to use different port).

Needs to be able to make outgoing connection to Controller.

Client:

Needs to be able to accept incoming calls. Usually you get a Firewall notification when the Controller tries to connect to the Client the first time. On Windows Server 2008 the notifications are disabled by default, and you need to manually add a Firewall exception for the Client program (devenv.exe, mstest.exe, mlm.exe) so that it can accept incoming connections.


By default, a random TCP port is used for incoming connections. If needed, the incoming port can be configured (see the Tools section in the Appendix).

Needs to be able to make outgoing connection to Controller.

2.2. Permissions

There are two scenarios that differ in how the Test Controller operates, and the permissions used by the Controller depend on the scenario:

Test Controller runs as standalone: physical environments (VS2008 or VS2010).

Test Controller is connected to TFS server: virtual environments (VS2010 only).

2.2.1. Permissions: Test Controller not connected to TFS server:

To run tests remotely, Client user must belong to either TeamTestControllerUsers, or

TeamTestControllerAdmins, or Administrators local group on Controller machine.

To manage Controller/Agent, Client user must belong to TeamTestControllerAdmins or

Administrators local group on Controller machine.

Agent service account must belong to either TeamTestAgentService or Administrators local

group on Controller machine.

Controller service account must belong to either TeamTestControllerUsers or Administrators

local group on Controller machine.

Service accounts with empty/no passwords are not supported.

2.3. Connection Points: Summary

Review of the connections gives a high-level picture of what can fail in Test Controller/Agent connectivity. At this point you may already have a clear idea which requirement is not met for your specific scenario. The next section provides step-by-step troubleshooting.

3. Step-by-step troubleshooting

Let's walk through the general troubleshooting procedure for Test Controller/Agent connection issues. For simplicity we'll do that in a step-by-step manner.

Before following these steps you may take a look at the Known Issues section in the Appendix to see if your issue is one of the known common issues. The troubleshooting is based on the key connection points and in essence involves making sure that:

The services are up and running.

Permissions are set up correctly.

There are no network connectivity/Firewall issues.

There are two scenarios that differ in how the Test Controller operates, and the troubleshooting steps depend on the scenario; hence we will consider each scenario separately:

Test Controller runs as standalone: physical environments (VS2008 or VS2010).

Test Controller is connected to TFS server: virtual environments (VS2010 only).


3.1. Step-by-step troubleshooting: VS2008 or VS2010 physical environments

Pre-requisites. Make sure you have necessary permissions.

Depending on what you need to troubleshoot, you may need Administrator permissions on

Agent and/or Controller machines.

Step 1. Make sure that the Controller is up and running and Client can connect to Controller.

Use Visual Studio or Microsoft Test Manager (see the Tools section in the Appendix) to view Controller status.

If you can't connect to Controller, make sure that Controller service is running:

On Controller machine (you can also do that remotely) re/start controller service (see

Tools section in Appendix).

(if you still can't connect) On Controller machine make sure that it can accept incoming

connections through Firewall

Open port 6901 (or create exception for the service program/executable).

Add Firewall Exception for File and Printer Sharing.

(if you still can't connect) make sure that the user you run the Client under has permissions to

connect to Controller:

On Controller machine, add Client user to the TeamTestControllerAdmins local group.

(if you still can't connect) On Client machine make sure that Firewall is not blocking incoming

and outgoing connections:

Make sure that there is Firewall exception for Client program (devenv.exe, mstest.exe,

mlm.exe) so that it can accept incoming connections.

Make sure that Firewall is not blocking outgoing connections.

(if you still can't connect)

VS2010 only: the simplest at this time is to re-configure the Controller:

On Controller machine log on as local Administrator, run the Test Controller

Configuration Tool (see the Tools section in the Appendix) and re-configure the Controller.

All steps should be successful.

(if you still can't connect) Restart Controller service (see the Service Management commands

section in the Tools section in the Appendix).

Step 2. Make sure that there is at least one Agent registered on Controller.

Use Visual Studio (Manage Test Controllers dialog) or Microsoft Test Manager (see Tools section

in the Appendix) to view connected Agents.

If there are no Agents on the Controller, connect the Agent(s).

VS2010 only:

On Agent machine log in as user that belongs to TeamTestAgentServiceAdmins.

On Agent machine open command line and run the Test Agent Configuration

Tool (see Tools section in the Appendix).


Check 'Register with Test Controller', type controller machine name and click on

'Apply Settings'.

VS2008 only:

In Visual Studio (Manage Test Controllers dialog) click on Add Agent.

You may need to restart the Agent service.

Step 3. Make sure that Agent is running and Ready (for each Agent)

Agent status can be one of: Ready, Offline (temporarily excluded from the Test Rig), Not Responding, or Running Tests.

Use Visual Studio or Microsoft Test Manager (see Tools section in the Appendix) to check Agent

status.

If one of the Agents is not shown as Ready, make sure that Agent service is running:

On Agent machine (you can also do that remotely) re/start Agent service (see Tools

section in the Appendix).

(if Agent is still not Ready)

VS2010 only: the simplest at this time is to re-configure the Agent:

On Agent machine log on as local Administrator and run the Test Agent

Configuration Tool (see Tools section in the Appendix) and re-configure the

Agent.

All steps should be successful.

(if Agent is still not Ready)

If Agent is shown as Offline, select it and click on the Online button.

On Agent machine make sure that agent service can accept incoming connections on

port 6910 (if the Firewall is on, there must be a Firewall exception either for the port or for

the service program/executable).

Make sure that Agent service account belongs to the TeamTestAgentService group on the

Controller.

On Controller machine use Computer Management->Local Groups to add Agent

user to the TeamTestAgentService group.

Restart services: Stop Agent service/Stop Controller service/Start Controller

service/Start Agent service.

Make sure that Agent machine can reach Controller machine (use ping).

Restart Agent service (see the Service Management commands section in the Tools section in the Appendix).

Step 4. If all of the above did not help, it is time to analyze diagnostics information.

(VS2010 only) Agent/Controller services by default log errors into Application Event Log (see

Tools section in the Appendix).


Check for suspicious log entries there.

Enable tracing – see the Diagnostics information section in the Appendix.

Get trace for the components involved in your scenario, some/all of:

Controller

Agent

Client

Test Agent/Controller Configuration Tool

Make sure that Controller/Agent service accounts have write access to trace files.

Check for entries starting with "[E".
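Scanning large trace files for those error entries by hand is tedious, so a tiny filter helps. A sketch in Python; the sample lines below are made up, and the exact entry prefix may vary between components, so adjust the predicate to match your trace:

```python
def error_entries(lines):
    """Keep only trace entries flagged as errors (those starting with "[E")."""
    return [line for line in lines if line.startswith("[E")]

# Made-up trace lines illustrating the filter.
sample = [
    "I, 1234, 5, 2011/06/20, 10:00:01, MACHINE\\QTAgentService.exe, starting",
    "[E, 1234, 5, 2011/06/20, 10:00:02] Failed to connect to controller.",
]
for entry in error_entries(sample):
    print(entry)
```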

Step 5. Take a look at Known Issues section in the Appendix to see if your issue is similar to one of those.

Step 6. Collect appropriate diagnostics information and send to Microsoft (create Team Test Forum post

or Microsoft Connect bug).


4. References

The following is a list of useful information sources related to Test Agent/Controller

troubleshooting.

Troubleshooting Test Execution in MSDN.

Troubleshooting Controllers, Agents and Rigs (VS2008) in MSDN.

Installing and Configuring Visual Studio Agents (VS2010) in MSDN.

Understanding Visual Studio Load Agent Controller (Load Test team blog).

Troubleshooting errors in lab management (Team Lab blog).

Visual Studio Team System – Test Forum.

Microsoft Connect – report bugs/suggestions.

Appendix 1. Tools

The following tools can be useful for remote execution/Agent/Controller troubleshooting:

Visual Studio: Premium (VS2010 only), Team Test Edition (VS2008 only).

Manage Test Controllers dialog (Main menu->Test->Manage Test Controllers): see status

of Controller and all connected Agents, add/remove Agents to Controller, restart

Agents/the whole test rig, bring Agents online/offline, configure Agent properties.

Note: on VS2008 this dialog is called Administer Test Controllers.

Run tests remotely:

VS2008: update Test Run Configuration to enable remote execution (Main Menu->Test-

>Edit Test Run Configurations->(select run config)->Controller and Agent->Remote-

>provide Test Controller name), then run a test.

VS2010: update Test Settings to use remote execution role (Main Menu->Test->Edit Test

Settings -> (select test settings)->Roles->Remote Execution), then run a test.

Microsoft Test Manager (VS2010 only)

Lab Center->Controllers: see status of Controller and all connected Agents, add/remove

Agents to Controller, restart Agents/the whole test rig, bring Agents online/offline,

configure Agent properties. Note that Lab Center only shows controllers that are

associated with this instance of TFS.

Test Controller Configuration Tool (TestControllerConfigUI.exe, VS2010 only):

It is run as the last step of Test Controller setup.

You can use it any time after setup to re-configure Controller. The tool has embedded

diagnostics which makes it easier to detect issues.

Test Agent Configuration Tool (TestAgentConfigUI.exe, VS2010 only):

It is run as the last step of Test Agent setup.


You can use it any time after setup to re-configure Agent. The tool has embedded

diagnostics which makes it easier to detect issues.

Diagnostics information

Both Agent and Controller can be configured to trace diagnostics information (from errors to verbose) to the Application Event Log or a trace file. Clients can also be configured to trace (from errors to verbose) to a trace file.

Tracing can be enabled via a .config file or the registry (VS2010 only); if both are set, the registry wins. Choose the method that is more convenient for your scenario.

Enable tracing via .config file(s):

One of the advantages of using config files is that you can enable tracing for each component separately, using trace settings specific to that component.

For Controller Service/Agent Service/Agent Process, you need the following sections in the corresponding .config file (qtcontroller.exe.config, qtagentservice.exe.config, qtagent.exe.config, qtagent32.exe.config, which by default are located in C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE):

Inside the <appSettings> section:

<add key="CreateTraceListener" value="yes"/>

Inside the <configuration> section (note: "Verbose" is equivalent to "4"):

<system.diagnostics>
    <switches>
        <add name="EqtTraceLevel" value="Verbose" />
    </switches>
</system.diagnostics>

Trace files:

Controller: vsttcontroller.log

Agent Service: vsttagent.log

Agent Process: VSTTAgentProcess.log

For Client, add the following section to the appropriate .config file (devenv.exe.config, mstest.exe.config, mlm.exe.config):

Inside the <configuration> section (note: "Verbose" is equivalent to "4"):

<system.diagnostics>
    <trace autoflush="true" indentsize="4">
        <listeners>
            <add name="EqtListener"
                 type="System.Diagnostics.TextWriterTraceListener"
                 initializeData="C:\EqtTrace.log" />
        </listeners>
    </trace>
    <switches>
        <add name="EqtTraceLevel" value="Verbose" />
    </switches>
</system.diagnostics>

Trace file: the trace will go to the file specified by the initializeData attribute.

Important: please make sure that the location is writable by the controller/agent service/process.

Enable tracing via registry (VS2010 only):

One of the advantages of using the registry is that you can enable tracing for all components with just one setting; you don't have to modify multiple configuration files.

Create a file with the following content, rename it so that it has a .reg extension, and double-click it in Windows Explorer:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\Diagnostics]
"EnableTracing"=dword:00000001
"TraceLevel"=dword:00000004
"LogsDirectory"="C:\"

Notes:

In the case of the Test Controller/Agent services, HKEY_CURRENT_USER is the registry hive of the user the services are running under.

TraceLevel: 0/1/2/3/4 = Off/Error/Warning/Info/Verbose.

LogsDirectory is optional. If that is not specified, %TEMP% will be used.

Trace file name is <Process name>.EqtTrace.log, e.g.

devenv.EqtTrace.log.

Tracing from Test Controller Configuration Tool and Test Agent Configuration Tool:

To get the trace file, click Apply, then in the "Configuration Summary" window click the view log hyperlink at the bottom.

SysInternals' DebugView can also be used to catch diagnostics information.

Application configuration files

Controller, Agent and Client use settings from application configuration files:


Controller service: qtcontroller.exe.config

Agent service: qtagentservice.exe.config

Agent process: qtagent.exe.config (neutral/64bit agent), qtagent32.exe.config

(32bit agent).

VS: Devenv.exe.config.

Command line test runner: mstest.exe.config.

By default these files are located in C:\Program Files (x86)\Microsoft Visual

Studio 10.0\Common7\IDE.

How to configure listening ports:

This may be useful in the following scenarios:

Default ports used by Controller/Agent/Client may already be in use by some other software.

There is a firewall between the controller and the client. In this case you need to know which port to enable in the firewall so that the Controller can send results to the Client.

Controller Service (qtcontroller.exe.config):

<appSettings><add key="ControllerServicePort" value="6901"/></appSettings>

Agent Service (qtagentservice.exe.config):

<appSettings><add key="AgentServicePort" value="6910"/></appSettings>

Client: add the following registry values (DWORD). The Client will use one of the ports from this range for receiving data from the Controller:

HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\VisualStudio\10.0\EnterpriseTools\QualityTools\ListenPortRange\PortRangeStart

HKEY_LOCAL_MACHINE\SOFTWARE\MICROSOFT\VisualStudio\10.0\EnterpriseTools\QualityTools\ListenPortRange\PortRangeEnd
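Conceptually, the Client then picks one of the ports in this configured range for its incoming channel. The selection can be sketched as follows (Python for illustration; the actual .NET Remoting logic is internal to the test components, and the example range is made up):

```python
import socket

def first_available_port(start, end):
    """Return the first port in [start, end] that can be bound, or None
    if every port in the range is already in use."""
    for port in range(start, end + 1):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind(("0.0.0.0", port))
            return port
        except OSError:
            continue  # port busy; try the next one in the range
        finally:
            s.close()
    return None

# Example: with PortRangeStart=6920 and PortRangeEnd=6930 configured.
print(first_available_port(6920, 6930))
```

This also shows why the range matters for firewalls: any port in [PortRangeStart, PortRangeEnd] may end up being used, so the whole range must be open.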

Service Management commands

UI: Start->Computer->Right-click->Manage->Services and Applications->Services

Visual Studio Test Controller

Visual Studio Test Agent

Command line: net start/net stop: use to start/stop Agent/Controller

net start vsttcontroller

net start vsttagent

Windows Firewall


Start->Control Panel->Windows Firewall.

IP Security Policy

Start->Run->rsop.msc (on both Agent and Controller machines)

Go to Computer configuration->windows settings->security settings->ip security policies

Check if there are any policies that may prevent connections. By default there are no

policies at all.

Computer Management

Local Groups

Start->Computer->Manage->Local Users and Groups->Groups.

Event Log (Application)

Start->Computer->Manage->Event Viewer->Windows Logs->Application.

Ping

You can use ping to make sure that general TCP/IP network connectivity works.

Telnet

You can use telnet to check that you can connect to Agent/Controller, i.e. Firewall is not

blocking, etc.

telnet <ControllerMachineName> 6901

telnet <AgentMachineName> 6910
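When telnet is not installed (it is an optional feature on recent Windows versions), the same reachability check can be scripted. A sketch in Python; the machine names below are placeholders, just as in the telnet commands above:

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout,
    i.e. the service is listening and no firewall is blocking it."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Same checks as the telnet commands above (machine names are placeholders):
#   port_open("ControllerMachineName", 6901)
#   port_open("AgentMachineName", 6910)
```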


Appendix 2. Known issues

The following is a list of known issues and suggested resolutions for them.

2.1. The message or signature supplied for verification has been altered (KB968389)

Symptom: Agent cannot connect to Controller.

Affected scenarios: Windows XP/Windows 7 connecting to Windows 2003 Server.

Additional information:

Event Log (Agent): The message or signature supplied for verification has been altered.

Trace file (Agent) contains:

I, <process id>, <thread id>, <date>, <time>, <machine name>\QTAgentService.exe,

AgentService: The message or signature supplied for verification has been altered.

I, <process id>, <thread id>, <date>, <time>, <machine name>\QTAgentService.exe,

AgentService: Failed to connect to controller.

Microsoft.VisualStudio.TestTools.Exceptions.EqtException: The agent can connect to the

controller but the controller cannot connect to the agent because of following reason:

An error occurred while processing the request on the server: System.IO.IOException:

The write operation failed, see inner exception. --->

System.ComponentModel.Win32Exception: The message or signature supplied for

verification has been altered

at System.Net.NTAuthentication.DecryptNtlm(Byte[] payload, Int32 offset, Int32 count,

Int32& newOffset, UInt32 expectedSeqNumber)

at System.Net.NTAuthentication.Decrypt(Byte[] payload, Int32 offset, Int32 count,


Int32& newOffset, UInt32 expectedSeqNumber)

at System.Net.Security.NegoState.DecryptData(Byte[] buffer, Int32 offset, Int32 count,

Int32& newOffset)

at System.Net.Security.NegotiateStream.ProcessFrameBody(Int32 readBytes, Byte[]

buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)

at System.Net.Security.NegotiateStream.ReadCallback(AsyncProtocolRequest asyncRequest)

--- End of inner exception stack trace ---

at System.Net.Security.NegotiateStream.EndRead(IAsyncResult asyncResult)

at

System.Runtime.Remoting.Channels.SocketHandler.BeginReadMessageCallback(IAsyncResult

ar)

Server stack trace:

at

Microsoft.VisualStudio.TestTools.Controller.AgentMachine.VerifyAgentConnection(Int32

timeout)

Root cause: You installed KB968389 either via Windows Update or manually.

Resolution: Uninstall KB968389 from Start -> Control Panel -> Programs and Features -> View Installed Updates.

2.2. Controller/Agent in untrusted Windows domains, or one is in a workgroup and the other is in a domain

Symptom: Agent cannot connect to Controller.

Affected scenarios: Test Controller and Agent are not in the same Windows domain. They are either in untrusted domains, or one of them is in a domain and the other is in a workgroup.

Additional information:

Trace file (Agent) contains:

W, <process id>, <thread id>, <date>, <time>, <machine name>\QTController.exe, Exception pinging agent <agent name>: System.Security.Authentication.AuthenticationException: Authentication failed on the remote side (the stream might still be available for additional authentication attempts). ---> System.ComponentModel.Win32Exception: No authority could be contacted for authentication

Server stack trace:
   at System.Net.Security.NegoState.ProcessReceivedBlob(Byte[] message, LazyAsyncResult lazyResult)
   at System.Net.Security.NegotiateStream.AuthenticateAsClient(NetworkCredential credential, ChannelBinding binding, String targetName, ProtectionLevel requiredProtectionLevel, TokenImpersonationLevel allowedImpersonationLevel)
   at System.Net.Security.NegotiateStream.AuthenticateAsClient(NetworkCredential credential, String targetName, ProtectionLevel requiredProtectionLevel, TokenImpersonationLevel allowedImpersonationLevel)
   at System.Runtime.Remoting.Channels.Tcp.TcpClientTransportSink.CreateAuthenticatedStream(Stream netStream, String machinePortAndSid)
   at System.Runtime.Remoting.Channels.BinaryClientFormatterSink.SyncProcessMessage(IMessage msg)

Root cause: Due to Windows security, Agent cannot authenticate to Controller, or vice versa.

Resolution:

The simplest approach is to use Workgroup authentication mode:

1. Mirror the user account on the Controller and Agent: create a user account with the same user name and password on both machines.

2. Run the Controller and Agent services under this mirrored account.

3. If you are using a VS2010 RC+ version (i.e. RC or RTM, but not Beta2), add the following line to the qtcontroller.exe.config file under the <appSettings> node:

   <add key="AgentImpersonationEnabled" value="no"/>

4. Restart the Controller/Agent services (see Tools section in the Appendix).

5. Make sure there is no IP Security Policy that prevents the connection (see IP Security Policy under Tools section in the Appendix). By default, domain machines use domain (Kerberos) authentication, but if that fails Windows falls back to workgroup (NTLM) authentication. This behavior can be, and often is, altered by IP Security policies; for instance, there could be a policy to block connections from machines which do not belong to the domain.

6. Restart or re-configure the Controller and Agent.


Best Practice: Understanding the ROI for test automation

This is a very enlightening blog post on the true nature of getting proper returns when planning manual

vs. automated testing:

http://blogs.msdn.com/b/edglas/archive/2009/06/13/increasing-the-roi-of-our-automation.aspx

Best Practice: Blog on various considerations for web tests running under load

The following blog entry describes a number of different features and settings to consider when running

web tests under a load test in VSTT (a link to the blog entry is at the bottom of this topic). The following

topics are covered:

General Load Test Considerations

o Verify web tests and unit tests

o Choose an appropriate load profile

Using a Step Load Profile

Using a Goal-Based Load Profile

o Choosing the location of the Load Test Results Store

o Consider including Timing Details to collect percentile data

o Consider enabling SQL Tracing

o Don't Overload the Agent(s)

o Add an Analysis Comment

Consideration for Load Tests that contain Web Tests

o Choose the Appropriate Connection Pool Model

ConnectionPerUser

ConnectionPool

o Consider setting response time goals for web test requests

o Consider setting timeouts for web test requests

o Choose a value for the "Percentage of New Users" property

o Consider setting the "ParseDependentRequests" property of your web test requests to false

http://blogs.msdn.com/billbar/articles/517081.aspx


User Account requirements and how to troubleshoot authentication

The following information comes from a blog entry by Durgaprasad Gorti. The link at the end of this

section will take you to the full article which includes a walkthrough on troubleshooting authentication

issues on a test rig.

Workgroup authentication

In a Microsoft® Windows® domain environment, there is a central authority to validate credentials. In a

workgroup environment, there is no such central authority. Still, we should be able to have computers in

a workgroup talk to each other and authenticate users. To enable this, local accounts have a special

characteristic that allows the local security authority on the computer to authenticate a "principal" in a

special way.

If you have two computers and a principal "UserXYZ" on both machines the security identifiers are

different for MACHINE1\UserXYZ and MACHINE2\UserXYZ and for all practical purposes they are two

completely different "Principals". However if the passwords are the same for them on each of these

computers, the local security authority treats them as the same principal.

So when MACHINE1\UserXYZ tries to authenticate to MACHINE2\UserXYZ, and if the passwords are the

same, then on MACHINE2, the UserXYZ is authenticated successfully and

is treated as MACHINE2\UserXYZ. Note the last sentence. The user MACHINE1\UserXYZ

is authenticated as MACHINE2\UserXYZ if the passwords are the same.

http://blogs.msdn.com/dgorti/archive/2007/10/02/vstt-controller-and-agent-setup.aspx


Troubleshooting

How to enable logging for test recording

In 2008

You can create a log file of each recording which will show the request headers and post body, as well as the returned headers and response. To enable this, add the following two registry values:

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\EnterpriseTools\QualityTools\WebLoadTest]
"CreateLog"=dword:00000001 [NOTE: 1=create; 0=do not create]
"RecorderLogFolder"="C:\\recordlogs"

In 2010

The Web test recorder automatically logs the requests and responses. See the section Recorder Log

Available

--UPDATED-- Diagnosing and fixing Web Test recorder bar issues

When you start recording a web test and the recorder bar is disabled or doesn't show up it can be hard

to diagnose and fix the issue.

Michael Taute's blog provides a list of common reasons for this to happen and potential fixes for each. Most of the time the reasons are security related. One of the most common is:

Issue: the recorder bar comes up, but the controls are disabled.

Fix: the web test recorder bar does not work with Internet Explorer Enhanced Security Configuration (IE ESC) enabled. On Windows Server 2003 and Vista, IE ESC can be removed from Control Panel -> Add/Remove Programs -> Windows Components by unchecking ESC.

Windows Server 2008 requires a different process to disable this security feature. Start the

Server Manager, browse to the Security Information section and click Configure IE ESC. In the

next window decide for whom you want to enable or disable this feature. For more details and

screenshots: http://blogs.techrepublic.com.com/datacenter/?p=255

Changed in 2010


Issue: When recording a web test, a new browser window opens but the recorder controls are not present.

Fix: Enable the add-on:

Open IE > Tools > Internet Options > Programs tab > Manage Add-ons button.

Highlight "Web Test Recorder 10.0" and press the Enable button.


How to enable Verbose Logging on an agent for troubleshooting

If you need to have verbose logging to debug or isolate issues with the agents including IP switching, you

can turn on verbose logging in the config files.

1. Go to c:\Program files\Microsoft Visual Studio 2008 Team Test Load Agent\LoadTest on the agent machine.

2. Edit the QTAgentServiceUI.exe.config file:

   a. Change the EqtTraceLevel to 4:

      <switches>
         <add name="EqtTraceLevel" value="4" />

   b. Change the CreateTraceListener value to yes:

      <appSettings>
         <add key="CreateTraceListener" value="yes"/>

The above settings also apply to the QTAgent.exe.config, QTController.exe.config and the QTControllerService.exe.config files.

Note: These files have moved in VS 2010 to C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE.

Troubleshooting invalid view state and failed event validation

ASP.NET uses __VIEWSTATE and __EVENTVALIDATION hidden fields to round-trip information across

HTTP requests. The values for these fields are generated on the server and should be posted unchanged

on a post back request. By default, these values are signed with a so-called validationKey to prevent

tampering with the values on the client.

If you just record the values in a web test and post the recorded values, you can run into ASP.NET error

messages about invalid view state or failed event validation. The Visual Studio web test recorder will

normally automatically detect the __VIEWSTATE and __EVENTVALIDATION hidden fields as dynamic

parameters. This means the dynamically extracted values will be posted back instead of the recorded

values.

However, if the web server is load balanced and part of a web farm you may still run into invalid view

state and failed event validation errors. This occurs when not all servers in the web farm use the same

validationKey and the post back request is routed to a different server in the farm than the one on

which the page was rendered.

To troubleshoot, ViewState MAC checking can be disabled by setting enableViewStateMac to false.

However, this is not suitable for use on a production environment because it disables an important

security feature and has performance implications. The recommended fix is to define the same value for

the validationKey on all machines.

Instructions for manually creating a validationKey are detailed at http://msdn.microsoft.com/en-

us/library/ms998288.aspx. For IIS 7 a machine key can easily be created through IIS Manager, see

http://technet.microsoft.com/en-us/library/cc772287(WS.10).aspx.
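As a sketch of what the fix looks like, every server in the farm carries an identical machineKey element in web.config. The key values below are placeholders, not usable keys; generate real values as described in the MSDN article above.

```xml
<!-- Hypothetical web.config fragment: the same validationKey/decryptionKey
     must be configured on every server in the web farm. -->
<system.web>
  <machineKey
      validationKey="0123456789ABCDEF...placeholder..."
      decryptionKey="FEDCBA9876543210...placeholder..."
      validation="SHA1" />
</system.web>
```

With identical keys, any server in the farm can validate the __VIEWSTATE and __EVENTVALIDATION fields rendered by any other server.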


For more background information on ViewState and EventValidation go to

http://msdn.microsoft.com/en-us/magazine/cc163512.aspx.


Troubleshooting the VS Load Testing IP Switching Feature

1) Make sure that the Agent Service is running as a user that is an admin on the agent machine; this is

required because the agent service attempts to configure the IP addresses specified in the agent

properties on the chosen NIC, and admin permission is required to do this.

2) Make sure that none of the IP addresses in the range specified for a particular agent are already

configured on the chosen NIC.

3) Enable verbose logging for the Agent Service:

* Edit the file QTAgentService.exe.config: (located at: <Program Files>\Microsoft Visual Studio

9.0 Team Test Load Agent\LoadTest\QTAgentService.exe.config)

* Change: <add key="CreateTraceListener" value="no"/> to "yes"

* Change: <add name="EqtTraceLevel" value="3" /> to "4"

* Restart the Load Test Agent service

* The file "VSTTAgent.log" will be created in the same directory as QTAgentService.exe.config.

* Re-run the load test with verbose logging configured, and look for lines in the log file that contain the text: "Attempting to configure IP address:" and "Configured IP address:". This will tell you whether or not the agent service is attempting to configure the IP addresses you've specified. If you see the "Configured IP address:" line, it has succeeded in configuring this IP address. If not, there should be some error logged.

If you have verified the items in step 1 & 2 above, and the log indicates that the configuration of the IP

address is failing but you cannot determine the cause of the failure from the error message in the log (or

if there is no error message in the log), post a new thread to the Web and Load testing forum, or open a

Microsoft Support incident for further assistance, and provide details on the setup including the relevant

portions of the log file.

4) Make sure that the load test you are running is set to use IP Switching: Click on each of the "Scenario"

nodes in the load test editor, go to the property sheet, and verify that the "IP Switching" property is set

to True (normally it should be since this is the default, but it's worth checking).

5) Enable verbose logging for the Agent process.

If the log file created in step 3 shows that the IP addresses are being successfully configured, the next

step is to check the agent process log file to verify that the load test is actually sending requests using

those IP addresses.


To enable verbose logging for the agent process:

* Edit the file QTAgent.exe.config (located at <Program Files>\Microsoft Visual Studio 9.0 Team Test Load Agent\LoadTest\QTAgent.exe.config)

* Change: <add key="CreateTraceListener" value="no"/> to "yes"

* Change: <add name="EqtTraceLevel" value="3" /> to "4"

* The file "VSTTAgentProcess.log" will be created in the same directory as QTAgent.exe.config.

* Re-run the load test, and look for lines in the log file that look something like: "Bound request on connection group M to IP address NNN.NNN.NNN.NNN". If verbose logging is enabled and these lines are present in the log file, IP Switching should be working.

6) If the number of unique IP addresses being used as shown by the log entries in step 5 is less than the

number in the range that was configured, it could be because your load test is configured to use a

connection pool with a smaller number of connections than the number of IP addresses specified. If

this is the case, you can increase the size of the connection pool, or switch to "Connection per User"

mode in the load test's run settings properties.

--NEW--Performance Counters in .NET 4.0 help with analysis of Agent machines

http://msdn.microsoft.com/en-us/library/70xadeyt(VS.100).aspx

HttpWebRequest Average Queue Time: The average time-on-queue for all HttpWebRequest objects that left the queue in the last interval within the AppDomain since the process started.

One example of how these counters can help with load testing is shown in this comment from the VS

Test Product Team: "…it could provide a better indication than any other existing performance counter

to determine when an agent running a load test that contains Web tests is overworked", and a good

indication of how much that is affecting the response times reported for the load test. If that’s

correct, then it would be very valuable to add this category and counter to the Agent counter set in your

load test and define a threshold rule on it. We haven’t experimented with these performance counters

yet, but Yun plans to soon.

Note that it doesn’t help at all for load tests containing unit tests unless those unit tests happen to be

coded to use the HttpWebRequest class.


How To, Gotchas and Best Practices

How to call one coded web test from another

If you want to have two coded web tests and have one called from within the other, you need to follow

a certain order to make it work:

1. Record the web tests

2. Generate code for the child

3. Include a call to the coded child in the declarative

4. Generate code for the parent

If you try to connect the two web tests before generating any code, your test will fail with the following

error:

There is no declarative Web test with the name 'DrillDown_Coded' included in this Web

test; the string argument to IncludeWebTest must match the name specified in an

IncludeDeclarativeWebTest attribute.

How to use methods other than GET and POST in a web test

Summary

FormPostHttpBody and StringHttpBody are the two built-in classes for generating HTTP request bodies.

If you need to generate requests containing something other than form parameters and strings then you

can implement an IHttpBody class.

More information

http://blogs.msdn.com/joshch/archive/2005/08/24/455726.aspx

How to filter out certain dependent requests

Summary

One of the new Web Test features in Visual Studio 2008 is the ability to filter dependent requests. If you

have a request in your web test that fetches a lot of content such as images, JavaScript files or CSS files,

it's possible to programmatically determine which requests are allowed to execute during the course of

the web test, and which aren't.

More information

http://blogs.msdn.com/densto/pages/new-in-orcas-filtering-dependent-requests.aspx


How to handle ASP.NET Cookie-less Sessions

ASP.NET allows session IDs to be passed as part of the URL requests, which can cause problems with VS

playback. To deal with this issue, use the following steps:

1. Record the web test as normal

2. When the recording is done, you should see the very first request has no session ID in it, but all

of the rest do. This first request also has a REDIRECT to the same URL, but with the session ID

included.

3. For this request, turn off "Follow Redirects" and then add an extraction rule to get the value of

the session (see example below).

4. Since you turned off redirects on the first request, you need to add a second request manually

to the redirected page to capture any HIDDEN parameters.

5. Use "Quick Replace" to change all other hard-coded session IDs to the context you extracted in

step 3

How to use extracted values inside Web Request URLs

How to use Client-side certificates in web tests

Client-side certificates are also supported in web tests, but additional code is required. The certificates

need to be added to the WebTestRequest.ClientCertificates collection. This can be done in a coded web

test, or by using a request plug-in in a declarative web test.

The following link describes how to use X509 certificate collections to make a SOAP request in .NET;

code for using them in a web test will be similar.

More information

http://msdn.microsoft.com/en-us/library/ms819963.aspx

[Screenshot callouts from the example recording: turn off redirects on the first request; use an "Extract Text" rule with STARTS WITH '/FMS/' and ENDS WITH '/login/'; the context ID is added to the URL here and in all subsequent requests; the final request shown is a copy of the first request with EXTRACT HIDDEN FIELDS added.]


How to remove the "If-Modified-Since" header from dependent requests

The reason that If-Modified-Since headers are sent by default with dependent requests is that the web

test engine attempts to emulate the behavior of Internet Explorer in its default caching mode. In many

cases IE will send If-Modified-Since headers.

However, with VS 2008, if you want to completely disable caching of all dependent requests and always fetch them, you can do so with the following WebTestPlugin:

public class WebTestPlugin_DisableDependentCaching : WebTestPlugin
{
    public override void PostRequest(object sender, PostRequestEventArgs e)
    {
        foreach (WebTestRequest dependentRequest in e.Request.DependentRequests)
        {
            dependentRequest.Cache = false;
        }
    }
}

How to handle custom data binding in web tests

In 2008

Summary

It is possible to create a custom data binding to bind to something other than a table, such as a select

statement. This blog post describes one possible method – creating one class which will manage the

data and creating a web test plug-in to add the data into the web test context.

More information

http://blogs.msdn.com/slumley/pages/custom-data-binding-in-web-tests.aspx

In 2010

http://blogs.msdn.com/slumley/archive/2010/01/04/VS-2010-feature-data-source-enhancements.aspx

How to add a datasource value to a context parameter

If you try to assign a datasource value to a context parameter in a web test, it will not work properly.

This is because VSTT does not replace datasource values in the context parameters. To work around this,

you can add code directly into a coded web test or in a web test plugin. Use the following syntax for

adding the binding:

this.Context.Add("ContextNameToUse",this.Datasource1["ColumnToUse"]);

Changed in 2010


How to test Web Services with Unit Tests

If you need some help or a starting point for building Web Service tests using Unit tests, the below blog

gives a great walkthrough.

http://blogs.msdn.com/slumley/pages/load-testing-web-services-with-unit-tests.aspx

How to add random users to web tests

The following code can be used to generate random users for loading up sample sites with user

accounts. The key is to randomize against a time stamp and to add another unique number (in this case, the vuser ID) so that two different instances of the load test won't accidentally try to insert the same user. Issues can occur where multiple agent machines randomly generate the same user under heavy load. The code below does not guarantee you will never hit identical accounts, but it significantly reduces the chance of doing so.

public string sRndName = "User";
public string sRndExt = @"@contoso.lab";
public int x, y;
public string sUserName;

// Generate our random user
Random randObj = new Random();
x = randObj.Next();
y = this.Context.WebTestUserId;
sUserName = sRndName + Convert.ToString(x) + Convert.ToString(y) + sRndExt;

Or, in a declarative test this can be achieved by setting the username value to:

UserName{{$Random(0,10000)}}{{$WebTestUserId}}UserNameExt

How to add think time to a Unit Test

When you use a web test, the VS environment provides a property for each request called ThinkTime.

This is the preferred method to use. However, there is no such property for Unit Tests. In order to

simulate think time within Unit Tests, use the Windows API "Sleep" and pass in the appropriate value

(the parameter for sleep is in milliseconds, so use 1000 to simulate 1 second of sleep time). The Sleep

API will work well here because it is a non-CPU intensive API. The reason it is NOT recommended for

web tests is because it is a blocking API and more than one web test can share a thread, therefore it can

adversely affect more than one vuser. Unit tests do not share threads, therefore they are not affected

by this.
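The pattern above can be sketched as a plain console program (a minimal illustration, not tied to the VS test framework; the 1000 ms value is just an example):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Minimal sketch: simulate one second of user think time inside test logic.
class ThinkTimeDemo
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        Thread.Sleep(1000);   // 1000 ms = 1 second of simulated think time

        sw.Stop();
        Console.WriteLine("Simulated think time of ~{0} ms", sw.ElapsedMilliseconds);
    }
}
```

In a real unit test the Thread.Sleep call would sit between the pieces of work whose spacing you want to simulate.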


How to add details of a validation rule to your web test

There are no properties on the WebTestResponse object or WebTestRequest object that indicate the

outcome of a specific validation rule. The best approach is to have the validation rule place the result

text in the WebTestContext, and then access the WebTestContext object from the WebTest object's

Context property in the PostRequest or PostWebTest event handler. The following approach should

work. If you have multiple validation rules, you may want to use different names for the key on the call

to this.Context.Add.

public class WebTest13Coded : WebTest
{
    public WebTest13Coded()
    {
        this.PreAuthenticate = true;
    }

    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request1 = new WebTestRequest("http://vsncts01/StoreCSVS");
        request1.ExpectedResponseUrl = "http://vsncts01/StoreCSVS/";
        if ((this.Context.ValidationLevel >=
            Microsoft.VisualStudio.TestTools.WebTesting.ValidationLevel.High))
        {
            // request1.ValidateResponse += new EventHandler<ValidationEventArgs>(validationRule2.Validate);
            // Specify a wrapper validation event handler …
            request1.ValidateResponse += new EventHandler<ValidationEventArgs>(request1_ValidateResponse);
        }
        yield return request1;
        request1 = null;

        // Check the validation rule result of the previous request
        if ((bool)(this.Context["validationRule_Passed"]))
        {
            WebTestRequest request2 = new WebTestRequest("http://vsncts01/testwebsite");
            yield return request2;
        }
    }

    private void request1_ValidateResponse(object source, ValidationEventArgs validationEventArgs)
    {
        ValidationRuleRequiredAttributeValue validationRule = new ValidationRuleRequiredAttributeValue();
        validationRule.TagName = "DIV";
        validationRule.AttributeName = "id";
        validationRule.MatchAttributeName = "id";
        validationRule.MatchAttributeValue = "LeftContent";
        validationRule.ExpectedValue = "LeftContent";
        validationRule.IgnoreCase = false;
        validationRule.Index = -1;
        validationRule.Validate(source, validationEventArgs);

        // Add the validation rule result to the WebTestContext
        this.Context.Add("validationRule_Passed", validationEventArgs.IsValid);
        this.Context.Add("validationRule_Message", validationEventArgs.Message);
    }
}


How to mask a 404 error on a dependent request

When running web tests, you may find that certain dependent requests always fail with a 404 error.

Normally you would resolve this issue by fixing the broken link, or removing the reference. However,

sometimes (for the sake of moving forward with your testing) you might want to have VSTT ignore the

error. Ed Glas has a blog outlining one way to do this quickly

(http://blogs.msdn.com/edglas/archive/2008/08/06/masking-a-404-error-in-a-dependent-request.aspx)

but that may not work in all cases. For example if an ASPX page has some code that returns a link to a

local file that is not present, then the blog post above will not work. In this case, you should consider

using a plugin similar to the following (thanks to Ed Glas for the sample):

//*************************************************************************************
// WebTestDependentFilter.cs
// Owner: Ed Glas
//
// This web test plugin filters dependents from a particular site.
// For example, if the site you are testing has ads served by another company
// you probably don't want to hit that site as part of a load test.
// This plugin enables you to filter all dependents from a particular site.
//
// Copyright(c) Microsoft Corporation, 2008
//*************************************************************************************
using Microsoft.VisualStudio.TestTools.WebTesting;

namespace SampleWebTestRules
{
    public class WebTestDependentFilter : WebTestPlugin
    {
        string m_startsWith;

        public string FilterDependentRequestsThatStartWith
        {
            get { return m_startsWith; }
            set { m_startsWith = value; }
        }

        public override void PostRequest(object sender, PostRequestEventArgs e)
        {
            WebTestRequestCollection depsToRemove = new WebTestRequestCollection();

            // Note: you can't modify the collection inside a foreach, hence the
            // second collection of requests to remove.
            foreach (WebTestRequest r in e.Request.DependentRequests)
            {
                if (!string.IsNullOrEmpty(FilterDependentRequestsThatStartWith) &&
                    r.Url.StartsWith(FilterDependentRequestsThatStartWith))
                {
                    depsToRemove.Add(r);
                }
            }

            foreach (WebTestRequest r in depsToRemove)
            {
                e.Request.DependentRequests.Remove(r);
            }
        }
    }
}


How to parameterize Web Service calls within Web Tests

By default, VS does not expose an automated way of parameterizing the data passed in the body of a

Web Service call. However, it does still honor the syntax used to define parameters in the string. To

manually add a parameter definition in the body, edit the string and add the parameters where you

need them. The syntax is:

{{Datasource.Table.Column}}

Here is a sample:

<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xmlns:xsd="http://www.w3.org/2001/XMLSchema"
               xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <OrderItem xmlns="http://tempuri.org/">
      <userName>[email protected]</userName>
      <password>IBS_007</password>
      <productID>{{DataSource1.Products.ProductID}}</productID>
      <quantity>1</quantity>
    </OrderItem>
  </soap:Body>
</soap:Envelope>
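Conceptually, the binding behaves like a template substitution performed once per iteration. The following is a rough illustration of that idea only; the Expand helper is hypothetical and is not how VSTT actually implements data binding:

```csharp
using System;

class PlaceholderDemo
{
    // Hypothetical helper: replace one {{DataSource.Table.Column}} token with a value.
    static string Expand(string body, string placeholder, string value)
    {
        return body.Replace("{{" + placeholder + "}}", value);
    }

    static void Main()
    {
        string body = "<productID>{{DataSource1.Products.ProductID}}</productID>";
        Console.WriteLine(Expand(body, "DataSource1.Products.ProductID", "42"));
        // prints: <productID>42</productID>
    }
}
```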

How to pass Load Test Context Parameters to Unit Tests

http://blogs.msdn.com/slumley/archive/2006/05/15/passing-load-test-context

How to create Global Variables in a Unit Test

If you need to have a global variable shared among iterations of a unit test, use the following:

Define a static member variable of the unit test class, or if you have multiple unit test classes that need

to share the data, create a singleton object that is accessed by all of the unit tests. The only case in

which this would not work is if you have multiple unit test assemblies being used in the same load test

that all need to share the global data and you also need to set the "Run Unit Tests in Application

Domain" load test setting to true. In that case each unit test assembly has its own app domain and its

own copy of the static or singleton object.

CAVEAT: This will not work in a multi-agent test rig. If you have a multi-agent rig and you want truly

global data, you'd either need to create a common Web service or use a database that all of the agents

access.
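As a minimal sketch of the static/singleton approach (the names here are illustrative), a counter shared by all virtual users running in one process could look like this; Interlocked keeps it safe when test iterations run concurrently on worker threads:

```csharp
using System;
using System.Threading;

// Illustrative shared state for unit-test iterations running in one process.
static class SharedTestData
{
    private static int s_counter;

    // Thread-safe increment: load-test iterations run concurrently.
    public static int NextValue()
    {
        return Interlocked.Increment(ref s_counter);
    }
}

class Demo
{
    static void Main()
    {
        Console.WriteLine(SharedTestData.NextValue());
        Console.WriteLine(SharedTestData.NextValue());
    }
}
```

Per the caveat above, this state is per process (and per application domain), so it is not shared across agents in a multi-agent rig.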


How to use Unit Tests to Drive Load with Command Line Apps

The following code can be used in a Unit Test to drive a command line tool (such as a testing tool). The

Unit test can then be driven by a load test to emulate multiple copies of the app.

using System.Threading;
using System.Diagnostics;
using System.IO;

.......

[TestMethod]
public void TestMethod1()
{
    int x = 0;
    int iDuration = 10000;
    try
    {
        // Process.Start takes the executable path and a single arguments string
        Process myProcess = Process.Start("c:\\temp\\conapp2.exe", "arg1 arg2");
        myProcess.WaitForExit(iDuration); // Wait at most iDuration milliseconds
        if (!myProcess.HasExited) // If the app has not exited, kill it manually
        {
            myProcess.Kill();
            Console.WriteLine("Application hung and was killed manually.");
        }
        else
        {
            x = myProcess.ExitCode;
            Console.WriteLine("Completed. Exit Code was {0}", x);
        }
    }
    catch (Exception e)
    {
        Console.WriteLine("The following exception was raised: " + e.Message);
    }
}

How to add Console Output to the results store when running Unit tests under load

The following link points to a write-up on how to allow unit tests to write custom output messages to

the Load Test Results Store database from Unit tests while they are running in a load test:

http://blogs.msdn.com/billbar/pages/adding-console-output-to-load-tests-running-unit-tests.aspx


How to add parameters to Load Tests

To add a parameter to a Load Test, open the load test and right-click on the "Run Settings1" line (or

wherever you want to add the parameter) and then choose to add a context parameter. Make sure it

uses the same name as the parameter you wish to override in the web tests if that is your intent.

Adding parameters to load tests

How to Change the Standard Deviation for a NormalDistribution ThinkTime

Find the <test_name>.loadtest file in the VSTT project directory and edit it directly. You will find a

section like the one below for each scenario in the loadtest. Change the ThinkProfile Value to whatever

standard deviation you wish to use. The default value in VSTT is 20% (0.2).

<Scenario Name="Scenario1" DelayBetweenIterations="2"

PercentNewUsers="0" IPSwitching="true"

TestMixType="PercentageOfTestsStarted">

<ThinkProfile Value="0.2" Pattern="NormalDistribution" />

Any ThinkTime that has a value of zero will remain zero regardless of the distribution settings.


How to programmatically access the number of users in Load Tests

In a load test plug-in, you can get the current user load. For an example of this, see Ed Glas's blog post

at: http://blogs.msdn.com/edglas/archive/2006/02/06/525614.aspx (listed as "Custom Load Patterns

for VS" in the offline pages collection). This blog post actually does much more than that, but the line

where it updates the current load is:

((LoadTestScenario)m_loadTest.Scenarios[0]).CurrentLoad = newLoad;

In VS 2008 SP1 and later, you can access the load profile using the LoadTestScenario.LoadProfile

property, and casting this to the appropriate LoadProfile class (such as LoadTestConstantLoadProfile).

How to create a webtest plugin that will only execute on a predefined interval

If you want to write a webtest plugin that will only fire on certain intervals (maybe for polling or

reporting), then use the following as a starting point.

public class WebTestPluginActingInfrequently : WebTestPlugin

{

public override void PostWebTest(object sender, PostWebTestEventArgs e)

{

if (e.WebTest.Context.WebTestIteration % 100 == 1)

{

// Do something

}

}

}

The WebTestIteration property is guaranteed to be unique, so no need to worry about locking. If you

run this web test by itself it will "do something" because the WebTestIteration will be 1 (unless you run

the web test by itself with multiple iterations or data binding).

Rather than hard coding the frequency as 1 in 100, you could make the frequency a property of the plugin that you set in the Web test editor, or a Web test or load test context parameter (a LoadTestPlugin would need to pass that down to the WebTestPlugin by setting it in the WebTestContext, or you could simply make the frequency a property on the plugin).

Note that the WebTestIteration property is incremented separately for each Scenario (on each agent) in the load test. If you want the frequency to apply across all Web test iterations on an agent, you could define a static int in the WebTestPlugin (and use Interlocked.Increment to atomically increment it).
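The static-counter variant mentioned above might look like the following sketch. The class name and the 1-in-100 frequency are illustrative, not from the guide.

```csharp
using System.Threading;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class WebTestPluginAgentWideInterval : WebTestPlugin
{
    // Shared across all scenarios and virtual users on this agent
    private static int s_iterationCount;

    public override void PostWebTest(object sender, PostWebTestEventArgs e)
    {
        // Interlocked.Increment makes the shared counter safe under load;
        // this fires on the 1st, 101st, 201st, ... iteration on the agent.
        if (Interlocked.Increment(ref s_iterationCount) % 100 == 1)
        {
            // Do something infrequently (polling, reporting, etc.)
        }
    }
}
```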


How to support Context Parameters in a plug-in property

If you develop a plug-in or an extraction rule and you want to allow the properties you expose to be context parameters that the user specifies, you need to add some code to your plugin to check for the existence of a context parameter using the curly brace '{{xyz}}' syntax.

For example, suppose the user has a context parameter {{ComparisonEventTarget}} that they want to provide as the value for the EventTarget property in your plugin (see the screen shot). Use the following code snippet to have your extraction rule or plugin check the supplied value to determine whether it contains the "{{" syntax.

Here is a partial code snippet:

public class DynamicFormFields : WebTestRequestPlugin
{
    // This property is exposed in the Visual Studio UI; we want to allow
    // either a string literal or a context parameter name.
    public string EventTarget { get; set; }

    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // Check whether EventTarget is a literal string or a context parameter reference
        if (this.EventTarget.Contains("{{"))
        {
            string contextParamKey =
                this.EventTarget.Replace("{{", string.Empty).Replace("}}", string.Empty);
            this.EventTarget = e.WebTest.Context[contextParamKey].ToString();
        }

        // . . . . . . . code to do your work starts here…


How to stop a web test in the middle of execution

If you want to stop a web test in the middle of execution based on a certain condition, you can hook into

a couple of methods (GetRequestEnumerator or PostRequestEvent) and use the following code to stop

the execution:

Coded Web Test

if (<condition>)

{

this.Stop();

yield break;

}

WebTest Plugin (note the caveat for this from the entry "How To: Stop a Test in the PreRequest event")

{

e.WebTest.Stop();

}

How To: Modify the ServicePointManager to force SSLv3 instead of TLS (Default)

If you need to modify the type of SSL connection to force SSLv3 instead of TLS (Default) then you must

modify the ServicePointManager.SecurityProtocol property to force this behavior. This can happen if

you are working with a legacy server that requires an older SSLv3 protocol and cannot negotiate for the

higher TLS security protocol. In addition, you may need to write code in your test to handle the

ServerCertificateValidationCallback to determine if the server certificate provided is valid. A code

snippet is provided below.

[TestMethod]
public void TestMethod1()
{
    // We're using SSL3 here and not TLS. Without this line, nothing works.
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;

    // Wire up the callback so we can override the behavior and force it to
    // accept the certificate from the server.
    ServicePointManager.ServerCertificateValidationCallback =
        RemoteCertificateValidationCB;

--------- <XX SNIPPED XX> ---------

public static bool RemoteCertificateValidationCB(Object sender,
    X509Certificate certificate, X509Chain chain, SslPolicyErrors sslPolicyErrors)
{
    // If it is really important, validate the certificate issuer here.
    // string resultsTrue = certificate.Issuer.ToString(true);

    // For now, accept any certificate
    return true;
}


How To: Stop a Test in the PreRequest event

Stopping a test using the WebTest.Stop() method in the PreRequest event will not stop the current

request from executing. If you wish to stop the current request from firing then you need to set the

current request Instruction property to WebTestExecutionInstruction.Skip and then issue the

WebTest.Stop().

void MSDNsiteCoded_PreRequest(object sender, PreRequestEventArgs e)

{

e.Instruction = WebTestExecutionInstruction.Skip;

e.WebTest.Stop();

}

How to make a validation rule force a redirection to a new page

Suppose you have a scenario in which you have a custom validation rule which detects an error

condition. When you hit that error condition, you want to redirect to a new error page. Here are three

ways to accomplish this.

Coded test.

In a coded test, you can easily add new requests that get returned in the body of the test. You would

just create a new WebTestRequest object and "yield return" it. For example, if the rule adds a context parameter called ErrorUrl, you would have the following in code:

if(this.Context.ContainsKey("ErrorUrl"))

{

WebTestRequest request4 = new

WebTestRequest(this.Context["ErrorUrl"].ToString());

request4.Encoding = System.Text.Encoding.GetEncoding("utf-8");

yield return request4;

request4 = null;

}

Validation rule.

First you will need to add a dummy request after the page you want to check. The URL is not important because you are going to change it based on the outcome of the validation rule. In your validation rule, set a context parameter that contains the URL you want to redirect to. Here is a very simple rule that does this: if the return code is 400 or greater, it adds the URL to the context. In this case, it is just redirecting to the home page of the site.

public class ErrorCheckValidationRule : ValidationRule

{

public override void Validate(object sender, ValidationEventArgs e)

{

if (((int) e.Response.StatusCode) >= 400)

{

e.WebTest.Context.Add("ErrorUrl", e.WebTest.Context["WebServer1"].ToString()+"/storecsvs/");

}

}

}


WebTestRequestPlugin

First you will need to add a dummy request after the page you want to check. The URL is not important

because you are going to change it in the plugin. Add a WebTestRequestPlugin to the dummy request.

The plug-in will look for the parameter and if it exists, it will change URL of request. If the parameter

does not exist, it will set the skip instruction for the request. Here is a simple plug-in which does this:

public class ErrorCheckPlugin : WebTestRequestPlugin

{

public override void PreRequest(object sender, PreRequestEventArgs e)

{

object errorUrl;

if (e.WebTest.Context.TryGetValue("ErrorUrl", out errorUrl))

{

e.Request.Url = errorUrl.ToString();

}

else

{

//if it does not exist then skip the request

e.Instruction = WebTestExecutionInstruction.Skip;

}

}

}

Here is what the web test looks like: The dummy request is http://localhost.


Here is what the result looks like when it is skipped. You can see the status of Not Executed:

Here is what it looks like when it does the redirect:

VS 2010 offers a solution using the new conditional rule logic, which works in the declarative editor. In VS 2010 you can now do branching and looping in the declarative editor, so instead of a web test request plug-in, we can do the redirect with a conditional rule. You would do the following:

1. Add the validation rule to a request.

2. Still add the dummy request below the one that the validation rule is on

3. Set the URL for this dummy request to {{ErrorUrl}}

4. Right click on this request and choose "Insert Condition…"

5. Choose the Context Parameter Exists rule

6. Set the context parameter Name to ErrorUrl. This rule will execute if the ErrorUrl parameter is

in the context.

7. Click Ok


Here is what the editor looks like with the conditional rule:

Here is what it looks like when the condition is met.

Here is what it looks like when the condition is not met:


How to add a Web Service reference in a test project - testing services in Unit Tests

If you follow along Sean Lumley's blog (http://blogs.msdn.com/slumley/pages/load-testing-web-services-

with-unit-tests.aspx), as referenced in the cheat sheet, you'll see that step 2 is to create a New Web

Reference. Unfortunately, right-clicking on either the project or references does not give you the option

for Add Web Reference. To add the reference, add a service reference:

In 2008

http://blogs.msdn.com/slumley/pages/load-testing-web-services-with-unit-tests.aspx

In 2010

In the dialog, click on the "Advanced" button:


In the "Advanced" dialog, click the "Add Web Reference…" button

You will get the following dialog and can add the reference there:


How to remotely count connections to a process

If you are troubleshooting connectivity issues, you may need to count connections to a particular

process over a particular port. You can easily do this using TCPVCON.EXE and a filter:

D:\TcpView>Tcpvcon.exe -c smsvchost.exe | find "808" /c

TCPVCON is a sysinternals tool that is part of "TCPView" and can be downloaded from:

http://technet.microsoft.com/en-us/sysinternals/bb795532.aspx

If you need to run this command (or others) remotely, you can also look at the tool "PsTools" at the

same web page.

How to hook into LoadTest database upon completion of a load test

You might want to run automatic custom actions after the completion of a load test, for example doing

some automated reporting. To do this, you must change the LoadTest database used by Visual Studio.

For Visual Studio 2010 the default name of the database is LoadTest2010.

The stored procedure

Prc_UpdateSummaryData [In 2008]

Prc_UpdateSummaryData2 [In 2010]

is the last one that is called when the load test finishes, assuming the Timing Details Storage is set to

something other than None in the Run Settings for your load test.

You can change this stored procedure by appending a call to your own stored procedure that

implements or starts your custom action. That stored procedure could be implemented as .NET code by

employing a CLR SQL Stored Procedure (see http://msdn.microsoft.com/en-us/library/5czye81z.aspx).
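As a minimal sketch of the hand-off (not from the guide: Prc_MyPostRunReport is a hypothetical procedure you would supply, and the parameter name is an assumption), the tail of the stored procedure could be extended with a single call:

```sql
-- Appended at the end of the stored procedure (unsupported customization):
-- hand off to a custom procedure that performs the post-run reporting work.
EXEC dbo.Prc_MyPostRunReport @LoadTestRunId;
```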

NOTE: changing the LoadTest database is an unsupported action that might interfere with automatic

upgrades to new versions of the database schema.


How to deploy DLLs with MSTEST.EXE

You can use MSTEST.EXE to start your load test outside Visual Studio. In that case you might run into

errors with missing DLLs for plugins that you do not encounter when running your load test inside Visual

Studio. Visual Studio looks at references to figure out what to deploy, while MSTEST.EXE does not. To fix

this you have to manually add the DLLs as deployment items in the test settings (VS2010) or test run

configuration file (VS2008).

Select the test settings file that you want to use with MSTEST.EXE. This will be one of the files in the

Solution Items folder of your solution with the

.testsettings extension [In 2010]

.testrunconfig extension [In 2008]

Open it in the Test Settings Editor. Go to the Deployment page. Select "Add File…" and select the DLLs

you want to deploy.

Specify the test settings file you have edited on the command line for MSTEST.EXE with the

/testsettings switch [In 2010]

/testrunconfig switch [In 2008]

Changed in 2010


How to authenticate with proxy before the test iteration begins

If you encounter HTTP 407 Proxy authentication required errors while playing back your web test, you

might have to explicitly authenticate to a proxy server first to be able to run your web test. First you

have to consider if you really need to go through this proxy server to be able to reach the web server

under test. If you cannot get around the proxy server, you can authenticate through code in a

WebTestPlugin. You have to use a plugin for this since you cannot set the credentials through the Visual

Studio UI.

using System;
using Microsoft.VisualStudio.TestTools.WebTesting;
using System.Net;

namespace WebTestPluginNamespace
{
    public class MyWebTestPlugin : WebTestPlugin
    {
        public override void PreWebTest(object sender, PreWebTestEventArgs e)
        {
            // Create credentials to authenticate to your proxy
            NetworkCredential proxyCredentials = new NetworkCredential();
            proxyCredentials.Domain = "yourDomain";
            proxyCredentials.UserName = "yourUserName";
            proxyCredentials.Password = "yourPassword";

            // Create a WebProxy object for your proxy
            WebProxy webProxy = new WebProxy("http://yourproxy");
            webProxy.Credentials = proxyCredentials;

            // Set the WebProxy so that even local addresses use the proxy
            // webProxy.BypassProxyOnLocal = false;

            // Use this WebProxy for the Web test
            e.WebTest.WebProxy = webProxy;
            e.WebTest.PreAuthenticate = true;
        }
    }
}


How to enumerate WebTestContext and Unit TestContext objects

Web and Unit TestContext objects contain similar information, but are actually collections of different

types of objects. The Microsoft.VisualStudio.TestTools.WebTesting.WebTestContext class is a collection

of KeyValuePair<string,object> objects, but the

Microsoft.VisualStudio.TestTools.UnitTesting.TestContext class has a property called Properties that is a

collection of DictionaryEntry objects. Thus, the collections need to be enumerated in a slightly different

way.

// Web Test

// using System.Collections.Generic;

// using Microsoft.VisualStudio.TestTools.WebTesting;

public static void DumpArgs(WebTestContext context)

{

foreach (KeyValuePair<string, object> kvp in context)

{

Debug.WriteLine(kvp.Key + " = " + kvp.Value);

}

}

// Unit Test

// using System.Collections;

// using Microsoft.VisualStudio.TestTools.UnitTesting;

public static void DumpArgs(TestContext context)

{

foreach (DictionaryEntry kvp in context.Properties)

{

Debug.WriteLine(kvp.Key + " = " + kvp.Value );

}

}

How to manually move the data cursor

Add the following line of code to force the parameter database to advance by one row. This is useful if

you need to loop through sections of code in a single iteration and want to use different data.

this.MoveDataTableCursor("DataSource1", "Products");

VS 2010 also allows you to set the cursor to a specific row:

this.MoveDataTableCursor("DataSource1", "Products",32);

New to 2010


How to programmatically create a declarative web test

Declarative web tests are non-coded web tests that can be displayed and modified in the web test UI. In

Visual Studio 2008 the APIs needed to programmatically create declarative web tests have been

exposed. If you want to programmatically generate web tests you can now do this using the

DeclarativeWebTest and DeclarativeWebTestSerializer classes.

DeclarativeWebTestSerializer loads the contents of a .webtest file into an instance of the

DeclarativeWebTest class and can also save an instance of the DeclarativeWebTest class back out to a

.webtest file.

DeclarativeWebTest exposes all of the properties, requests, and rules of the loaded web test so they can

be manipulated in whatever way necessary and then resaved.

For example, if something in your web application has changed that affects a large group of your existing

Web Tests, rather than modify the tests by hand you could write some code to do this for you. Here's an

example of modifying an existing declarative web test in a C# console application:

static void Main(string[] args)

{

DeclarativeWebTest decWebTest =

DeclarativeWebTestSerializer.Open(@"c:\test.webtest");

//Add a Request to this WebTest

WebTestRequest newRequest = new

WebTestRequest("http://newRequest/default.aspx");

decWebTest.Items.Add(newRequest);

//Set ExpectedHttpStatus to 404 on the 1st Request

WebTestRequest reqToModify = null;

foreach (WebTestItem item in decWebTest.Items)

{

if (item is WebTestRequest)

{

reqToModify = item as WebTestRequest;

break;

}

}

if (reqToModify != null)

{

reqToModify.ExpectedHttpStatusCode = 404;

}

//Save Test

DeclarativeWebTestSerializer.Save(decWebTest, @"c:\test.webtest");

}


How to modify the string body programmatically in a declarative web test

The string body of a web test may be modified programmatically by setting e.Request.Body.BodyString

in the request.

public class EditBodyString : WebTestRequestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        StringHttpBody body = e.Request.Body as StringHttpBody;
        if (body != null) // only replace the body if it really is a string body
        {
            body.BodyString = "blah";
            e.Request.Body = body;
        }
    }
}

How to Add Agents To A Test Rig

When you uninstall the controller software and reinstall it, the local user group that contains the agent

accounts used to connect is reset. You must repopulate the group with the appropriate users. From

Start -> Run, type in "lusrmgr.msc" and then expand the Groups items and open the

"TeamTestAgentService" group. Add the user account(s) used when setting up your agents.

Next, open VS and open up the Test Rig Management dialog (Test -> Administer Test Controllers) and

add each agent back to the list.

Or if you have VS 2010, you can go to each agent and re-run the config tool, which will automatically add

the agent back to the controller.

How to Change the Default Port for Agent-Controller Communication

The default port for communication is 6910. To change this, see the following post:

http://blogs.msdn.com/billbar/archive/2007/07/31/configuring-a-non-default-port-number-for-the-vs-

team-test-controller.aspx


How to create guaranteed unique user IDs for UNIT tests

Web tests have the ability to create unique user IDs for every user, but UNIT tests do not. The UserId

property of the LoadTestUserContext object is guaranteed to be unique among all running users. You

could use this instead of data binding the user ID – unfortunately with a sequential data source in a load

test it is possible that two different tests running at the same time will be using the same data row. The

below code will generate unique IDs for every user running in a test method.

using System;

using System.Diagnostics;

using System.Collections.Generic;

using Microsoft.VisualStudio.TestTools.UnitTesting;

using Microsoft.VisualStudio.TestTools.LoadTesting;

namespace TestProject1

{

public class MyUserObject

{

public int UserId { get; set; }

public string SomeData { get; set; }

}

[TestClass]

public class UnitTestWithUserObjects

{

private static object s_userObjectsLock = new object();

private static Dictionary<int, MyUserObject> s_userObjects = new

Dictionary<int, MyUserObject>();

private TestContext testContextInstance;

public UnitTestWithUserObjects()

{

}

public TestContext TestContext

{

get{

return testContextInstance;}

set{

testContextInstance = value;}

}

[TestMethod]

public void TestWithUserObjects()

{

MyUserObject userObject = GetUserObject();

Console.WriteLine("UserId: " + userObject.UserId);

DoSomeThingWithUser(userObject);

}

private MyUserObject GetUserObject()

{

int userId;

if (this.TestContext.Properties.Contains("$LoadTestUserContext"))

{

LoadTestUserContext loadTestUserContext =

TestContext.Properties["$LoadTestUserContext"] as LoadTestUserContext;

userId = loadTestUserContext.UserId;

}

else

{


userId = 1;

}

MyUserObject userObject;

lock (s_userObjectsLock)

{

if (!s_userObjects.TryGetValue(userId, out userObject))

{

userObject = new MyUserObject();

userObject.UserId = userId;

s_userObjects.Add(userId, userObject);

}

}

return userObject;

}

private void DoSomeThingWithUser(MyUserObject userObject)

{

}

}

}

The solution proposed gives you a unique ID (a load test user Id) as an int. You would need to write

code to map the integer value to a unique user name. There are several ways to do this – I would

suggest that you use a DB table (or .csv file) where each row contains the load test user ID integer as

well as the data you need for each user (username, password, anything else). You would then need to

write code in your unit test (not using the unit test data binding feature) that reads a row from the

database using the LoadTestUserId to get the correct row for that user. A more efficient and only

slightly more complex solution would be to load all of the data from this user DB table into memory in

the unit test's ClassInitialize method and store it in a static member variable of type Dictionary<int,

UserObject> where the int key is the LoadTestUserId. Then as each test method runs, it gets the LoadTestUserId as shown in the code above and looks up the user data in this static Dictionary.
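That ClassInitialize approach can be sketched as follows. This is an illustration, not code from the guide: the file path, the CSV layout (UserId,SomeData), and the method name are assumptions; MyUserObject is the class from the example above, and the fragment sits inside the [TestClass] (it needs using System.IO and using System.Collections.Generic).

```csharp
// Loaded once per app domain, before any test method runs.
private static Dictionary<int, MyUserObject> s_userData;

[ClassInitialize]
public static void LoadUserData(TestContext context)
{
    s_userData = new Dictionary<int, MyUserObject>();
    // Hypothetical CSV: one row per virtual user, keyed by LoadTestUserId
    foreach (string line in File.ReadAllLines(@"C:\testdata\users.csv"))
    {
        string[] fields = line.Split(',');
        int userId = int.Parse(fields[0]);
        s_userData[userId] = new MyUserObject { UserId = userId, SomeData = fields[1] };
    }
}
```

Each test method then uses its LoadTestUserId as the key into s_userData instead of hitting the database per iteration.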


How to create a sync point for starting load tests

If you want to create a sync point for starting load tests where they are paused at the beginning until

you manually release them, you can use a plugin to handle this. Visual Studio does not have this feature

out of the box, but the following blog post gives an example of creating a plugin:

http://blogs.msdn.com/b/billbar/archive/2006/02/09/528649.aspx

How to set default extensions that the WebTest recorder will ignore

The following registry entries will dictate the behavior of the webtest recorder:

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\10.0\EnterpriseTools\QualityTools\WebLoadTest]

"WebTestRecorderMode"="exclude"

"ExcludeMimeTypes"="image;application/x-javascript;application/x-ns-proxy-autoconfig;text/css"

"ExcludeExtensions"=".js;.vbscript;.gif;.jpg;.jpeg;.jpe;.png;.css;.rss"

How to get the LoadTestRunId from a load test

If you want to get the LoadTestRunId from a load test (this is the ID used in the results database), then

you can use the following code inside a load test plugin:

public class LoadTestPlugin1 : ILoadTestPlugin
{
    LoadTest m_loadTest;

    public void Initialize(LoadTest loadTest)
    {
        m_loadTest = loadTest;
        m_loadTest.LoadTestStarting += new EventHandler(LoadTest_LoadTestStarting);
    }

    void LoadTest_LoadTestStarting(object sender, EventArgs e)
    {
        long x;
        long.TryParse(m_loadTest.Context["LoadTestRunId"].ToString(), out x);
    }
}

Applies only to 2010


--NEW--How To: Add comments to a web recording where IE is in KIOSK mode

Single Machine running tests

If your web site opens IE in KIOSK mode, the recorder bar will disappear along with all other toolbars, etc. The recording is still occurring, but you no longer have the option to pause/stop/comment your recording. One possible way around this is to:

Start the webtest recording as normal in Visual Studio.

BEFORE entering the starting URL, open a new copy of IE from the Start Menu (or quick launch). This copy should also have the recorder bar.

Enter the URL. The browser will go into KIOSK mode and its recorder bar will disappear; however, the recorder bar will still exist in the other open instance of IE.

You will see the recording build up in this copy of IE as you navigate the KIOSK instance of IE. You can add comments as you wish. Switch back and forth to record and add comments.


--NEW--How to access a data source before it is bound to an object

When you add a data source to a webtest, it defaults to a behavior of only loading columns that are bound to a request. If you need to access the data source in a webtest where it is not bound (for use in a plugin, or for passing to sub webtests), you can change the column selection to "Select all columns"; the data source will then load prior to the webtest starting and can be manually accessed throughout the test.
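With "Select all columns" set, a plug-in could read the values straight from the context. This is a sketch only: the plugin class name and the MyProductID key are invented, while the DataSource1.Products.ProductID key follows the naming used in the earlier examples.

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

public class ReadDataSourcePlugin : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        // Unbound columns are still loaded into the context when
        // "Select all columns" is chosen for the data source.
        object productId;
        if (e.WebTest.Context.TryGetValue("DataSource1.Products.ProductID", out productId))
        {
            // Make the value available under a name of our choosing
            e.WebTest.Context["MyProductID"] = productId.ToString();
        }
    }
}
```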


--NEW--How to store and view transaction times for Unit and Coded UI tests

When running unit tests or Coded UI tests, the standard timer information is not viewable in the standard test results view; it is only viewable in the Results Summary, which you reach by clicking on the test run completed link. Here is a simple unit test:

[TestMethod]
public void TimerTestExample()
{
    this.TestContext.BeginTimer("Timer1");
    Thread.Sleep(2000);
    this.TestContext.EndTimer("Timer1");
    Console.WriteLine("Completed the Timer1 code");
}

When you run the test and click on the test results for that specific test, the following view appears.


The timer information does not appear. To view the timer information, you must click on the test run

completed link to see the timers.

--UPDATED--HOW TO: Handle 404 errors in dependent requests so the main request

does not fail.

A common issue is that a dependent request returning a 404 will fail the main request and abort the run (more information is in the article "ERRORS IN DEPENDENT REQUESTS IN A LOAD TEST DO NOT SHOW UP IN THE DETAILS TEST LOG"). To get around this, you can either use the plugin or simply follow the steps below.

1. Select the failing dependent request in the playback log.
2. Copy the request (right-click).
3. Go to the Web Test.
4. This should highlight the parent request.
5. Right-click and choose "Add Dependent Request".
6. Change the properties of the new dependent request to the URI you copied above.
7. Change the HTTP Status from "0" to 404.

This will allow you to continue this test.


--NEW--HOW TO: Minimize the amount of data a webtest retains for Response Bodies

If you wish to minimize the footprint of a webtest and make the test a bit faster, you can set the ResponseBodyCaptureLimit to 0 or 1; the test run will then not store any of the response data and will not do any type of processing on it. The full response body WILL still be downloaded, so the timing of the request/response remains valid. To see how to implement this, look at the article "FILE DOWNLOADS, DOWNLOAD SIZE AND STORAGE OF FILES DURING WEB TESTS".

--NEW--HOW TO: Schedule tests to execute

PROBLEM:

I have a customer that wants to schedule their load testing with VS2010 at midnight, is that possible?

RESOLUTION:

Yes. Use MSTest.exe to run the workload instead of Visual Studio, and then schedule MSTest to run at the needed time using the Windows scheduler. More information from an internal discussion alias follows.

MSTEST: STARTING POINT, HAS LINKS TO MORE DETAILS.

http://msdn.microsoft.com/en-us/library/ms182487(VS.80).aspx

SEAN LUMLEY'S BLOG ON HOW TO RUN FROM THE COMMAND LINE:

http://blogs.msdn.com/b/slumley/archive/2008/12/22/running-web-and-load-tests-from-the-command-line.aspx

MY TIP HAVING DONE THIS BEFORE…

Create .bat/.cmd files that contain the full mstest command line, and call those files from Scheduled Tasks. You can then easily update the .bat/.cmd file without having to modify the tasks, which are cumbersome to set up initially. You will also be able to put those files under source control so they live in your project.

SCHEDULED TASKS:

http://support.microsoft.com/kb/324283
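The tip above can be sketched as a batch file plus a one-time Task Scheduler registration. The paths, file names, and task name below are all hypothetical examples (the mstest.exe location varies by Visual Studio version and edition):

```bat
rem RunNightlyLoadTest.cmd -- keep the full command line here, under source control
"%ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\mstest.exe" ^
  /testcontainer:MyLoadTest.loadtest /testsettings:Local.testsettings

rem One-time registration of the scheduled task (run once from an elevated prompt):
schtasks /Create /TN "NightlyLoadTest" /TR "C:\Tests\RunNightlyLoadTest.cmd" /SC DAILY /ST 00:00
```

After registration, any change to the mstest command line only requires editing the .cmd file; the scheduled task itself never needs to be touched.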


--NEW--HOW TO: NOT send an "Accept-Language" header in webtests

PROBLEM:

Hi folks, we are trying to simulate our server calls without an Accept-Language header, so our webtests don't explicitly add this header. However, I notice that when the webtest runs, the Accept-Language header is automatically sent by VSTS, and that is interfering with the test we want to run. How do we prevent the Accept-Language header from being sent for a webtest?

RESOLUTION:

You can modify the header collection sent by editing the ".browser" files associated with the

browser emulation you are using. Open the ".browser" file in notepad and modify the <Headers>

section as needed.

NOTE: You can also make this change DIRECTLY in the .loadtest file; that way you do not change the built-in browser types globally, only for that load test. Below is a sample of a custom browser with the headers removed.


Existing:

<BrowserMix>
  <BrowserProfile Percentage="100">
    <Browser Name="Internet Explorer 7.0" MaxConnections="2">
      <Headers>
        <Header Name="User-Agent" Value="Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)" />
        <Header Name="Accept" Value="*/*" />
        <Header Name="Accept-Language" Value="{{$IEAcceptLanguage}}" />
        <Header Name="Accept-Encoding" Value="GZIP" />
      </Headers>
    </Browser>
  </BrowserProfile>
</BrowserMix>

Modified:

<BrowserMix>
  <BrowserProfile Percentage="100">
    <Browser Name="EMSCOMM" MaxConnections="2" />
  </BrowserProfile>
</BrowserMix>

--NEW--How to upload a file in a Web test

To upload a file, first record the file upload. I recorded against a simple site to generate this web test. The key parameter here is obviously the file upload parameter, which recorded the file as "MyImage.gif".


At runtime, the web test engine will look for a file with the name MyImage.gif in the test deployment

directory. If I just go ahead and run this the test fails with:

Request failed: Could not find file 'C:\Users\edglas.000\Documents\Visual Studio 2008\Projects\TestProject1\TestResults\edglas_EDGLAS-LT2 2008-08-05 08_57_58\Out\MyImage.gif'.

Of course the first thing you'll notice is the file path was not recorded in the web test. So how will this

file be found, and why is it looking in that directory? When you run tests in VS, the files required to run

the tests are "deployed" to a directory for the test run (see my post on deployment), in this case it is

edglas_EDGLAS-LT2 2008-08-05 08_57_58. You'll get a different directory every time you run your tests.

Notice this directory is available in the web test context:


The best way to handle this in your test is to actually add the file to be uploaded to your test project,

then add it as a deployment item on your test. That way it will be copied to the out directory prior to

running the test, and will be available during execution. This also has the advantage that if you are

working with others, the test will also run on their machine (hard-coded full paths in a test are bad, as

these tests will fail on another person's machine if their drive isn't set up the same way).

Once you've added the file to your project, add it as a deployment item. There are two ways to do this: on the run config or on the test. Since this file is really associated with the test, I recommend putting it on the test. This property is not discoverable at all. First, open Test View (Test menu | Windows | Test View) and select the web test. Then set the Deployment Items property for the web test's test element.


Now add MyImage.gif as a deployment item by clicking ... next to the deployment items property. Since

this is a property on the web test, the path is relative to the web test:

Now my test runs successfully:


Another approach is to create a folder in your project where you put all your file upload files. Then

specify the relative path in the deployment item properties (relative to the web test in the project). So if

my file to upload is in the FileUpload folder

Now in deployment items specify the relative path, which again is relative to the web test. Note the path

in the file upload web test parameter is not relative since it will be published "flat" with no

subdirectories (no changes required):


Now the test runs successfully again.

Another option is to deploy the files or entire directory using the run config settings. For the run config,

go to the Deployment tab and use the Add Directory to add your folder with files to upload. Note that

this path is solution relative, since the run config is in the solution directory. The <Solution Directory>

macro is automatically inserted after you select the file or directory.


Now if I go to the deployment directory using Explorer (the easiest way to find it is from the web test context parameter), I see that both of my images were deployed, and my test still runs successfully. Any new files I want to upload can simply be dropped into this folder in my solution without adding them as deployment items to my test. Note that all files are published "flat", which means you can't have two deployment files with the same name in different folders.

Gotcha: Check Your Validation Level in the Load Test Run Settings

By default, all validation rules added to a web test are marked HIGH. By default, all load tests have a

validation level of LOW. This means that NONE of the validation rules will run in a load test by default.

You either need to lower the level in the web test, or raise the level in the load test.

Gotcha: Do not adjust goals too quickly in your code

When you are changing the goals used for the test, or if you are using multiple goals and switching between them, be careful not to change the goal too often. One thing that may not be obvious is that as the user load decreases because of the goal, the number of running tests does not decrease until some tests complete. If your tests take longer to complete than the interval at which you change goals, it is quite possible that the effective user load will never go down when the goal changes, only up.


Gotcha: Response body capture limit is set to 1.5 MB by default

The ResponseBodyCaptureLimit property on a web test defaults to 1,500,000 bytes. If you are trying to parse or extract data beyond this size, your test will fail. To work around this, use a coded web test, or a plugin with a declarative web test, and set the ResponseBodyCaptureLimit property. Here is a sample of a web test plug-in that sets this property in the PreWebTest event.

public class MyWebTestPlugin : WebTestPlugin
{
    public override void PreWebTest(object sender, PreWebTestEventArgs e)
    {
        e.WebTest.ResponseBodyCaptureLimit = 10 * 1024 * 1024; // 10 MB
    }
}

Gotcha: Caching of dependent requests is disabled when playing back Web Tests

Caching for all dependent requests is disabled when you are playing back a web test in Visual Studio.

You will notice that if, for example, the same image file is used in multiple web pages in your web test,

the image will be fetched multiple times from the web server.

Gotcha: VS 2008 and out of memory

Each time you run a web test, the web test result consumes memory on the client. This can result in an out of memory exception. This is not a Test Rig exception but a VS client exception. The resolution is to restart VS to release the memory. The issue is fixed in VS 2010.

Gotcha: Timeout attribute in coded web test does not work during a load test

If you use the [Timeout()] attribute in a coded web test, it works as expected. However, if you then run that webtest inside a load test, the attribute is ignored. This is expected behavior. To set timeouts, set the Timeout property on the individual requests instead.

public class Coded01 : WebTest
{
    private RandomServerWorkTime testPlugin0 = new RandomServerWorkTime();

    [Timeout(1000)]  // this attribute is ignored during a load test
    ...
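A sketch of the request-level alternative: in VS 2010, a WebTestPlugin can set each request's Timeout property in PreRequest, which also works when the web test runs inside a load test. The 300-second value below is an arbitrary example, not a recommendation.

```csharp
using Microsoft.VisualStudio.TestTools.WebTesting;

public class SetRequestTimeout : WebTestPlugin
{
    public override void PreRequest(object sender, PreRequestEventArgs e)
    {
        // WebTestRequest.Timeout is specified in seconds.
        e.Request.Timeout = 300;
    }
}
```

In a coded web test you can instead set request.Timeout directly when you construct each WebTestRequest.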


--NEW--Gotcha: Cannot programmatically set .counterset mappings at runtime

PROBLEM:

I want to configure my loadtest counter set mappings at runtime. Is there a way to configure counter set mappings programmatically, with a plugin perhaps? The counter sets will remain the same; I just need a way to change the computer being monitored.

RESOLUTION:

There is not a way to do this at runtime.

If the machines are different sets of static machines, you can configure them in different run settings,

and then use an environment variable to control which run setting gets used.

If the machines are dynamically provisioned and determined at the time you run the test, you will need

to modify the load test xml file to put in the new machine names.

Best Practice: considerations when creating a dynamic goal based load test plugin:

If there is a chance that the load test will be run on a test rig, be sure to limit the code to running on

only one agent machine. Running the code on multiple agents will cause contention in the behavior and

will yield unexpected results. You will not receive an error. The following code from a load test plugin

Initialize method will force the code to run on only one agent and will work for rigs AND for locally run

tests:

public void Initialize(LoadTest loadTest)
{
    // ONLY run this on one agent to avoid contention.
    if (loadTest.Context.AgentId == 1)
    {
        LoadTestGoalBasedLoadProfile goalLoadProfile = new LoadTestGoalBasedLoadProfile();

        // Since the heartbeat handler is hooked up inside the conditional, the
        // event will be set up on only one machine. All LoadProfile changes are
        // sent to the controller and propagated across the rig automatically.
        loadTest.Heartbeat += new EventHandler<HeartbeatEventArgs>(_loadTest_Heartbeat);
    }
}
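The _loadTest_Heartbeat handler referenced above is not shown in the original. The sketch below illustrates the general shape, assuming the scenario's LoadProfile can be copied, modified, and reassigned at runtime; the trigger condition and user counts are placeholders, not values from the source.

```csharp
private void _loadTest_Heartbeat(object sender, HeartbeatEventArgs e)
{
    LoadTest loadTest = (LoadTest)sender;
    LoadTestScenario scenario = loadTest.Scenarios[0];

    // Copy the active profile, adjust it, then assign it back; the change is
    // sent to the controller and propagated across the rig automatically.
    LoadTestGoalBasedLoadProfile profile =
        scenario.LoadProfile.Copy() as LoadTestGoalBasedLoadProfile;
    if (profile != null && e.ElapsedSeconds > 600)   // placeholder trigger
    {
        profile.MaxUserCount = 200;                  // placeholder new ceiling
        scenario.LoadProfile = profile;
    }
}
```

Because the handler is registered on only one agent, these profile changes originate from a single place, which avoids the contention described above.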

Best Practice: Coded web tests and web test plug-ins should not block threads

http://blogs.msdn.com/billbar/archive/2007/06/13/coded-web-tests-and-web-test-plug-ins-should-not-block-the-thread.aspx


Best Practice: Add an Analysis Comment

After the load test is complete and you have spent some time analyzing the results, you can add a short

one line description and an arbitrarily long analysis comment to be stored permanently with the load

test result. To do this, in the load test result viewer, right click and choose the "Analysis" option. This

brings up a dialog that allows you to enter your analysis text which is stored in the load test results

database when you click OK to close the dialog. NOTE: This can be done while the test is running. You do

not need to wait for the test to finish.

Any comments and descriptions added will show up in the "Manage Load Test Results" dialog and will

make it much easier to determine which result set maps to the test run you wish to look at.

Extensibility

New Inner-text and Select-tag rules published on CodePlex

In 2008

All of the rules in this release on CodePlex relate to the inner text of a tag. For example, for a select tag

(list box and combo box), the option text is stored in inner text rather than an attribute:

<select name="myselect1">
  <option>Milk</option>
  <option>Coffee</option>
  <option selected="selected">Tea</option>
</select>

In order to extract the value of the list box, we need to parse out the inner text of the selected option.

TextArea is another tag that does this, but there are also a lot of other examples in HTML where you

might want to extract or validate inner text. The new project has these new rules as well as a parser for

inner text and select tag:

1. ExtractionRuleInnerText

2. ExtractionRuleSelectTag

3. ValidationRuleInnerText

4. ValidationRuleSelectTag

Download location

http://codeplex.com

In 2010

Many of the features above are now built into VS 2010. Here is a list of these:

http://msdn.microsoft.com/en-us/library/bb385904(VS.100).aspx

Changed in 2010


How to Add Custom Tabs to the Playback UI

Another new feature of the 2010 Web Test Playback UI is the ability to add new tabs to the

WebTestResultViewer. Here is a tab that demonstrates how to get VIEWSTATE data from webtest

responses and add that data to a table in a custom results tab:


Steps to implement your own custom tab

1) Create the new project

Create a Visual Studio Add-In: Create a new Visual Studio Add-In project (see picture below). This starts the Add-In Wizard. Complete the wizard.

You should now have a project that looks like:


Reference the following assemblies directly in the project:

o Microsoft.VisualStudio.QualityTools.LoadTestFramework

o Microsoft.VisualStudio.QualityTools.WebTestFramework

o Any other assemblies or code you will need to do the functional work of your addin.

Add a user control to the project (right click -> Add new -> user control). This will house the

items to be displayed on the tab.

Add the necessary controls to the main user control. For my example, I needed a checkbox,

textbox and a listbox.

Set the listbox Dock property to Fill: Note that when you do this, it will cause the listbox to

cover the other controls. We will fix this next.

Set the margins for the main control. This will correct the size of the listbox from the previous

step. Make sure the value for TOP is big enough to uncover the other controls.

(Screenshot callouts: "Make sure just the listbox is highlighted" when setting the Dock property; "Make sure the main control is highlighted" when setting the margins.)


2) Modify and add the tab control code

You will need to do a fair amount of work inside the "connect.cs" file to make the plugin work. However,

you should have your functional code (or at least the shell of it) in place before doing the connect.cs

work so the methods you reference will already exist. For my example, the only extra code I need is the

backing code for the user control. Double-Click on the listview and add the following methods:

public void AddAValueToTheListView(string sReqName, string sSize, int iTotalSize)
{
    tbTotalSize.Text = iTotalSize.ToString();
    tbTotalSize.Update();
    ListViewItem item = new ListViewItem(new string[] { sReqName, sSize });
    listViewTagCounts.Items.Add(item);
}

private void cbShowNonViewState_CheckedChanged(object sender, EventArgs e)
{
    // Add code to handle hiding non viewstate pages
}

sReqName is the URL of the current request.

sSize is the calculated size of the ViewState.

iTotalSize is the cumulative value.

All of these properties are calculated and set in the connect.cs code. The code here is solely for modifying the values displayed in the tab.


3) Modify and add the Addin handler code

Now we can jump into the connect.cs code. Here are the main items of interest for us:

public class Connect : IDTExtensibility2
{
    Dictionary<Guid, Dictionary<Guid, UserControl>> m_controls =
        new Dictionary<Guid, Dictionary<Guid, UserControl>>();
    LoadTestPackageExt wpe;
    int iViewStateTotalSize = 0;

    public void OnConnection(object application, ext_ConnectMode connectMode,
        object addInInst, ref Array custom)
    {
        _applicationObject = (DTE2)application;
        _addInInstance = (AddIn)addInInst;

        wpe = _applicationObject.GetObject(
            "Microsoft.VisualStudio.TestTools.LoadTesting.LoadTestPackageExt")
            as LoadTestPackageExt;

        // Process windows that are already open
        foreach (WebTestResultViewer p in wpe.WebTestResultViewerExt.ResultWindows)
        {
            WindowCreated(p);
        }

        wpe.WebTestResultViewerExt.WindowCreated +=
            new EventHandler<WebTestResultViewerExt.WindowCreatedEventArgs>(wpe_WebtestPlaybackWindowCreated);
        wpe.WebTestResultViewerExt.WindowClosed +=
            new EventHandler<WebTestResultViewerExt.WindowClosedEventArgs>(WebTestResultViewerExt_WindowClosed);
        wpe.WebTestResultViewerExt.SelectionChanged +=
            new EventHandler<WebTestResultViewerExt.SelectionChangedEventArgs>(WebTestResultViewerExt_SelectionChanged);
        wpe.WebTestResultViewerExt.TestCompleted +=
            new EventHandler<WebTestResultViewerExt.TestCompletedEventArgs>(WebTestResultViewerExt_TestCompleted);

        iViewStateTotalSize = 0;
    }

Notes on the code above: the iViewStateTotalSize field is specific to my particular add-in. The OnConnection method already exists; delete everything in it and add this code. The four event subscriptions need to be added to all web test add-ins, and their method names correspond to the matching method definitions below. Add any initialization code of your own at the end of the method.


private void WindowCreated(WebTestResultViewer viewer)
{
    UserControl1 c = new UserControl1();
    c.Dock = DockStyle.Fill;

    // Add to the dictionary of open playback windows
    System.Diagnostics.Debug.Assert(!m_controls.ContainsKey(viewer.TestResultId));
    Dictionary<Guid, UserControl> userControls = new Dictionary<Guid, UserControl>();

    // Add the summary
    Guid summaryGuid = Guid.NewGuid();
    Guid responseGuid = Guid.NewGuid();
    userControls.Add(responseGuid, c);
    m_controls.Add(viewer.TestResultId, userControls);

    // Add tabs to the playback control
    viewer.AddResultPage(responseGuid, "ViewState Info", c);
}

void WebTestResultViewerExt_TestCompleted(object sender, WebTestResultViewerExt.TestCompletedEventArgs e)
{
    foreach (UserControl userControl in m_controls[e.TestResultId].Values)
    {
    }
}

void WebTestResultViewerExt_WindowClosed(object sender, WebTestResultViewerExt.WindowClosedEventArgs e)
{
    if (m_controls.ContainsKey(e.WebTestResultViewer.TestResultId))
    {
        // Remove the result pages for the open windows
        foreach (Guid g in m_controls.Keys)
        {
            e.WebTestResultViewer.RemoveResultPage(g);
        }
        m_controls.Remove(e.WebTestResultViewer.TestResultId);
    }
}

void wpe_WebtestPlaybackWindowCreated(object sender, WebTestResultViewerExt.WindowCreatedEventArgs e)
{
    WindowCreated(e.WebTestResultViewer);
}

Notes: the WindowCreated method is stock code; copy all of it and simply change the user control name to whatever name you gave your control in the previous section. The "ViewState Info" string is the name that appears on the added tab. The three handler methods above are stock code and need no modification.


And here is the workhorse method:

void WebTestResultViewerExt_SelectionChanged(object sender, WebTestResultViewerExt.SelectionChangedEventArgs e)
{
    if (e.WebTestRequestResult != null)
    {
        foreach (UserControl userControl in m_controls[e.TestResultId].Values)
        {
            UserControl1 userControl1 = userControl as UserControl1;
            if (userControl1 != null)
            {
                WebTestResponse response = e.WebTestRequestResult.Response;

                // Count the number of occurrences of each tag in the response
                Dictionary<string, int> tagCounts =
                    new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);

                if (response != null && response.BodyBytes != null)
                {
                    string str1 = response.ResponseUri.ToString();
                    string str2 = "No VIEWSTATE Detected";
                    if (response.BodyString.Contains("__VIEWSTATE"))
                    {
                        // <input type="hidden" name="__VIEWSTATE" id="__VIEWSTATE" value="/wEPQUe...z+aZmiNA==" />
                        int x = response.BodyString.IndexOf("id=\"__VIEWSTATE\" value=\"");
                        int y = response.BodyString.IndexOf("\" />", x);
                        if ((y - x - 24) > 0)
                        {
                            str2 = Convert.ToString(y - x - 24);
                            iViewStateTotalSize = iViewStateTotalSize + (y - x - 24);
                        }
                    }
                    userControl1.AddAValueToTheListView(str1, str2, iViewStateTotalSize);
                }
            }
        }
    }
}

Notes: most of this method is stock code. UserControl1 is the user control you created. The body of the inner if block is the code that does the work for the add-in: it gathers all of the data and then calls my user control (AddAValueToTheListView) to populate the tab.


How to extend recorder functionality with plugins

Taken from Sean Lumley's Blog Site and reproduced in full.

In this post I am going to talk about a new feature that can help with web test recording. The feature is

extensible recorder plug-ins for modifying recorded web tests. Basically we are giving you the

opportunity to modify the recorded web test after you click stop on the web test recorder bar but prior

to the web test being fully saved back to the web test editor. So what problems does this help with?

The main one is performing your own custom correlation. In VS 2008 we added a process which runs

post recording that attempts to find dynamic fields. You can read this blog post for more information:

http://blogs.msdn.com/slumley/pages/web-test-correlation-helper-feature-in-orcas.aspx

This process still exists, but it does not always find all dynamic fields for an application. So if we did not find the dynamic fields in your application, you had to perform the correlation process manually. Here is a blog post that goes into detail about the manual process: http://blogs.msdn.com/slumley/pages/how-to-debug-a-web-test.aspx. Also, there are cases that our correlation process does not handle, such as dynamic values in the URL.

At a high level, you have to:

1) Determine what parameters are dynamic

2) Then for each parameter find the first occurrence of this in a response body.

3) Add an extraction rule to pull the value out of the response and add it to the context

4) Then modify each query string or form post parameter that needs this value by changing the

value to pull the value out of the context.

This new feature allows you to write your own plug-in which can perform correlation, or modify the web test in many other ways, prior to it being saved back to the web test editor. So once you figure out that certain dynamic variables have to be correlated for each of your recordings, you can automate the process. To demonstrate how this works, I am going to write a recorder plug-in which performs the correlation that I manually walked through in my previous post. Please quickly read that:

http://blogs.msdn.com/slumley/pages/vs-2010-feature-web-test-playback-enhancements.aspx

Overview

Create the plug-in

Recorder plug-ins follow the same pattern as WebTestPlugins or WebTestRequestPlugins. To create a

plug-in, you will create a class that extends WebTestRecorderPlugin and then override the

PostWebTestRecording method:

Applies only to 2010


public class Class1 : WebTestRecorderPlugin
{
    public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
    {
        base.PostWebTestRecording(sender, e);
    }
}

Modify the web test

The event args give you two main objects to work with: the recorded result and the recorded web test. This allows you to iterate through the result looking for certain values and then jump to the same request in the web test to make modifications. You can also just modify the web test if you want to add a context parameter or parameterize parts of the URL. If you do modify the web test, you also need to set the RecordedWebTestModified property to true:

e.RecordedWebTestModified = true;

Deploy the plug-in

After compiling the plug-in, you will need to place the dll in one of two places:

1) Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\WebTestRecorderPlugins

2) %USERPROFILE%\My Documents\Visual Studio 10\WebTestRecorderPlugins

Executing the plug-in

After you deploy the plug-in, you will need to restart VS for the plug-in to be picked up. Now when you create a web test, you will see a new dialog that displays all of the available plug-ins. Select your plug-in and click OK. Once you are done recording your web test, the plug-in will be executed.

Creating the Sample Plug-in

First, a quick review of the correlation that we are going to automate. Here is the screenshot from the correlation tool after I recorded my web test against a Reporting Services site.


We are going to correlate the ReportSession parameter.

1) Create a class library project

2) Right click references and select Add Reference

3) Choose Microsoft.VisualStudio.QualityTools.WebTestFramework

4) Here is the code for my plug-in:

using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

namespace RecorderPlugins
{
    [DisplayName("Correlate ReportSession")]
    [Description("Adds extraction rule for Report Session and binds this to querystring parameters that use ReportSession")]
    public class CorrelateSessionId : WebTestRecorderPlugin
    {
        public override void PostWebTestRecording(object sender, PostWebTestRecordingEventArgs e)
        {
            // First find the session id
            bool foundId = false;

            foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
            {
                WebTestResultPage page = unit as WebTestResultPage;
                if (page != null)
                {
                    if (!foundId)
                    {
                        int indexOfReportSession =
                            page.RequestResult.Response.BodyString.IndexOf("ReportSession");
                        if (indexOfReportSession > -1)
                        {
                            // Add an extraction rule to this request.
                            // Get the corresponding request in the declarative web test.
                            ExtractionRuleReference ruleReference = new ExtractionRuleReference();
                            ruleReference.Type = typeof(ExtractText);
                            ruleReference.ContextParameterName = "SessionId";
                            ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", "&ControlID="));
                            ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
                            ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
                            ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
                            ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "True"));
                            ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", "ReportSession="));
                            ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "False"));

                            WebTestRequest requestInWebTest =
                                e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
                            if (requestInWebTest != null)
                            {
                                requestInWebTest.ExtractionRuleReferences.Add(ruleReference);
                                e.RecordedWebTestModified = true;
                            }
                            foundId = true;
                        }
                    }
                    else
                    {
                        // Now update query string parameters
                        WebTestRequest requestInWebTest =
                            e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
                        if (requestInWebTest != null)
                        {
                            foreach (QueryStringParameter param in requestInWebTest.QueryStringParameters)
                            {
                                if (param.Name.Equals("ReportSession"))
                                {
                                    param.Value = "{{SessionId}}";
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}

5) Let's review parts of the class.

a. Iterate through the result to find first page with ReportSession. This code fragment

iterates through each of the recorded objects and searches the response body for

ReportSession.

foreach (WebTestResultUnit unit in e.RecordedWebTestResult.Children)
{
    WebTestResultPage page = unit as WebTestResultPage;
    if (page != null)
    {
        if (!foundId)
        {
            int indexOfReportSession =
                page.RequestResult.Response.BodyString.IndexOf("ReportSession");
            if (indexOfReportSession > -1)
            {

b. Now that we have found the response, we need to add an extraction rule. This code creates the extraction rule and then finds the correct request in the web test to add the extraction rule to. Each result object has a property called DeclarativeWebTestItemId, which is what we will use to get the correct request from the web test.

ExtractionRuleReference ruleReference = new ExtractionRuleReference();
ruleReference.Type = typeof(ExtractText);
ruleReference.ContextParameterName = "SessionId";
ruleReference.Properties.Add(new PluginOrRuleProperty("EndsWith", "&ControlID="));
ruleReference.Properties.Add(new PluginOrRuleProperty("HtmlDecode", "True"));
ruleReference.Properties.Add(new PluginOrRuleProperty("IgnoreCase", "True"));
ruleReference.Properties.Add(new PluginOrRuleProperty("Index", "0"));
ruleReference.Properties.Add(new PluginOrRuleProperty("Required", "True"));
ruleReference.Properties.Add(new PluginOrRuleProperty("StartsWith", "ReportSession="));
ruleReference.Properties.Add(new PluginOrRuleProperty("UseRegularExpression", "False"));

WebTestRequest requestInWebTest =
    e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
if (requestInWebTest != null)
{
    requestInWebTest.ExtractionRuleReferences.Add(ruleReference);
    e.RecordedWebTestModified = true;
}

c. Now we need to find all query string parameters that have ReportSession as their name and change the value to {{SessionId}}:

WebTestRequest requestInWebTest =
    e.RecordedWebTest.GetItem(page.DeclarativeWebTestItemId) as WebTestRequest;
if (requestInWebTest != null)
{
    foreach (QueryStringParameter param in requestInWebTest.QueryStringParameters)
    {
        if (param.Name.Equals("ReportSession"))
        {
            param.Value = "{{SessionId}}";
        }
    }
}

6) Now that we have our plug-in, I need to compile and deploy it to one of the locations listed

above.

7) Restart VS

8) Open a test project and create a new web test. I now see the following dialog with my plug-in

available:


9) Select the plug-in

10) Record the same web test against my reporting services site and click stop to end the web test.

11) Now when the correlation process runs, you will see that it does not find the ReportSession

parameter. This is because we have already correlated it.

12) Now look at the first request in the web test and you will see the extraction rule.


13) Now look at the other requests to see where we are referencing the extraction rule.

This is a slightly more advanced feature, but it provides a huge time savings when automating changes to your recorded web tests. If you have multiple people creating web tests, you can use this plug-in to make sure the same parameters or rules are added to each web test. And of course you can automate correlation of parameters or URLs that the built-in correlation tool does not find.
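In essence, the ExtractText rule configured in the plug-in performs a StartsWith/EndsWith capture on the response body. Here is a minimal Python sketch of that capture (the function name and sample body are hypothetical, and the real rule also supports HtmlDecode, IgnoreCase, and regular expressions):

```python
def extract_text(body, starts_with, ends_with, index=0):
    """Capture text between the index-th starts_with marker and the next ends_with marker."""
    pos = -1
    for _ in range(index + 1):
        pos = body.find(starts_with, pos + 1)
        if pos == -1:
            return None  # marker not found; with Required=True the real rule would fail
    start = pos + len(starts_with)
    end = body.find(ends_with, start)
    return body[start:end] if end != -1 else None

# Hypothetical fragment of a Reporting Services response body
body = "...ReportViewerWebControl.axd?ReportSession=abc123xyz&ControlID=ctl00..."
session_id = extract_text(body, "ReportSession=", "&ControlID=")  # "abc123xyz"
```

The captured value lands in the SessionId context parameter, which is why step c can rewrite query strings to {{SessionId}}.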


Items not specific to the VS testing platform

Stand-Alone Network Emulation and CodePlex

Taken directly from Lonny Kruger's blog and reproduced in full

In my post "Creating a Stand-Alone Network Emulator using VS2010 – Beta 1", I showed you how to create a stand-alone network emulator using the network emulation functionality introduced in the Beta 1 release of VS2010. Since then, the API for Network Emulation has gone through several changes and, long story short, the API for Beta 1 will not work for RC or RTM. To make things a bit easier, I have created a new "Stand-Alone" Network Emulator UI (NEUI) that will allow you to take advantage of the Network Emulation features in VS2010 without having to fire up VS2010 and start a unit or load test. I have posted the source for this project on CodePlex for everyone to enjoy :). For now, it is in the project "Web and Load Test Plugins for Visual Studio Team Test", but my hope is that it will gain enough community support and involvement that it will warrant going through the process of creating and maintaining it as a separate project.

Currently, the NEUI project is a simple UI that:

uses WPF

allows the user to select one Network Profile to emulate a specific network

displays in the system tray when minimized

when in the system tray, allows starting and stopping network emulation and selecting the network profile

Please feel free to download and use the emulator. Also, if you feel strongly enough, feel free to

suggest or contribute new features.


Using the VS Application Profiler

Various articles to consider:

http://blogs.msdn.com/profiler/archive/2008/10/15/walkthroughs-using-VS-test-and-profilers-to-find-performance-issues.aspx

http://msdn.microsoft.com/en-us/magazine/cc337887.aspx?pr=blog

http://www.codeguru.com/cpp/v-s/devstudio_macros/visualstudionet/article.php/c14823__1/

VS 2008 Application Profiler New Features

The VS perf team has added some blog posts outlining new features of the VS profiler and how to use

them. These features include a quick tool to find "hotspots" in your app, and the ability to use

performance counters to enhance your profiler diagnosis. See the following links to get this info:

http://blogs.msdn.com/profiler/archive/2007/10/19/articles-on-new-visual-studio-team-system-2008-profiler-features.aspx

Using System.NET Tracing to debug Network issues

http://blogs.msdn.com/dgorti/archive/2005/09/18/471003.aspx


Logparser tips and tricks

1. Use square brackets around an alias name to allow spaces in the names. Example: [User Data]

2. If you need to get a substring but also need to delete characters at the end of the substring, you

can use an abbreviated syntax:

Substr(mystring, 1, sub(strlen(mystring),19)) can be written as Substr(mystring,1,-19)
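In other words, a negative third argument is shorthand for STRLEN minus that many characters. A quick Python sketch of the equivalence (the helper function and sample string are hypothetical):

```python
def substr(s, start, length):
    # LogParser-style SUBSTR: a negative length is shorthand for STRLEN(s) + length
    if length < 0:
        length = len(s) + length
    return s[start:start + length]

s = "abcdefghijklmnopqrstuvwxyz"        # 26 characters
long_form = s[1:1 + (len(s) - 19)]       # SUBSTR(s, 1, SUB(STRLEN(s), 19))
short_form = substr(s, 1, -19)           # SUBSTR(s, 1, -19)
# both yield the same 7-character substring starting at index 1
```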

Logparser WEB Queries

Count (and percentage) of status codes from Web Logs

    -i:IISW3C -recurse:-1 -Q:on "SELECT sc-status, COUNT(*), MUL(PROPCOUNT(*),100.0) AS Percentage INTO StatusCount.txt FROM ex*.log GROUP BY sc-status ORDER BY sc-status"

Breakdown of web status codes by page type from Web Logs

    -i:IISW3C -recurse:-1 -Q:on "SELECT EXTRACT_EXTENSION(TO_UPPERCASE(cs-uri-stem)) AS PageType, sc-status, COUNT(*) AS Amount INTO StatusCodes.txt FROM ex*.log GROUP BY sc-status, PageType ORDER BY sc-status ASC" -o:TSV

Number of hits by page type

    -i:IISW3C -recurse:-1 -Q:on "SELECT EXTRACT_EXTENSION(TO_UPPERCASE(cs-uri-stem)) AS PageType, COUNT(*) AS Amount INTO pagetype.txt FROM ex*.log GROUP BY PageType ORDER BY Amount DESC" -o:TSV

List of requests asking for non-existent pages

    -i:IISW3C -recurse:-1 -Q:on "SELECT DISTINCT cs-uri-stem AS Url USING sc-status AS statuscode INTO not-found.txt FROM ex*.log WHERE statuscode = 404" -o:TSV

Top 10 slowest page responses

    -i:IISW3C -recurse:-1 -Q:on "SELECT TOP 10 MAX(time-taken) AS Processing-Time, AVG(time-taken) AS Average, MIN(time-taken) AS Minimum, cs-uri-stem AS Url, COUNT(cs-uri-stem) AS PageCount INTO longrunning.txt FROM ex*.log GROUP BY cs-uri-stem ORDER BY Average DESC" -o:TSV

Average, Max and Min time taken for each page type

    -i:IISW3C -recurse:-1 -Q:on "SELECT EXTRACT_EXTENSION(cs-uri-stem) as Type, AVG(time-taken) AS Average, MAX(time-taken) AS Maximum, MIN(time-taken) AS Minimum INTO PageTimes.txt FROM ex*.log WHERE time-taken > 0 GROUP BY Type ORDER BY Average DESC"

Requests and Total Bytes per hour

    -i:IISW3C -recurse:-1 -Q:on "SELECT QUANTIZE(TO_TIMESTAMP(date, time), 3600) AS Hour, COUNT(*) AS Total, SUM(sc-bytes) AS TotBytesSent INTO HitsByHour.txt FROM ex*.log GROUP BY Hour ORDER BY Hour" -o:TSV

List and count of pages returning a status code of 500

    -i:IISW3C -recurse:-1 -Q:on "SELECT cs-uri-stem, sc-status, COUNT(*) FROM ex*.log WHERE sc-status=500 GROUP BY cs-uri-stem, sc-status ORDER BY cs-uri-stem" -o:TSV
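The COUNT/PROPCOUNT arithmetic in the first query above can be sketched in a few lines of Python over toy data (the status values are hypothetical, not from any real log):

```python
from collections import Counter

# Hypothetical sc-status values as parsed from IIS W3C logs
statuses = [200, 200, 200, 200, 404, 404, 500]
counts = Counter(statuses)
total = len(statuses)

# Equivalent of: SELECT sc-status, COUNT(*), MUL(PROPCOUNT(*),100.0) AS Percentage
for status in sorted(counts):
    pct = counts[status] / total * 100.0
    print(status, counts[status], round(pct, 1))
```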


LogParser Non-Web Queries

Parsing Event Viewer files on Vista with LogParser

You need to convert the files from the Vista (.evtx) format to the older .evt format. The following command line will do this:

    wevtutil epl app.evtx app.evt /lf:true

Query For Text Strings in a file

    -i:TEXTLINE "SELECT LTRIM(extract_token(text, 1,'Text to find')) as string FROM *.txt WHERE string is not null"

Pulling data from inside the body string of event viewer logs

    logparser -i:evt "SELECT extract_prefix(extract_suffix(Strings,0,'left text'),0,'right text') as String INTO optimizer.txt FROM *.EVT WHERE Strings LIKE '%Optimizer Results%'" -q:ON

(variation) Pulling data from inside the body string of event viewer logs, constrained by timeframe

    logparser -i:evt -q:ON "SELECT Count(*) AS Qty, SUBSTR(extract_suffix(Message, 0, 'Message :'), 0, 75) as String FROM \\<machine name>\Application WHERE SourceName LIKE '%Enterprise%' AND Message LIKE '%Timestamp: %' AND TimeGenerated > TIMESTAMP('2008-06-06 07:23:15', 'yyyy-MM-dd hh:mm:ss') GROUP BY String ORDER BY Qty DESC"

List of exceptions from saved event logs, searching for keywords in the text output

    -i:evt "SELECT QUANTIZE(TimeGenerated, 3600) AS Hour, COUNT(*) As Total, ComputerName FROM *.evt WHERE EventID = 100 AND strings like '%overflow%' GROUP BY ComputerName, hour"

Logparser command for querying netstat

    netstat.exe -anp TCP | LogParser "SELECT [Local Address] AS Server,[Foreign Address] AS Client,State FROM STDIN WHERE Server LIKE '%:443' OR Server LIKE '%:80'" -i:TSV -iSeparator:space -nSep:2 -fixedSep:OFF -nSkipLines:3 -o:TSV -headers:ON
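The netstat pipeline above amounts to splitting whitespace-separated columns and filtering on the local port; sketched in Python below (the sample output lines are hypothetical, and the three netstat header lines are assumed already skipped, as -nSkipLines:3 does):

```python
# Hypothetical `netstat -anp TCP` output lines, header rows already skipped
lines = [
    "  TCP    10.0.0.5:80      10.0.0.9:50123   ESTABLISHED",
    "  TCP    10.0.0.5:443     10.0.0.7:50456   TIME_WAIT",
    "  TCP    10.0.0.5:3389    10.0.0.8:50789   ESTABLISHED",
]

rows = []
for line in lines:
    proto, server, client, state = line.split()
    # Equivalent of: WHERE Server LIKE '%:443' OR Server LIKE '%:80'
    if server.endswith(":443") or server.endswith(":80"):
        rows.append((server, client, state))
```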

Command to query Active Directory®

    Logparser -i:ADS "SELECT * FROM 'LDAP://Redmond/CN=Microsoft.com FTE,OU=Distribution Lists,DC=redmond,DC=corp,DC=microsoft,DC=com'" -objClass:user

Command to query IIS and get site configuration information

    Logparser "select * from IIS://localhost"

Command to query a Netmon file and list out data on each TCP conversation

    LogParser -fMode:TCPConn -rtp:-1 "SELECT DateTime, TO_INT(TimeTaken) AS Time, DstPayloadBytes, SUBSTR(DstPayload, 0, 128) AS Start_Of_Payload INTO IE-Take2.txt FROM IE-Take2.cap WHERE DstPort=80 ORDER BY DateTime ASC" -headers:ON

Command to query Netmon and find frame numbers based on specific text in the payload

    LogParser -fMode:TCPIP -rtp:-1 "SELECT Frame, Payload INTO 3dvia.txt FROM 3dvia.cap WHERE DstPort=80 AND Payload LIKE '%ppContent%'" -headers:ON

Command to get the logged start time of an entry in custom log files

    LogParser -i:TEXTLINE "SELECT TOP 1 TO_TIME(TO_TIMESTAMP(EXTRACT_PREFIX(Text,2,' '), 'M/dd/yyyy h:mm:ss tt')) AS [Start Time], 'FirstStartTime' FROM *.log WHERE Text LIKE '%text tag to search for%' ORDER BY [Start Time] ASC"


--NEW--Keyboard shortcut for adding "USING" statements automatically

Visual Studio has a very handy feature: if you type in a class name and don't have the using (or Imports, for the VB programmers) statement added to your project, it will give you an option to automatically add it. It lets you know this by putting a little underscore under the first character. If you move your mouse over this and click, a popup window appears; you select your namespace, and it adds the statement for you. It also does something similar if you rename a variable (you get a dropdown to automatically rename all instances), along with other little helper operations that make writing code more efficient.

The hard part has always been getting your mouse over the right spot at the right time. <ctrl><period> is a keyboard shortcut that will drop down that box for you without requiring you to get your mouse cursor on that little spot.


Older articles

Content-Length header not available in Web Request Object

Currently the web request header "Content-Length" is not in the WebTestRequest object. This is expected to be changed in SP1.

SharePoint file upload test may post the file twice

If you have a web test that posts a file to a SharePoint site, the test may try to post the file twice.

SharePoint will only process one copy, but the request time and upload size will be incorrect due to the

double attempt. This occurs if you are using integrated authentication. The client requests a POST

expecting a 100-continue response. It gets a 404 instead (this is expected behavior). However, instead of

VS restarting the request with credentials, it continues posting the initial request (which SharePoint

ignores). When the initial request is done, VS will re-post with credentials, and this post will succeed. A

fix is available in SP1.

Some Hidden Fields are not parameterized within AJAX calls

When recording web tests with AJAX panel updates, you may find some FORM POST parameters where

HIDDEN values (such as VIEWSTATE) are not parameterized. From an email thread:

The problem is that in the Microsoft-Ajax partial rendering (update panel) responses, hidden fields can

appear in two places: a field that is marked by the type "|hiddenField|" (where we were looking), but

also in a regular hidden field input tag in the HTML within an "|updatePanel|" field in the Ajax response

(which we were not looking at).

A fix for this issue is in SP1.
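To illustrate the two locations, here is a Python sketch over a fabricated partial-rendering response (the delta string and regular expressions are illustrative only, not the parser VS actually uses):

```python
import re

# Fabricated Microsoft-Ajax partial-rendering (UpdatePanel) response fragment,
# pipe-delimited as length|type|id|content|
delta = ('123|updatePanel|UpdatePanel1|'
         '<input type="hidden" name="__VIEWSTATE" value="AbC123" />|'
         '22|hiddenField|__EVENTVALIDATION|XyZ789|')

# Location 1: fields explicitly marked with the |hiddenField| type
marked = re.findall(r'\|hiddenField\|([^|]+)\|([^|]*)\|', delta)

# Location 2: ordinary hidden <input> tags inside the |updatePanel| HTML,
# which is where the pre-SP1 recorder was not looking
inline = re.findall(r'<input type="hidden" name="([^"]+)" value="([^"]*)"', delta)
```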

(FIX) Unit Test threading models and changing them

The default threading model for unit tests is STA. The fix in SP1 was to have load tests honor this setting (previously, a unit test in a load test would not honor the ApartmentState property). See the following blog for more info:

http://blogs.msdn.com/irenak/archive/2008/02/22/sysk-365-how-to-get-your-unit-tests-test-project-in-visual-studio-2008-a-k-a-mstest


Bug in VS 2008 SP1 causes think time for redirected requests to be ignored in a load

test

When a web test is run in a load test, any test requests that result in redirects suffer from a timing bug.

Any think time that is specified on the request is ignored. This is fixed in a POST-SP1 hotfix:

KB 956397 (http://support.microsoft.com/kb/956397/en-us)

http://blogs.msdn.com/billbar/archive/2008/08/04/bug-in-VS-2008-sp1-causes-think-time-for-redirected-requests-to-be-ignored-in-a-load-test.aspx

New Load Test Plugin Enhancements in VS 2008 SP1

http://blogs.msdn.com/billbar/pages/load-test-api-enhancements-in-VS-2008-sp1-beta.aspx

Four New Methods added to the WebTestPlugin Class for 2008 SP1

http://blogs.msdn.com/billbar/pages/web-test-api-enhancements-available-in-VS-2008-sp1-beta.aspx


Index

.NET Garbage Collection, 5, 75

AJAX, 9, 25, 46, 122, 142, 151, 155, 248

Application Domain, 14, 190

authentication, 7, 11, 89, 135, 174, 175, 177, 204, 248

Caching, 4, 5, 9, 17, 18, 20, 82, 100, 186, 224

caspol, 116

CodePlex, 3, 9, 226, 243

context, 8, 12, 13, 25, 31, 41, 45, 46, 51, 69, 71, 109, 112, 119, 135, 137, 144, 145, 150, 153, 161, 185, 186, 187, 188, 190, 192, 193, 194, 196, 197, 198, 205, 210, 218, 223, 225, 235, 236

correlation, 25, 140, 144, 151, 235, 236, 241, 242

CSV Files, 6, 111, 112

Data Collectors, 109

data source, 4, 6, 8, 16, 31, 42, 43, 44, 98, 101, 111, 112, 148, 186, 208, 212

declarative web test, 6, 8, 11, 32, 41, 45, 69, 70, 110, 113, 117, 150, 152, 184, 185, 187, 198, 206, 207, 224, 238

dependent requests, 6, 8, 9, 13, 17, 33, 37, 46, 47, 94, 108, 109, 122, 184, 186, 189, 214, 224

Deployment, 5, 73, 81, 82, 84, 89, 203, 218, 219, 220, 221, 222, 223

Execution Interleaving, 27

extract, 4, 6, 12, 25, 26, 33, 34, 35, 36, 37, 39, 41, 42, 43, 44, 51, 119, 144, 146, 148, 149, 150, 152, 153, 185, 194, 224, 226, 235, 237, 238, 239, 241, 242, 246

Fiddler, 5, 7, 72, 110, 127, 132

HIDDEN parameters, 9, 119, 185, 248

HTTP Headers, 4, 11, 48, 94, 109, 125, 178, 186, 216, 217, 246

Content-Type, 7, 11, 129

If-Modified-Since, 8, 186

Pragma, 11

Referrer, 11

SOAPAction, 11

x-microsoftajax, 11

Internet Explorer, 17, 72, 178, 186, 217

IP Address, 5, 77

Licensing, 4, 61, 117

Load Test Options

Agents to Use, 68

Delay Start Time, 68

Disable During Warmup, 68

Logging, 6, 7, 117, 122, 153, 178, 180

lusrmgr.msc, 207

MSTest, 7, 15, 27, 109, 130, 164, 166, 170, 172, 215, 248

Network

Firewall, 161, 164, 165, 166, 167, 172, 173

Netmon, 110, 246

Netstat, 118, 246

TCP Parameters, 118

TCPView, 202

Tracing, 6, 7, 9, 103, 137, 138, 168, 170, 171, 176, 244

NUnit, 27

Parameter Data

Data Source, 4, 6, 8, 16, 31, 98, 101, 111, 148, 208, 212

Random, 4, 8, 16, 164, 165, 187

Sequential, 4, 16, 65, 208

Unique, 4, 8, 16, 77, 94, 105, 109, 144, 183, 187, 193, 208, 209

Parameters, 8, 13, 26, 46, 48, 49, 51, 67, 112, 118, 119, 135, 143, 144, 145, 146, 147, 148, 149, 150, 151, 153, 180, 184, 186, 187, 190, 192, 193, 194, 196, 197, 198, 205, 217, 221, 223, 235, 236, 237, 238, 239, 241, 242, 248

performance counters, 5, 7, 22, 82, 90, 92, 93, 94, 103, 114, 132, 183, 244

Performance Monitor, 5, 93, 94

Permissions, 165, 166

phishing, 113

processor, 6, 22, 63, 91, 92, 117

proxy server, 5, 8, 72, 129, 132, 134, 135, 204, 210

random, 4, 8, 16, 164, 165, 187

redirection, 8, 196

regedit, 11, 81

REGISTRY Settings

HKEY_CURRENT_USER, 11, 171, 178, 210

HKEY_LOCAL_MACHINE, 81, 113, 118, 172

Reporting Name, 4, 13, 52, 53, 155

RequestHeadersToRecord, 11

Results

WebTestResult, 6, 107, 108, 109, 147

SOAP, 185

SSL

Certificates, 8, 185, 195

HTTPS, 6, 113, 115

ServicePointManager, 8, 129, 195

SecurityProtocol, 195

SecurityProtocolType, 195

ServicePointManager.ServerCertificateValidationCallback, 195

SSLv3, 8, 195

TLS, 8, 195

X509Certificate, 185, 195

Symbols, 82

Sysinternals


PsTools, 202

Sysinternals, 171

TeamTestAgentService, 165, 167, 207

test rig, 4, 5, 8, 22, 29, 30, 31, 61, 74, 76, 78, 80, 81, 82, 113, 117, 167, 169, 177, 190, 207, 224, 225

TIME_WAIT, 118

timeouts, 4, 5, 9, 15, 46, 73, 84, 92, 120, 174, 176, 224

Transactions, 6, 32, 57, 58, 99, 102, 103, 104, 105, 114

URL, 52, 67, 105, 116, 126, 185, 196, 197, 198, 211, 235, 236

validate, 7, 8, 9, 12, 13, 32, 33, 34, 35, 36, 37, 39, 40, 41, 42, 43, 44, 135, 144, 148, 152, 155, 177, 180, 188, 195, 196, 198, 223, 226

verbose logging, 180, 182, 183

VIEWSTATE, 25, 109, 180, 181, 228, 231, 233, 234, 248

Virtual User Pack, 61, 62, 63, 64, 71

VSTT 2010

branching, 144, 152, 198

conditional rule, 152, 153, 198, 199

looping, 16, 144, 152, 153, 198

VSTT Classes

FormPostHttpBody, 184

IHttpBody, 184

StringHttpBody, 125, 184, 207

WebTest, 8, 32, 33, 34, 35, 36, 37, 38, 40, 41, 42, 43, 44, 45, 66, 72, 135, 188, 193, 194, 195, 196, 197, 204, 206, 210, 224

WebProxy, 72, 204

WebTestContext, 7, 137, 188, 193, 205

WebTestPlugin, 9, 11, 40, 45, 186, 189, 193, 204, 224, 249

WebTestRequest, 17, 41, 45, 94, 125, 149, 150, 185, 186, 188, 189, 196, 206, 238, 239, 248

ClientCertificates, 185

WebTestResponse, 188, 234

VSTT Configuration Files

Counterset files

DefaultCounter, 93

DefaultCountersForAutomaticGraphs, 93

HigherIsBetter, 91

LoadTestCounterCategoryExistsTimeout, 92

LoadTestCounterCategoryReadTimeout, 92

Range, 91

RangeGroup, 91

QTAgent.exe.config, 75, 124, 180, 183

QTAgentService.exe.config, 72, 182

QTAgentServiceUI.exe.config, 180

QTController.exe.config, 74, 92, 180

Test Run Configuration files, 84

Test Setting files, 84

VSTestHost.exe.config, 74, 75, 92

VSTT Extraction Rules

ExtractionRuleInnerText, 226

ExtractionRuleSelectTag, 226

VSTT Methods

Add Call to Web Test, 66

adding a context parameter, 186, 188

ClassCleanUp, 27, 28

ClassInitialize, 27, 28, 98, 129, 209

GetRequestEnumerator, 41, 188, 195

PostRequest, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44,

186, 188, 189

PostRequestEvent, 195

StringHttpBody, 125, 184, 207

System.Net.HttpWebRequest, 115, 125

TestCleanUp, 28

TestInitialize, 27, 28

WebTestExecutionInstruction, 196, 197

VSTT Plugins

LoadTestAborted, 13, 38

LoadTestFinished, 13, 38

LoadTestStarting, 13, 38, 210

LoadTestWarmupComplete, 13, 38, 39

PostPage, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44

PostRequest, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44,

186, 188, 189

PostTransaction, 12, 34, 40, 41, 43

PostWebTest, 12, 38, 40, 41, 44, 188, 193

PrePage, 11, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44

PreRequest, 8, 12, 33, 34, 35, 36, 37, 40, 41, 42, 43, 44,

194, 195, 196, 197, 207

PreTransaction, 11, 12, 32, 40, 41, 42

PreWebTest, 11, 12, 32, 40, 41, 42, 45, 204, 224

TestFinished, 13, 38, 39

TestSelected, 13, 38, 39

TestStarting, 13, 31, 38, 39

ThresholdExceeded, 13, 38, 39

VSTT Properties

All Individual Details, 54, 102, 117

Cache Control, 17

EventTarget, 194

Follow Redirects, 185

Goal Based Load Pattern, 22, 114

Initial User Count, 22

LoadTestMaxErrorsPerType, 74

Lower Values Imply Higher Resource Use, 22

MaximumUserCount, 22

Percentage of New Users, 17, 18, 176

ResponseBodyCaptureLimit, 45, 110, 215, 224

Run unit tests in application domain, 4, 14

Sample Rate, 22, 93

Statistics Only, 102


Stop Adjusting User Count When Goal Achieved, 22

Target Range for Performance Counter, 22

Test Iterations, 4, 6, 14, 15, 18, 23, 119, 120

Think Time, 4, 6, 8, 9, 13, 15, 23, 57, 104, 112, 155, 187, 192, 249

TimingDetailsStorage, 6, 54, 57, 102, 117, 136, 176, 202

Use Test Iterations, 14

WebTestIteration, 193

VSTT Runtime

QTAgent, 75, 119, 124, 180, 183

QTController, 74, 81, 92, 174, 180

VSTestHost, 74, 75, 92, 120

VSTT Settings

Administer Test Controllers, 68, 81, 169, 207

Analysis Comment, 9, 176, 226

Follow Redirects, 185

VSTT Test Types

Sequential Test Mix, 4, 65

Web Test Composition, 66

VSTT Validation Rules

ValidationRuleInnerText, 226

ValidationRuleSelectTag, 226
