The data ingestion patterns presented here are all designed for high ingestion rates. When implementing them in Windows Azure, it is imperative to test each pattern under sufficient load to understand how it will scale in a production system.
While developing these sample applications, Visual Studio Test provided the load engine; it works with both self-hosted TFS configurations and TFS-as-a-Service.

The Test Model

The included load-test sample builds upon a hypothetical sensor-tracking system that transmits:
  • Timestamp
  • Latitude/longitude
  • Current speed
The load generator creates random data representing these fields. To provide content suitable for analysis, the latitude/longitude values are bound to a specific geographic area: Seattle, Washington.
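As a minimal sketch of such a generator (field names, the speed range, and the exact bounding box are assumptions, not the sample's actual values), random records can be produced with latitude/longitude confined to a box roughly covering Seattle:

```javascript
// Bounding box roughly covering Seattle, Washington (assumed coordinates).
const SEATTLE = { latMin: 47.5, latMax: 47.7, lonMin: -122.45, lonMax: -122.25 };

function randomBetween(min, max) {
  return min + Math.random() * (max - min);
}

// Produce one random sensor record with the three fields listed above.
function makeSensorRecord() {
  return {
    timestamp: new Date().toISOString(),                       // timestamp
    latitude: randomBetween(SEATTLE.latMin, SEATTLE.latMax),   // latitude
    longitude: randomBetween(SEATTLE.lonMin, SEATTLE.lonMax),  // longitude
    speed: randomBetween(0, 70)                                // current speed; 0-70 is an assumed range
  };
}

console.log(makeSensorRecord());
```

Because the coordinates always fall inside one metropolitan area, the generated stream remains meaningful for downstream geographic analysis.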

Data ingestion

Data created by the load-test framework is sent directly to the front end via one of the defined inputs described in Data Sources:
  • IIS/Web API
  • OWIN/Web API
  • Node.js

Throughput measurement

Visual Studio reports throughput as inserts/second, measured specifically at the Data Source ingestion point.
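The metric itself is simply the insert count divided by elapsed time; a back-of-the-envelope version of the calculation (not Visual Studio's implementation) looks like:

```javascript
// Inserts/second over a measurement window, analogous to the figure the
// load-test report shows for the Data Source ingestion point.
function insertsPerSecond(insertCount, elapsedMs) {
  return insertCount / (elapsedMs / 1000);
}

console.log(insertsPerSecond(120000, 60000)); // 120,000 inserts in 60 s → 2000
```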

Load generation recipe

The included load-generation configuration starts with a minimal load so that code on all Front End instances can warm up adequately:
  • Compile any on-demand code (JIT)
  • Allocate any needed system resources
Windows Azure Storage also adapts itself to a particular load pattern, rebalancing storage partitions over time; this typically takes around 20 minutes of sustained load. It's important to allow this rebalancing to complete before gathering measurements.
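The recipe above can be sketched as a stepped ramp-up schedule. The step boundaries and virtual-user counts below are illustrative assumptions, not the sample's actual settings; the point is to hold a minimal load first, then sustain load long enough for storage partitions to rebalance before full-load measurements begin:

```javascript
// Target virtual-user count as a function of elapsed test time.
function targetUsers(elapsedMinutes) {
  if (elapsedMinutes < 5)  return 10;   // warm-up: JIT compilation, resource allocation
  if (elapsedMinutes < 25) return 100;  // sustained load while storage partitions rebalance (~20 min)
  return 1000;                          // full load: begin gathering measurements
}

for (const t of [0, 5, 25]) {
  console.log(`minute ${t}: ${targetUsers(t)} virtual users`);
}
```

In Visual Studio this shape corresponds to a step load pattern in the load-test scenario configuration.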

Considerations for Hosted TFS

TFS-as-a-Service provides a limited quota for load testing. This may need to be supplemented with a self-hosted TFS installation.

Last edited Nov 8, 2013 at 5:37 PM by DMakogon, version 1