Commit abd257b

PerfTests: documentation for the configuration system
1 parent b25a711 commit abd257b

2 files changed: +35 −24 lines


PerfTests/Docs/Concepts.md

Lines changed: 21 additions & 20 deletions
@@ -1,4 +1,4 @@
-# CodeJam.PerfTests.
+# CodeJam.PerfTests overview
 
 > **META-NOTE**
 >
@@ -15,7 +15,7 @@
 The main thing introduced by CodeJam.PerfTests is the concept of a _competition_. A competition plays the same role as the Benchmark class in BenchmarkDotNet: it contains the methods to measure. The main differences between benchmarks and competitions are:
 
 * Competitions always include a baseline method. The baseline is required to provide relative timings (see below). Use the `[CompetitionBaseline]` attribute to mark the baseline method.
-* Competitions are meant to be run multiple times and their results should be comparable even if previous run was performed on another machine. Therefore competition results are stored as a relative-to-baseline timings.
+* Competitions are meant to run multiple times and their results should be comparable even if the previous run was performed on another machine. Therefore competition results are stored as relative-to-baseline timings.
 * Competition methods (except the baseline method) are annotated with competition limits that describe the expected execution time (relative-to-baseline time is used). Use `[CompetitionBenchmark]` to mark a competition method and set limits for it.
 * The `Competition.Run()` method should be used to run the competition (BenchmarkDotNet uses `BenchmarkRunner.Run()`).
 * A single competition run can invoke `BenchmarkRunner.Run()` multiple times (for example, additional runs are performed if competition limits were adjusted).
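To make the bullets above concrete, here is a minimal competition sketch. The attributes and `Competition.Run(this)` come from this document; the NUnit `[Test]` entry point, the field setup and the numeric limit arguments of `[CompetitionBenchmark]` are illustrative assumptions, not the library's confirmed API.

```c#
using CodeJam.PerfTests;
using NUnit.Framework;

public class StringJoinCompetition
{
	private readonly string[] _parts = { "a", "b", "c" };

	// Entry point: runs all competition methods declared in this class.
	[Test]
	public void RunStringJoinCompetition() => Competition.Run(this);

	// Baseline method; all other timings are stored relative to it.
	[CompetitionBaseline]
	public string Concat() => string.Concat(_parts);

	// Expected to run 1.5x..2.5x slower than the baseline;
	// the limit argument form is an assumption.
	[CompetitionBenchmark(1.5, 2.5)]
	public string Join() => string.Join(string.Empty, _parts);
}
```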
@@ -28,7 +28,7 @@ The main thing introduced by CodeJam.PerfTests is a concept of _competition_. Co
 
 In addition to the list above there are some limitations:
 
-* Competitions use its own configuration system. Please do not apply BenchmarkDotNet's `[Config]` attributes to the competition classes, the behavior is undefined.
+* Competitions use their own configuration system. Please do not apply BenchmarkDotNet's `[Config]` attributes to competition classes; the resulting behavior is undefined.
 
 * Competitions do not support diagnosers by default. You need to set up a toolchain from BenchmarkDotNet to enable diagnosers.
 
@@ -57,7 +57,7 @@ In additional to the list above there are some limitations:
 > public void SomeMethod() => ...
 > ```
 >
-> Not a best solution, I agree. But at least it does not tease your brain with "What limits should I rely on?"
+> Not the best solution, I do agree. But at least it does not tease your brain with "What limits should I rely on?".
 >
 > If you want to do a quick investigation of multiple cases, consider using a raw BenchmarkDotNet benchmark.
 >
@@ -69,11 +69,11 @@ In additional to the list above there are some limitations:
 
 ## Configuration system
 
-CodeJam.PerfTests configuration uses almost same approach the BenchmarkDotNet do. However, there are additions aimed to ease configuration of large projects with hundreds or thousands of perftetests. Here's how it works:
+CodeJam.PerfTests configuration uses almost the same approach as BenchmarkDotNet does. However, there are additions aimed at easing configuration of large projects with hundreds or thousands of perftests. Here's how it works:
 
 ### 0. Attribute annotations
 
-Almost all configuration features rely on attribute annotations. Attributes are checked in the following order:
+Almost all configuration features rely on attribute annotations. Attributes are checked in the following order:
 
 1. Attributes applied to the competition class or to its base types.
 2. Attributes applied to the container types or to their base types (if the competition class is a nested type).
@@ -83,8 +83,8 @@ If the configuration system expects only one attribute (as with `CompetitionConf
 
 If multiple attributes are supported (`CompetitionFeaturesAttribute`, for example), they are applied in reversed order: assembly-level attributes go first, container type attributes are next and the competition class attributes are the last ones.
 
-***~NOTE~***
-
+> ***~NOTE~***
+>
 > There's no ordering for attributes applied at the same level. If there are multiple attributes applied to the type or to the assembly, they are enumerated in random order.
 
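A sketch of how the ordering rules above play out. It assumes `CompetitionFeaturesAttribute` exposes settable properties mirroring the `CompetitionFeatures` class shown later in this document; `TroubleshootingMode` appears in this diff, while `AnnotateSources` is a hypothetical property used only for illustration.

```c#
using CodeJam.PerfTests;

// Applied first: the assembly-level annotation.
[assembly: CompetitionFeatures(TroubleshootingMode = true)]

// Applied next: the annotation on the container type.
[CompetitionFeatures(AnnotateSources = true)]
public static class CompetitionContainer
{
	// Applied last: the annotation on the competition class itself,
	// so its settings win when the same property is set twice.
	[CompetitionFeatures(TroubleshootingMode = false)]
	public class NestedCompetition
	{
		// ...competition methods...
	}
}
```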
@@ -93,7 +93,7 @@ If multiple attributes supported (`CompetitionFeaturesAttribute` as example), th
 
 > **NOTE**
 >
-> Explicit config passing is an advanced technique and should be used only when you want to have a perfect control over the configuration. It skips entire configuration pipeline and therefore it's up to you to pass correct config into competition.
+> Explicit config passing is an advanced technique and should be used only when you want to have perfect control over the configuration. It skips the entire configuration pipeline and therefore it's up to you to pass a correct config into the competition.
 
 
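A sketch of explicit config passing, assuming a `Competition.Run(object, ICompetitionConfig)`-style overload and that `ManualCompetitionConfig` can be constructed and filled directly; both are assumptions based on the type names this document mentions, and the namespace import is a guess.

```c#
using CodeJam.PerfTests;
using CodeJam.PerfTests.Configs; // assumed namespace for the config types
using NUnit.Framework;

public class ExplicitConfigCompetition
{
	[Test]
	public void RunExplicitConfigCompetition()
	{
		// Built by hand: the declarative pipeline (attributes,
		// features, modifiers) is skipped entirely.
		var config = new ManualCompetitionConfig();
		// ...fill jobs, analysers, exporters and so on...

		Competition.Run(this, config);
	}

	// ...competition methods...
}
```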
@@ -182,7 +182,7 @@ When the test is run the configuration system will check the competition's type,
 
 > **NOTE**
 >
-> All declarative config annotations are honored only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
+> All declarative config annotations apply only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
 
 It should be obvious by now that CodeJam.PerfTests has a very complex configuration system. At the same time, most end-user use cases are very simple. You may want to enable or disable source annotations, specify the target platform or just enable troubleshooting mode. You do not want to know anything about the configs or which properties should be changed to enable a particular scenario. Meet the `CompetitionFeatures`.
 
@@ -200,7 +200,7 @@ As with explicit config scenario, features should be passed explicitly only when
 	this,
 	new CompetitionFeatures
 	{
-		// Tunes config to Detailed logging, allow debug builds, export measure
+		// Detailed logging, allow debug builds, export measurements and so on
 		TroubleshootingMode = true,
 		// We do not care whether the benchmark is run as x86 or x64
 		Platform = Platform.AnyCpu
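For readability, here is the fragment above completed into a full call. The surrounding `Competition.Run(...)` invocation and the `[Test]` wrapper are inferred from context; they are not shown in this diff.

```c#
[Test]
public void RunCompetitionWithFeatures() =>
	Competition.Run(
		this,
		new CompetitionFeatures
		{
			// Detailed logging, allow debug builds, export measurements and so on
			TroubleshootingMode = true,
			// We do not care whether the benchmark is run as x86 or x64
			Platform = Platform.AnyCpu
		});
```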
@@ -269,7 +269,7 @@ Want to add CI service or have an idea howto make the feature better? *~Create a
 
 #### 2.4 Set competition features via attributes
 
-While default features can be good for most perftests there always are tests that require individual approach. To adjust the features just apply `[CompetitionFeatures]` attribute (or any derived attribute) to the competition class, container type (if the competition class is a nested type) or to the assembly. Check the *~Attribute annotations TODO: link*~* section for explanation how the attributes are applied.
+While the default features can be good for most perftests, there always are tests that require their own feature set. If you want to add (or disable) some particular features, apply the `[CompetitionFeatures]` attribute (or any derived attribute) to the competition class, the container type (if the competition class is a nested type) or to the assembly. Check the *~Attribute annotations TODO: link*~* section for an explanation of how the attributes are applied.
 
 Here's an example that covers all possible annotations for the competition features.
 
@@ -312,7 +312,7 @@ Here's example that covers all possible annotations for the competition features
 
 > **NOTE**
 >
-> All declarative config annotations are honored only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
+> All declarative config annotations apply only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
 
 Okay, you've set up competition features but you want to change some options that are not exposed as competition features. CodeJam.PerfTests provides the `ICompetitionModifier` interface for tasks like this. Implement your own
 
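The diff cuts off just before the modifier example, so here is a hedged sketch. It assumes `ICompetitionModifier` exposes a single method that mutates a `ManualCompetitionConfig`; the member name and signature are assumptions, not the confirmed contract.

```c#
// A custom modifier that tweaks one option not exposed as a feature.
public class NoExportersModifier : ICompetitionModifier
{
	// Assumed contract: called with the config under construction.
	public void Modify(ManualCompetitionConfig competitionConfig) =>
		competitionConfig.Exporters.Clear();
}
```

Presumably the modifier is then attached declaratively, via a `[CompetitionModifier(typeof(NoExportersModifier))]`-style attribute; the next hunk only confirms that modifiers can be combined like `CompetitionFeaturesAttribute`.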
@@ -367,32 +367,34 @@ As with `CompetitionFeaturesAttribute`, modifiers can be combined together. Chec
 
 > **NOTE**
 >
-> All declarative config annotations are honored only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
+> All declarative config annotations apply only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
 
 > **NOTE**
 >
 > As with explicit config passing, this is an advanced feature and it is recommended to check for existing implementations and study them first. There's no safety net anymore.
 
-If all of the above is not enough for you there's a backdoor: you can override entire creation pipeline. Implement `ICompetitionConfigFactory` or derive from existing one:
+If all of the above is not enough for you, there's a backdoor: you can override the entire config factory pipeline. Implement `ICompetitionConfigFactory` or derive from an existing one:
 
 ```c#
 public class MyCompetitionFactory : CompetitionConfigFactory
 {
 	public MyCompetitionFactory(string configId) : base(configId) { }
 
-	protected override CompetitionFeatures CompleteFeatures(CompetitionFeatures competitionFeatures)
+	protected override CompetitionFeatures CompleteFeatures(
+		CompetitionFeatures competitionFeatures)
 	{
 		// Disable CI support.
 		competitionFeatures.ContinuousIntegrationMode = false;
 
 		return base.CompleteFeatures(competitionFeatures);
 	}
 
-	protected override ICompetitionConfig CompleteConfig(ManualCompetitionConfig competitionConfig)
+	protected override ICompetitionConfig CompleteConfig(
+		ManualCompetitionConfig competitionConfig)
 	{
 		// No idea what to do here. Let's sort something
-		competitionConfig.Analysers.Sort(
-			(IAnalyser a, IAnalyser b) => String.Compare(a.Id, b.Id, StringComparison.Ordinal));
+		competitionConfig.Analysers.Sort((IAnalyser a, IAnalyser b) =>
+			String.Compare(a.Id, b.Id, StringComparison.Ordinal));
 
 		// and remove some stuff.
 		competitionConfig.Exporters.Clear();
@@ -419,7 +421,6 @@ and apply it to the benchmark class, it's container class (if the benchmark clas
 
 	// ...
 }
-
 ```
 
 When the test is run, the configuration system will check the competition's type, its container type (if any) and the competition's assembly for the `CompetitionConfigFactoryAttribute`. The first found attribute wins.
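For completeness, a sketch of the application step. It assumes `CompetitionConfigFactoryAttribute` accepts the factory type in its constructor; the exact signature is not shown in this diff.

```c#
// First found attribute wins: the class level is checked before
// the container type and the assembly.
[CompetitionConfigFactory(typeof(MyCompetitionFactory))]
public class MyCompetition
{
	// ...competition methods...
}
```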

PerfTests/PerfTests.ToDo.md

Lines changed: 14 additions & 4 deletions
@@ -13,22 +13,23 @@
 * Skip annotation on 1st run
 * Exclude nunit-related assembly from tests
 Prepare PR, https://github.com/nunit/nunit-console/issues/62#issuecomment-262599181
-* absolute timings on not match warning
+* Message with absolute timings if limits failed: improve readability
 * Burst mode feature: rename to LargeSampleSet?
 * AnnotateSourcesOnRun: rename to something like skipFirstRuns
-* WithCompetitionOptions - preserve Id!
 * Output: option to not log output from toolchain?
 * Concurrency: lock should be performed on entire benchmark run.
 * Logging: write validator messages immediately?
 * Log resulting competition features / competition options?
 * LogColors.Hint: use it for something?
 * Better message for "X has empty limit. Please fill it." + do not rerun if empty limit
-* Better message for "run faster /slower than". Provide some suggestions?
+* Better message for "run faster / slower than". Provide some suggestions?
 * Warning if job count > 1
+* Apply with id for Competition options / features
+* Metadata attributes - order by inheritance
 
 ## TODOs (tests):
 * Source annotations: test for partial files / methods
-* high-priority test for TestProcessCycleTimeClock
+* High-priority test for TestProcessCycleTimeClock
 * Tests for broken log annotations.
 * app.config in the test integration projects: do we need it?
 * xUnit: tests: run as x64?
@@ -86,3 +87,12 @@ https://github.com/xunit/xunit/issues/908
 
 ### Layer 7: Reusable parts of the runners
 * Wrapping all of above into simple, configurable and reusable API
+
+
+## Long-term task: reusable limits, draft notes
+* Support for third-party limits, use limit provider + id
+* Target stores limits as a `Dictionary<provider_id, Range<double>>`
+* Limit provider specifies attribute name and additional parameters to be applied
+TODO: exact format?
+TODO: Use same properties for XML annotations or prefer something better?
+TODO: Range extension method: Min/MaxValue to infinity?
