# CodeJam.PerfTests overview
> **META-NOTE**
>
The main thing introduced by CodeJam.PerfTests is the concept of a _competition_. A competition plays the same role as a benchmark class in BenchmarkDotNet: it contains the methods to measure. The main differences between benchmarks and competitions are:
* Competitions always include a baseline method. The baseline is required to provide relative timings (see below). Use the `[CompetitionBaseline]` attribute to mark the baseline method.
* Competitions are meant to run multiple times and their results should be comparable even if the previous run was performed on another machine. Therefore competition results are stored as relative-to-baseline timings.
* Competition methods (except the baseline method) are annotated with competition limits that describe the expected execution time (relative-to-baseline time is used). Use `[CompetitionBenchmark]` to mark a competition method and set limits for it.
* The `Competition.Run()` method should be used to run the competition (BenchmarkDotNet uses `BenchmarkRunner.Run()`).
* A single competition run can invoke `BenchmarkRunner.Run()` multiple times (for example, additional runs are performed if competition limits were adjusted). A minimal competition illustrating these points is sketched below.
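
To make this concrete, here is a minimal sketch of a competition class. It assumes an NUnit-style test method as the entry point; `[CompetitionBaseline]`, `[CompetitionBenchmark]` and `Competition.Run()` come from this document, while the limit values and the `Thread.SpinWait` payloads are purely illustrative.

```c#
using System.Threading;

using CodeJam.PerfTests;

using NUnit.Framework;

public class SimplePerfTest
{
	private const int Count = 200 * 1000;

	// Competitions are run via Competition.Run(), not BenchmarkRunner.Run().
	[Test]
	public void RunSimplePerfTest() => Competition.Run(this);

	// The baseline method provides the timing all other results are measured against.
	[CompetitionBaseline]
	public void Baseline() => Thread.SpinWait(Count);

	// Expected to be roughly three times slower than the baseline;
	// the (min, max) relative-to-baseline limits are illustrative.
	[CompetitionBenchmark(2.8, 3.2)]
	public void SlowerX3() => Thread.SpinWait(3 * Count);
}
```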
In addition to the list above there are some limitations:
* Competitions use their own configuration system. Please do not apply BenchmarkDotNet's `[Config]` attributes to competition classes; the resulting behavior is undefined.
* Competitions do not support diagnosers by default. You need to set up a toolchain from BenchmarkDotNet to enable diagnosers.
> public void SomeMethod() => ...
> ```
>
> Not the best solution, I do agree. But at least it does not tease your brain with "What limits should I rely on?".
## Configuration system
CodeJam.PerfTests configuration uses almost the same approach as BenchmarkDotNet does. However, there are additions aimed at easing the configuration of large projects with hundreds or thousands of perftests. Here's how it works:
### 0. Attribute annotations
Almost all configuration features rely on attribute annotations. Attributes are checked in the following order:
1. Attributes applied to the competition class or to its base types.
2. Attributes applied to the container type or to its base types (if the competition class is a nested type).
> There's no ordering for attributes applied at the same level. If there are multiple attributes applied to the type or to the assembly, they are enumerated in random order.
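
To illustrate the lookup levels described above, here is a rough sketch. The type names are made up, the `[CompetitionFeatures]` attribute is described later in this document, and both applying it without arguments and the exact place of the assembly level in the lookup order are assumptions made for the illustration.

```c#
using CodeJam.PerfTests;

// Assembly-level annotation (also supported; its place in the lookup order
// is assumed here).
[assembly: CompetitionFeatures]

// 2. The container type of a nested competition class (hypothetical name).
[CompetitionFeatures]
public class ContainerType
{
	// 1. The competition class itself is checked first (hypothetical name).
	[CompetitionFeatures]
	public class NestedCompetition
	{
		// competition methods go here
	}
}
```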
> **NOTE**
>
> Explicit config passing is an advanced technique and should be used only when you want to have perfect control over the configuration. It skips the entire configuration pipeline and therefore it's up to you to pass the correct config into the competition (a short sketch follows this note).
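
A minimal sketch of what explicit config passing might look like. The `Competition.Run(instance, config)` overload is assumed from the note above, and `MyCompetitionConfig` stands for a hypothetical, hand-written config type; nothing here is the library's confirmed API.

```c#
public class ExplicitConfigPerfTest
{
	[Test]
	public void RunWithExplicitConfig() =>
		// The config is built entirely by you; attribute-based configuration
		// is skipped, so the config must be complete and correct on its own.
		// MyCompetitionConfig is a hypothetical config class.
		Competition.Run(this, new MyCompetitionConfig());

	// ... competition methods ...
}
```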
> **NOTE**
>
> All declarative config annotations apply only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
It should be obvious by now that CodeJam.PerfTests has a very complex configuration system. At the same time, most end-user use cases are very simple. You may want to enable or disable source annotations, specify the target platform or just enable troubleshooting mode. You do not want to know anything about the configs or what properties should be changed to enable a particular scenario. Meet the CompetitionFeatures.
For example, features can be passed directly to `Competition.Run()`:

```c#
Competition.Run(
	this,
	new CompetitionFeatures
	{
		// Detailed logging, allow debug builds, export measurements and so on
		TroubleshootingMode = true,
		// We do not care whether the benchmark is run as x86 or x64
		Platform = Platform.AnyCpu
	});
```
#### 2.4 Set competition features via attributes
While default features can be good for most perftests, there are always tests that require their own feature set. If you want to add (or disable) some particular features, apply the `[CompetitionFeatures]` attribute (or any derived attribute) to the competition class, the container type (if the competition class is a nested type) or to the assembly. Check the *~Attribute annotations TODO: link~* section for an explanation of how the attributes are applied.
Here's an example that covers all possible annotations for the competition features.
Okay, you've set up competition features but you want to change some options that are not exposed as competition features. CodeJam.PerfTests provides the `ICompetitionModifier` interface for tasks like this. Implement your own modifier; a rough sketch is given below.
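
Only the `ICompetitionModifier` name comes from this document; the member name `Modify` and the `ManualCompetitionConfig` parameter type below are assumptions made for illustration, so check the library sources for the real signature.

```c#
// Hypothetical modifier: adjusts options that are not exposed as competition features.
public class CustomJobModifier : ICompetitionModifier
{
	// The member name and parameter type are assumed for this sketch.
	public void Modify(ManualCompetitionConfig competitionConfig)
	{
		// tweak jobs, exporters, validators and so on here
	}
}
```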
> **NOTE**
>
> All declarative config annotations apply only if the config was not passed explicitly (as a `Competition.Run()` argument or via `CompetitionConfigAttribute`).
When the test is run, the configuration system will check the competition's type, its container type (if any) and the competition's assembly for the `CompetitionConfigFactoryAttribute`. The first found attribute wins.