docs/getting_started/install.md
+2 −2 (2 additions & 2 deletions)
@@ -1,4 +1,4 @@
-You can access our models through our API (https://github.com/automl/tabpfn-client) or via our user interface built on top of the API (https://www.ux.priorlabs.ai/).
+You can access our models through our API (https://github.com/automl/tabpfn-client), via our user interface built on top of the API (https://www.ux.priorlabs.ai/), or locally.
 
 === "Python API Client (No GPU, Online)"
 
@@ -28,4 +28,4 @@ You can access our models through our API (https://github.com/automl/tabpfn-clie
 !!! warning
     R support is currently under development.
     You can find a work in progress at [TabPFN R](https://github.com/robintibor/R-tabpfn).
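Both access paths in the hunk above (the cloud API client and the local package) expose a scikit-learn-style `fit`/`predict` interface. The sketch below shows only that usage pattern with a hypothetical stand-in class, since the real imports depend on which installation you chose (roughly `from tabpfn import TabPFNClassifier` for local use, `from tabpfn_client import TabPFNClassifier` for the API; check the linked repos for the current names):

```python
# Sketch of the scikit-learn-style workflow both packages follow.
# PlaceholderClassifier is a hypothetical stand-in for TabPFNClassifier,
# used here only to illustrate the fit/predict pattern.

class PlaceholderClassifier:
    """Majority-class baseline standing in for the real model."""

    def fit(self, X, y):
        # Remember the most frequent training label.
        self.majority_ = max(set(y), key=list(y).count)
        return self

    def predict(self, X):
        # Predict the stored majority label for every row.
        return [self.majority_] * len(X)

X_train = [[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]]
y_train = [0, 1, 1]

clf = PlaceholderClassifier().fit(X_train, y_train)
print(clf.predict([[0.2, 0.8]]))  # -> [1]
```

With the real classifier, only the import and constructor change; the `fit`/`predict` calls stay the same.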
docs/getting_started/intended_use.md
+7 −6 (7 additions & 6 deletions)
@@ -3,15 +3,17 @@
 !!! note
     For a simple example getting started with classification see [classification tutorial](../tutorials/classification.md).
 
-We provide a comprehensive demo notebook that guides through installation and functionalities at [Interactive Colab Tutorial (with GPU usage)](https://tinyurl.com/tabpfn-colab-local) and [Interactive Colab Tutorial (without GPU usage)](https://tinyurl.com/tabpfn-colab-online).
+We provide two comprehensive demo notebooks that guide you through installation and functionality: one [Colab tutorial using the cloud](https://tinyurl.com/tabpfn-colab-online) and one [Colab tutorial using a local GPU](https://tinyurl.com/tabpfn-colab-local).
 
 ### When to use TabPFN
 
-TabPFN excels in handling small to medium-sized datasets with up to 10,000 samples and 500 features. For larger datasets, approaches such as CatBoost, XGB, or AutoGluon are likely to outperform TabPFN.
+TabPFN excels in handling small to medium-sized datasets with up to 10,000 samples and 500 features. For larger datasets, methods such as CatBoost, XGBoost, or AutoGluon are likely to outperform TabPFN.
 
 ### Intended Use of TabPFN
 
-While TabPFN provides a powerful drop-in replacement for traditional tabular data models, achieving top performance on real-world problems often requires domain expertise and the ingenuity of data scientists. Data scientists should continue to apply their skills in feature engineering, data cleaning, and problem framing to get the most out of TabPFN.
+TabPFN is intended as a powerful drop-in replacement for traditional tabular data prediction tools where top performance and fast training matter.
+It still requires data scientists to prepare the data using their domain knowledge.
+Data scientists will benefit from applying feature engineering, data cleaning, and problem framing to get the most out of TabPFN.
 
 ### Limitations of TabPFN
 
@@ -21,7 +23,7 @@ While TabPFN provides a powerful drop-in replacement for traditional tabular dat
 
 ### Computational and Time Requirements
 
-TabPFN is computationally efficient and can run on consumer hardware for most datasets. Training on a new dataset is recommended to run on a GPU as this speeds it up significantly. However, TabPFN is not optimized for real-time inference tasks.
+TabPFN is computationally efficient and can run inference on consumer hardware for most datasets. Running training on a new dataset on a GPU is recommended, as this speeds it up significantly. TabPFN is not optimized for real-time inference tasks, but V2 predicts much faster than V1.
 
 ### Data Preparation
 
@@ -33,5 +35,4 @@ TabPFN's predictions come with uncertainty estimates, allowing you to assess the
 
 ### Hyperparameter Tuning
 
-TabPFN provides strong performance out-of-the-box without extensive hyperparameter tuning. If you have additional computational resources, you can further optimize TabPFN's performance using random hyperparameter tuning or the Post-Hoc Ensembling (PHE) technique.
-
+TabPFN provides strong performance out-of-the-box without extensive hyperparameter tuning. If you have additional computational resources, you can automatically tune its hyperparameters using [post-hoc ensembling](https://github.com/PriorLabs/tabpfn-extensions/tree/main/src/tabpfn_extensions/post_hoc_ensembles) or [random tuning](https://github.com/PriorLabs/tabpfn-extensions/tree/main/src/tabpfn_extensions/hpo).
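The random-tuning extension linked in the hunk above boils down to sampling hyperparameter configurations and keeping the best-scoring one. Here is a minimal generic sketch of that idea; the search space and objective are illustrative assumptions, not TabPFN's real ones (`n_estimators` and `softmax_temperature` are named only as plausible example parameters):

```python
import random

# Generic random-search sketch. The search space and the objective below
# are illustrative placeholders, not TabPFN's actual tuning setup.
random.seed(0)

search_space = {
    "n_estimators": [4, 8, 16],
    "softmax_temperature": [0.75, 0.9, 1.0],
}

def sample_config(space):
    """Draw one random configuration from the search space."""
    return {name: random.choice(values) for name, values in space.items()}

def evaluate(config):
    # Placeholder objective; in practice this would be a cross-validated
    # score of the model trained with `config`.
    return config["n_estimators"] / 100 - abs(config["softmax_temperature"] - 0.9)

# Sample 20 configurations and keep the best-scoring one.
best = max((sample_config(search_space) for _ in range(20)), key=evaluate)
print(best)
```

In practice, `evaluate` is the expensive step, so the sample budget is chosen to match the available compute.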