tuned_tabpfn ¶
Hyperparameter Optimization (HPO) for TabPFN models.
This module provides automatic hyperparameter tuning for TabPFN models using Bayesian optimization via Hyperopt. It searches for the best-performing hyperparameters for both the TabPFN model itself and its inference configuration.
Key features:

- Optimized search spaces for classification and regression tasks
- Support for multiple evaluation metrics (accuracy, ROC-AUC, F1, RMSE, MAE)
- Proper handling of categorical features through automatic encoding
- Compatible with both the TabPFN and TabPFN-client backends
- Implements scikit-learn's estimator interface for easy integration
- Built-in validation strategies for reliable performance estimation
Example usage:

```python
from tabpfn_extensions.hpo import TunedTabPFNClassifier

# Create a tuned classifier with 50 optimization trials
tuned_clf = TunedTabPFNClassifier(
    n_trials=50,                         # Number of hyperparameter configurations to try
    metric='accuracy',                   # Metric to optimize
    categorical_feature_indices=[0, 2],  # Indices of categorical features
    random_state=42,                     # For reproducibility
)

# Fit will automatically find the best hyperparameters
tuned_clf.fit(X_train, y_train)

# Use like any scikit-learn estimator
y_pred = tuned_clf.predict(X_test)
```
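Since ROC-AUC is among the supported tuning metrics, the fitted classifier presumably also exposes class probabilities; a minimal continuation of the example above, assuming the standard scikit-learn `predict_proba` contract:

```python
# Assumes the usual scikit-learn classifier contract:
# predict_proba returns an array of shape (n_samples, n_classes)
y_proba = tuned_clf.predict_proba(X_test)
```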
MetricType ¶
Bases: str, Enum
Supported evaluation metrics for TabPFN hyperparameter tuning.
This enum defines the metrics that can be used to evaluate and select the best hyperparameter configuration during optimization.
Values
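The concrete members are not reproduced here. Based on the metrics listed in the module description, the definition plausibly looks like the following sketch (member names and string values are assumptions, not copied from the source):

```python
from enum import Enum

class MetricType(str, Enum):
    # Hypothetical members inferred from the supported-metrics list above
    ACCURACY = "accuracy"  # classification
    ROC_AUC = "roc_auc"    # classification (probability-based)
    F1 = "f1"              # classification
    RMSE = "rmse"          # regression
    MAE = "mae"            # regression
```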
TunedTabPFNBase ¶
Bases: BaseEstimator
Base class for tuned TabPFN models with proper categorical handling.
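The documentation does not spell out what "proper categorical handling" involves; a minimal sketch of the kind of automatic encoding this likely refers to, assuming scikit-learn's `OrdinalEncoder` and numeric non-categorical columns (the helper name is hypothetical, not part of the library's API):

```python
import numpy as np
from sklearn.preprocessing import OrdinalEncoder

def encode_categoricals(X, categorical_feature_indices):
    """Hypothetical helper: ordinal-encode only the declared categorical columns."""
    X = np.asarray(X, dtype=object).copy()
    encoder = OrdinalEncoder(handle_unknown="use_encoded_value", unknown_value=-1)
    X[:, categorical_feature_indices] = encoder.fit_transform(
        X[:, categorical_feature_indices]
    )
    # Assumes all remaining columns are already numeric
    return X.astype(float), encoder
```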
TunedTabPFNClassifier ¶
Bases: TunedTabPFNBase, ClassifierMixin
TabPFN Classifier with hyperparameter tuning and proper categorical handling.
TunedTabPFNRegressor ¶
Bases: TunedTabPFNBase, RegressorMixin
TabPFN Regressor with hyperparameter tuning and proper categorical handling.
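No regression example is given in the source; a sketch mirroring the classifier example above, assuming the constructor arguments are shared via `TunedTabPFNBase`:

```python
from tabpfn_extensions.hpo import TunedTabPFNRegressor

# Constructor arguments assumed to mirror TunedTabPFNClassifier
tuned_reg = TunedTabPFNRegressor(
    n_trials=50,      # Number of hyperparameter configurations to try
    metric='rmse',    # Regression metric to optimize (RMSE or MAE)
    random_state=42,  # For reproducibility
)

tuned_reg.fit(X_train, y_train)  # X_train/y_train as in the example above
y_pred = tuned_reg.predict(X_test)
```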