
PriorLabs is building breakthrough foundation models that understand spreadsheets and databases. While foundation models have transformed text and images, tabular data has remained largely untouched. We're tackling this opportunity with technology that could revolutionize how we approach scientific discovery, medical research, financial modeling, and business intelligence.

TabPFN Integrations

  • API Client


    The fastest way to get started with TabPFN. Access our models through the cloud without requiring local GPU resources; a short usage sketch follows this list.

    TabPFN Client

  • User Interface


    Visual interface for no-code interaction with TabPFN. Perfect for quick experimentation and visualization.

    Access GUI

  • Python Package


    Local installation for research and privacy-sensitive use cases, with GPU support and a scikit-learn-compatible interface; a short usage sketch follows this list.

    TabPFN Local

  • R Integration


    Bringing TabPFN's capabilities to the R ecosystem for data scientists and researchers. We have an experimental R package and an alternative tutorial on usage in R. Contributions welcome!
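
The API client exposes a scikit-learn-style estimator while the actual inference runs in the PriorLabs cloud. A minimal sketch, assuming the tabpfn-client package is installed (`pip install tabpfn-client`) and a PriorLabs account is available; `init()` performs the interactive login:

```python
# Minimal sketch: cloud inference via the TabPFN API client.
# Assumes `pip install tabpfn-client` and a PriorLabs account.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

import tabpfn_client
from tabpfn_client import TabPFNClassifier

tabpfn_client.init()  # interactive login / token setup for the cloud service

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = TabPFNClassifier()        # no local GPU required; predictions are computed remotely
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```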
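
For the local Python package, a minimal sketch assuming `pip install tabpfn` and a working PyTorch installation; a GPU speeds things up but CPU also works:

```python
# Minimal sketch: local TabPFN with the scikit-learn-compatible interface.
# Assumes `pip install tabpfn`; device selection depends on the installed PyTorch build.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = TabPFNClassifier()        # scikit-learn-style estimator, runs entirely locally
clf.fit(X_train, y_train)
proba = clf.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, proba))
```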

Why TabPFN

  • Rapid Training


    TabPFN significantly reduces training time: in just a few seconds it outperforms traditional models that have been tuned for hours. For instance, it surpasses an ensemble of the strongest baselines after 2.8 seconds of compute, compared to 4 hours of tuning for the baselines.

  • Superior Accuracy


    TabPFN consistently outperforms state-of-the-art methods like gradient-boosted decision trees (GBDTs) on datasets with up to 10,000 samples. It achieves higher accuracy and better performance metrics across a range of classification and regression tasks.

  • Robustness


    The model demonstrates robustness to various dataset characteristics, including uninformative features, outliers, and missing values, maintaining high performance where other methods struggle.

  • Generative Capabilities


    As a generative transformer-based model, TabPFN can be fine-tuned for specific tasks, generate synthetic data, estimate densities, and learn reusable embeddings. This makes it versatile for various applications beyond standard prediction tasks.

  • Sklearn Interface


    TabPFN follows the interfaces provided by scikit-learn, making it easy to integrate into existing workflows and use familiar functions for fitting, predicting, and evaluating models; see the cross-validation sketch at the end of this list.

  • Minimal Preprocessing


    The model handles various types of raw data, including missing values and categorical variables, with minimal preprocessing. This reduces the burden on users to perform extensive data preparation; a short sketch with missing values and categorical columns follows below.
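
Because TabPFN exposes the standard fit/predict/predict_proba methods, it drops directly into existing scikit-learn tooling. A minimal sketch using cross-validation with the local package (the cloud client exposes the same estimator interface):

```python
# Minimal sketch: TabPFN used like any other scikit-learn estimator,
# here inside a standard 5-fold cross-validation loop.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score

from tabpfn import TabPFNClassifier

X, y = load_iris(return_X_y=True)
scores = cross_val_score(TabPFNClassifier(), X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```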
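
And a minimal sketch of the minimal-preprocessing claim: a mixed-type DataFrame with missing values and string categories is passed to TabPFN directly. The column names and values below are invented for illustration, and depending on the installed version categorical columns may also need to be flagged explicitly:

```python
# Minimal sketch: raw data with NaNs and categorical strings, no manual
# imputation or encoding. Columns and labels are made up for illustration.
import numpy as np
import pandas as pd

from tabpfn import TabPFNClassifier

df = pd.DataFrame({
    "age":    [25, 38, np.nan, 51, 44, 29, 63, np.nan],
    "city":   ["Berlin", "Paris", "Berlin", None, "Rome", "Paris", "Rome", "Berlin"],
    "income": [32_000, 54_000, 41_000, 78_000, np.nan, 36_000, 90_000, 47_000],
})
y = np.array([0, 1, 0, 1, 1, 0, 1, 0])

clf = TabPFNClassifier()
clf.fit(df, y)                  # missing values and categories handled internally
print(clf.predict(df.iloc[:2]))
```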