Welcome to the Docs

Everything you need to create, deploy, and share ML models.

Quick Start Guide

Get up and running with your first ML model in minutes.

Step 1: Upload a Dataset

Navigate to Datasets in your project and upload your data. We support CSV and XLSX formats.

  • Ensure your data has clear column headers
  • Upload size limits depend on your plan (Free 25 MB, Plus 100 MB, Pro 250 MB)
  • Column types are detected automatically, with manual overrides available
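
Clean headers help column type detection. As a rough local check before uploading, you can normalize headers with pandas (the column names and file name here are made up for illustration):

```python
import pandas as pd

# Hypothetical raw data; replace with your own file.
df = pd.DataFrame({
    "Transaction Text": ["Coffee at Starbucks", "Uber ride downtown"],
    "Amount ($)": [4.50, 18.20],
})

# Normalize headers: trim, lowercase, replace symbol runs with underscores.
df.columns = (
    df.columns.str.strip()
    .str.lower()
    .str.replace(r"[^a-z0-9]+", "_", regex=True)
    .str.strip("_")
)

df.to_csv("transactions.csv", index=False)
```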

Step 2: Create a Model

You have two options for creating a model:

Option A: Studio

Open Studio from the sidebar. Write Python to train your model, then export it to ML-Dash:

import mldash

# Train your model as usual...
model.fit(X_train, y_train)

# Export to ML-Dash
mldash.export(
    model=model,
    name="my-model",
    target={"label": "categorical"},
)

Option B: SDK Export

Train in Colab, Jupyter, or VS Code. Install the SDK and export your model:

pip install mldash-sdk

import mldash
mldash.login()

model.fit(X_train, y_train)

mldash.export(
    model=model,
    name="my-model",
    target={"label": "categorical"},
)

Step 3: View Your Model

Once created, view your model in your project's models page:

  • See training metrics and metadata
  • Set visibility (Private / Public)
  • Edit name, description, and tags
  • View model changelog for all events

Step 4: Test Your Endpoint

Go to your project's API page to test predictions:

  • Select a model and send a test prediction
  • Copy the endpoint URL for external use
  • Or use Dash AI to test via chat
  • View Python, cURL, and JavaScript code examples

Dataset Management

Learn how to manage and prepare your data.

Uploading Datasets

The platform supports the following file formats:

  • CSV: Comma-separated values (recommended)
  • XLSX: Excel spreadsheets

Dataset Requirements

  • Upload limits depend on plan: Free 25 MB, Plus 100 MB, Pro 250 MB
  • Column headers are required in the first row
  • At least 50 rows recommended for training
  • Numeric features should be properly formatted
  • Categorical features should be consistent (no typos)
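
A small pre-flight check along these lines can catch most of the requirements above before you upload. This is a sketch using pandas; the thresholds are illustrative and not the platform's own validation:

```python
import pandas as pd

def check_dataset(df: pd.DataFrame, min_rows: int = 50) -> list[str]:
    """Flag common issues before uploading; a rough pre-flight check."""
    issues = []
    # pandas names headerless columns "Unnamed: 0", "Unnamed: 1", ...
    if any(str(c).startswith("Unnamed") for c in df.columns):
        issues.append("missing or blank column headers in the first row")
    if len(df) < min_rows:
        issues.append(f"only {len(df)} rows; at least {min_rows} recommended")
    # Numeric data stored as text is a frequent formatting problem.
    for col in df.select_dtypes(include="object"):
        parsed = pd.to_numeric(df[col], errors="coerce")
        if parsed.notna().mean() > 0.9:
            issues.append(f"column '{col}' looks numeric but is stored as text")
    return issues
```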

Data Preprocessing

The platform uses sentence transformers for text processing:

  • Text descriptions are converted to 384-dimensional embeddings
  • Numerical features scaled with StandardScaler
  • Categorical features one-hot encoded
  • Train/test split configurable (default 80/20)
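
The numeric and categorical steps can be approximated locally with scikit-learn; the 384-dimensional text embeddings come from a sentence-transformer model and are elided here. A sketch of the equivalent pipeline on toy data:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "amount": [4.5, 18.2, 3.0, 52.9, 7.1],
    "category": ["food", "transport", "food", "shopping", "food"],
    "label": [0, 1, 0, 1, 0],
})

# Scale numeric features, one-hot encode categorical features.
preprocess = ColumnTransformer([
    ("num", StandardScaler(), ["amount"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["category"]),
])

X = preprocess.fit_transform(df[["amount", "category"]])

# Default 80/20 split, matching the platform default.
X_train, X_test, y_train, y_test = train_test_split(
    X, df["label"], test_size=0.2, random_state=42
)
```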

Multi-Feature Support

You can combine text, numerical, categorical, and datetime columns in a single model. Datetime columns are auto-expanded to year, month, day, day of week, is weekend, and hour.
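
The datetime expansion can be reproduced in pandas; the derived column names below are illustrative, not necessarily the ones the platform generates:

```python
import pandas as pd

df = pd.DataFrame({"timestamp": pd.to_datetime([
    "2024-06-01 09:30",  # a Saturday
    "2024-06-02 18:45",  # a Sunday
])})

# The six derived features described above.
ts = df["timestamp"]
df["year"] = ts.dt.year
df["month"] = ts.dt.month
df["day"] = ts.dt.day
df["day_of_week"] = ts.dt.dayofweek   # Monday=0 .. Sunday=6
df["is_weekend"] = ts.dt.dayofweek >= 5
df["hour"] = ts.dt.hour
```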

Managing Existing Datasets

Click on any dataset in the list to view its changelog. You can also:

  • Refresh datasets with new data (same schema required)
  • Download datasets for offline analysis
  • Edit dataset names and descriptions
  • Delete datasets no longer needed

Common Issues

Missing or Invalid Data

Preview your data to spot problems before use, and remove rows with excessive missing values.

Imbalanced Classes

For classification problems, ensure your target classes are reasonably balanced. Consider using sampling techniques if needed.
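
One simple sampling technique is upsampling the minority class before training, for example with scikit-learn's resample utility (a sketch on toy data):

```python
import pandas as pd
from sklearn.utils import resample

df = pd.DataFrame({
    "text": [f"tx {i}" for i in range(100)],
    "label": ["common"] * 90 + ["rare"] * 10,
})

majority = df[df["label"] == "common"]
minority = df[df["label"] == "rare"]

# Sample the minority class with replacement until it matches the majority.
minority_up = resample(
    minority, replace=True, n_samples=len(majority), random_state=42
)
balanced = pd.concat([majority, minority_up])
```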

Large File Sizes

For datasets exceeding your plan's upload limit, consider downsampling or feature selection before uploading.
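
A quick way to downsample before upload is a seeded random sample in pandas (the 10% fraction is just an example; pick whatever fits your plan's limit):

```python
import pandas as pd

df = pd.DataFrame({
    "text": [f"row {i}" for i in range(10_000)],
    "label": [i % 3 for i in range(10_000)],
})

# Random 10% sample; a fixed seed makes the subset reproducible.
small = df.sample(frac=0.1, random_state=42)
small.to_csv("dataset_small.csv", index=False)
```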


Models

Create, import, and manage your ML models.

Studio

Studio is an in-browser Python notebook for training models directly on the platform.

  • Train any model (sklearn, XGBoost, LightGBM, CatBoost, TensorFlow)
  • Export with mldash.export(model=model, name="...", target={...})
  • Pre-built templates available for common workflows
  • Full Python environment with popular ML libraries

Quick Export

After training, call mldash.export() and your model gets a REST endpoint automatically.

SDK Export

Bring models trained in your own environment into ML-Dash using the SDK.

Supported Formats

The SDK automatically serializes your model. Supported frameworks for SDK export: scikit-learn, XGBoost, LightGBM, and CatBoost. Other frameworks (TensorFlow/Keras, PyTorch, statsmodels) can be imported via the web UI Import Wizard.

Using mldash.export()

pip install mldash-sdk

import mldash
mldash.login()

# Train your model
model.fit(X_train, y_train)

# Export to ML-Dash
mldash.export(
    model=model,
    name="my-model",
    target={"label": "categorical"},
)

The SDK bundles your model with metadata and uploads it. Your model gets a REST endpoint automatically.

Supported Frameworks

ML-Dash supports models from these frameworks:

Classification

  • scikit-learn (SDK export): Logistic Regression, KNN, Decision Tree, SVM, Naive Bayes, SGD, Random Forest, Extra Trees, Bagging, AdaBoost, Gradient Boosting
  • Boosting libraries (SDK export): XGBoost, LightGBM, CatBoost
  • Deep learning (web import): TensorFlow / Keras (.keras, .h5)

Regression

  • scikit-learn (SDK export): Linear, KNN, Ridge, Lasso, Elastic Net, Decision Tree, SVR, SGD, Random Forest, Extra Trees, Bagging, AdaBoost, Gradient Boosting
  • Boosting libraries (SDK export): XGBoost, LightGBM, CatBoost
  • Cross-framework (web import): ONNX (any framework that exports to ONNX)

Training Metrics

Metrics are displayed when available (set during training or import).

Classification

  • Accuracy
  • Precision (weighted)
  • Recall (weighted)
  • F1-Score (weighted)
  • ROC-AUC

Regression

  • MSE (Mean Squared Error)
  • RMSE (Root MSE)
  • MAE (Mean Absolute Error)
  • R-squared
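
All of these metrics can be reproduced locally with scikit-learn if you want to sanity-check the numbers shown on the model page (toy data below):

```python
from sklearn.metrics import (
    accuracy_score, precision_score, recall_score, f1_score,
    mean_squared_error, mean_absolute_error, r2_score,
)

# Classification (weighted averages, as displayed on the model page)
y_true = ["food", "food", "transport", "shopping"]
y_pred = ["food", "transport", "transport", "shopping"]
metrics_cls = {
    "accuracy": accuracy_score(y_true, y_pred),
    "precision": precision_score(y_true, y_pred, average="weighted"),
    "recall": recall_score(y_true, y_pred, average="weighted"),
    "f1": f1_score(y_true, y_pred, average="weighted"),
}

# Regression
y_true_r = [3.0, 5.0, 2.5]
y_pred_r = [2.5, 5.0, 3.0]
mse = mean_squared_error(y_true_r, y_pred_r)
rmse = mse ** 0.5
mae = mean_absolute_error(y_true_r, y_pred_r)
r2 = r2_score(y_true_r, y_pred_r)
```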

Model Comparison

Compare multiple models side by side to find the best performer.

  • Select 2 or more models from your project's models page
  • Click "Compare" to open the comparison view
  • Side-by-side metrics, hyperparameters diff, and training metadata

Features

Platform features for sharing, chat, embedding, and notifications.

Explore & Sharing

Share your models and datasets with the community, or browse what others have built.

Visibility Controls

  • Model visibility: Public/private toggle on your project's models page
  • Dataset visibility: Public/private toggle on your project's datasets page
  • Public items appear on the Explore page for all users

Explore Pages

  • /explore — Browse public models
  • /explore/datasets — Browse public datasets with schema preview and download
  • /explore/saved — View your bookmarked models and datasets

Dash AI

An AI assistant that helps non-technical users interact with models through conversation.

Getting Started

  • Access via /dash-ai or the navigation bar
  • Select a trained model to chat about
  • Conversations are persisted with auto-generated titles

Capabilities

  • Predict: Describe your data in natural language and the LLM extracts features and calls the model
  • Explain: Ask "why?" to get SHAP-based explanations narrated by the LLM
  • Explore: Ask about model metrics, training data, or comparisons
  • Navigate: Ask Dash AI to open Studio, model pages, or project views
  • 30 tools available including listing datasets, previewing data, managing models, editing notebooks, and cleaning data

Embeddable Playground

Embed a prediction form for any public model on your own site.

  • URL pattern: /embed/{username}/{model-slug}
  • Compact prediction form with no dashboard chrome
  • Find the embed code snippet on Explore → model detail → API tab
  • Use an <iframe> with configurable height

Notifications

Stay informed about model sharing and account activity.

  • Dataset processed
  • Model published / unpublished
  • Rate limit warnings

Bell icon in the header shows unread count. Configure per-type preferences (in-app and email toggles) at /settings/notifications.


API Reference

Access your trained models via HTTP API.

Overview

The Prediction API allows you to integrate your trained models into external applications. Send input data and receive predictions in real-time.

All trained models can be accessed via the API using your API key. Public models can also be accessed by anyone with an API key — the endpoint checks your models first, then falls back to public models.

Authentication

All API requests require an API key. Create and manage keys in the API Keys section.

Header: X-API-Key
Value: Your API key starting with mldash_

Security Notice

Keep your API keys secure. Never commit them to version control or share them publicly. Store them as environment variables.

Rate Limiting & Subscription Plans

API requests are rate-limited based on your subscription plan. See pricing →

Free

  • 200 requests/day, 1,000/month
  • 15K AI tokens/day
  • 3 public models, 10,000 dataset rows
  • 500 MB storage, 25 MB per upload, 5 public datasets

Plus

  • 5,000 requests/day, 50,000/month
  • 100K AI tokens/day
  • 20 public models, 50,000 dataset rows
  • 5 GB storage, 100 MB per upload, 25 public datasets

Pro

  • Unlimited requests
  • Unlimited AI tokens
  • Unlimited public models and dataset rows
  • 25 GB storage, 250 MB per upload, unlimited public datasets

Prediction Endpoint

POST /api/v1/predict/{identifier}/

Make predictions with a trained model. The identifier can be a model slug or ID. Your own models are checked first, then public models.

Request

{
  "text": "Coffee at Starbucks $4.50",
  "features": {           // optional
    "amount": 4.50,
    "category": "food"
  }
}

Response

{
  "prediction": "Food & Dining",
  "probabilities": {      // classifiers only
    "Food & Dining": 0.87,
    "Shopping": 0.09,
    "Transport": 0.04
  },
  "model_id": 1,
  "model_slug": "transaction-categorizer",
  "model_name": "Transaction Categorizer",
  "model_type": "logistic_regression"
}

Code Examples

import requests
import os

API_KEY = os.getenv('ML_DASH_API_KEY')

response = requests.post(
    "https://api.mldash.com/api/v1/predict/my-model/",
    headers={
        'X-API-Key': API_KEY,
        'Content-Type': 'application/json'
    },
    json={"text": "Coffee at Starbucks"}
)

result = response.json()
print(f"Prediction: {result['prediction']}")

Error Responses

  • 400: Invalid request (missing required fields)
  • 401: Invalid or missing API key
  • 403: Model is not trained or not accessible
  • 404: Model not found
  • 429: Rate limit exceeded (based on your subscription plan)
  • 500: Prediction failed (internal error)
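
When calling the endpoint programmatically, it helps to map these status codes to actionable messages before retrying or alerting. A sketch (safe_predict and the hint texts are illustrative, not part of the SDK):

```python
import requests

# Hint texts mirror the error descriptions above; adjust to taste.
ERROR_HINTS = {
    400: "invalid request: check required fields",
    401: "invalid or missing API key",
    403: "model is not trained or not accessible",
    404: "model not found",
    429: "rate limit exceeded: slow down or upgrade your plan",
    500: "prediction failed: internal error",
}

def explain_error(status_code: int) -> str:
    """Translate an API status code into a human-readable hint."""
    return ERROR_HINTS.get(status_code, f"unexpected status {status_code}")

def safe_predict(url: str, api_key: str, payload: dict) -> dict:
    """POST a prediction and raise a descriptive error on failure."""
    resp = requests.post(
        url,
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        json=payload,
        timeout=10,
    )
    if resp.status_code != 200:
        raise RuntimeError(explain_error(resp.status_code))
    return resp.json()
```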