Setup

This guide sets up your Rust environment for the AI/ML in Rust tutorial: you will install Rust, add the key ML libraries, and verify everything with a simple program. Basic Rust familiarity is helpful, but no ML experience is needed. You will learn the most by writing your own code, but example code is available in our GitHub repository for reference.

Step 1: Install Rust

  1. Visit rust-lang.org.
  2. Install rustup: On Unix-like systems (macOS/Linux), run:
    bash
    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    On Windows, download and run rustup-init.exe from the same website.
  3. Choose default options during installation.
  4. Update Rust:
    bash
    rustup update
  5. Verify:
    bash
    rustc --version
    cargo --version
    Expect versions like rustc 1.86.0, cargo 1.86.0.

Step 2: Set Up an IDE

We recommend Visual Studio Code (VS Code) as the editor for this tutorial:

  1. Install VS Code: Download from code.visualstudio.com.
  2. Add rust-analyzer: In VS Code Extensions (Ctrl+Shift+X), install rust-analyzer for code completion and diagnostics.

Step 3: Create a Rust Project

  1. Create project:
    bash
    cargo new rust_ml_tutorial
    cd rust_ml_tutorial
  2. Test:
    bash
    cargo run
    Expect “Hello, world!” output.

Step 4: Install ML Libraries

Add the following libraries to Cargo.toml under [dependencies]:

toml
[dependencies]
linfa = "0.7.1"
linfa-datasets = "0.7.1"
tch = "0.17.0"
polars = { version = "0.46.0", features = ["lazy"] }
nalgebra = "0.33.2"
rust-bert = "0.23.0"
actix-web = "4.4.0"
ndarray = "0.15.6"
linfa-linear = "0.7.1"
plotters = "0.3.7"
rand = "0.9.1"
rand_distr = "0.5.1"
statrs = "0.18.0"
linfa-logistic = "0.7.1"
linfa-trees = "0.7.1"
linfa-svm = "0.7.2"

Note: Use the specified library versions to ensure compatibility with the tutorial’s sample code; all examples are tested against these versions. rust-bert requires a matching tch version, so check the rust-bert repository at github.com/guillaume-be/rust-bert before changing either. Then build the project to download and compile the dependencies:

bash
cargo build

Library Overview

  • linfa (plus linfa-linear, linfa-logistic, linfa-trees, linfa-svm): Classical ML algorithms such as regression, classification, and clustering (Core ML).
  • linfa-datasets: Standard datasets like Breast Cancer (Core ML).
  • tch (tch-rs): Neural networks via bindings to PyTorch’s libtorch (Deep Learning).
  • polars: Data preprocessing (Practical ML Skills).
  • nalgebra: Matrix operations (Mathematical Foundations).
  • rust-bert: NLP tasks (Advanced Topics).
  • actix-web: Model deployment (Practical ML Skills).
  • ndarray: Array operations for ML algorithms.
  • plotters: Data visualization for results.
  • rand, rand_distr, statrs: Random sampling and statistical functions.
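
To check that a few of these crates work together, here is a minimal sketch (assuming the versions listed above) that fits a linear regression with linfa-linear on a tiny hand-made dataset; the data and names are illustrative and not part of the tutorial labs. You can paste it into src/main.rs and run cargo run.

rust
use linfa::prelude::*;
use linfa_linear::LinearRegression;
use ndarray::array;

fn main() {
    // Three points on the line y = 2x: a 3x1 feature matrix and a target vector.
    let features = array![[1.0], [2.0], [3.0]];
    let targets = array![2.0, 4.0, 6.0];
    let dataset = Dataset::new(features, targets);

    // Fit an ordinary least-squares model and print the learned coefficients.
    let model = LinearRegression::default()
        .fit(&dataset)
        .expect("linear regression failed to fit");
    println!("slope: {}, intercept: {}", model.params(), model.intercept());
}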

Step 5: Using the Example Code Repository

To deepen your understanding, write your own Rust code for each lab, following the tutorial instructions. For reference, debugging, or verification, you can use the example code in our GitHub repository: https://github.com/ravishankarkumar/aiunderthehood-code-rust.

  1. Clone the repository:
    bash
    git clone https://github.com/ravishankarkumar/aiunderthehood-code-rust.git
    cd aiunderthehood-code-rust
  2. Build the project:
    bash
    cargo build
  3. Run an example (e.g., SVM):
    bash
    cargo run --example svm
    Find example files in the examples/ directory (e.g., examples/svm/main.rs). We encourage typing the code yourself to learn Rust and AI/ML concepts, using the repository only as a fallback.
  4. Run sample code from this page:
    bash
    cargo run --example setup
    Find example files in the examples/ directory (e.g., examples/setup/main.rs).

Step 6: Verify Setup

Test the setup with a nalgebra matrix operation. Replace the contents of src/main.rs with:

rust
use nalgebra::Matrix2;

fn main() {
    // Build a 2x2 matrix; Matrix2::new takes the entries in row-major order.
    let matrix = Matrix2::new(1.0, 2.0, 3.0, 4.0);
    println!("Matrix:\n{}", matrix);
}

Run:

bash
cargo run
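
Expect the 2×2 matrix printed to the console. As an optional extra check that ndarray compiles as well, here is a minimal sketch (assuming the ndarray version listed above) that multiplies two small matrices; swap it into src/main.rs and run cargo run again.

rust
use ndarray::array;

fn main() {
    // Two 2x2 matrices multiplied with ndarray's dot method.
    let a = array![[1.0, 2.0], [3.0, 4.0]];
    let b = array![[5.0, 6.0], [7.0, 8.0]];
    let product = a.dot(&b);
    println!("Product:\n{}", product);
}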

Troubleshooting

  • Rust issues: Reinstall via rust-lang.org.
  • Dependency errors: Run cargo clean and cargo build.
  • IDE issues: Reinstall rust-analyzer.
  • Additional help: Check the repository’s README for troubleshooting tips.

Next Steps

Continue to Rust Basics or First ML Lab.

Further Reading