Setup
This guide sets up your Rust environment for the AI/ML in Rust tutorial: you'll install Rust and key ML libraries, then verify the setup with a simple program. Basic Rust familiarity is helpful, but no ML experience is needed. You'll write your own code to maximize learning, but example code is available in our GitHub repository for reference.
Step 1: Install Rust
- Visit rust-lang.org.
- Install rustup: On Unix-like systems (macOS/Linux), run:

  ```bash
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  ```

  On Windows, follow the website's instructions.
- Choose default options during installation.
- Update Rust:

  ```bash
  rustup update
  ```
- Verify the installation:

  ```bash
  rustc --version
  cargo --version
  ```

  Expect versions like `rustc 1.86.0` and `cargo 1.86.0`.
Step 2: Set Up an IDE
Use Visual Studio Code (VS Code) for coding efficiency:
- Install VS Code: Download from code.visualstudio.com.
- Add rust-analyzer: In VS Code Extensions (Ctrl+Shift+X), install `rust-analyzer` for code completion and diagnostics.
Step 3: Create a Rust Project
- Create the project:

  ```bash
  cargo new rust_ml_tutorial
  cd rust_ml_tutorial
  ```

- Test it:

  ```bash
  cargo run
  ```

  Expect "Hello, world!" output.
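For reference, `cargo new` generates a minimal src/main.rs; the test run above simply compiles and prints its greeting:

```rust
fn main() {
    // Default program generated by `cargo new`.
    println!("Hello, world!");
}
```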
Step 4: Install ML Libraries
Add the following libraries to Cargo.toml under [dependencies]:
```toml
[dependencies]
linfa = "0.7.1"
linfa-datasets = "0.7.1"
tch = "0.17.0"
polars = { version = "0.46.0", features = ["lazy"] }
nalgebra = "0.33.2"
rust-bert = "0.23.0"
actix-web = "4.4.0"
ndarray = "0.15.6"
linfa-linear = "0.7.1"
plotters = "0.3.7"
rand = "0.9.1"
rand_distr = "0.5.1"
statrs = "0.18.0"
linfa-logistic = "0.7.1"
linfa-trees = "0.7.1"
linfa-svm = "0.7.2"
```
Note: Use the specified library versions to ensure compatibility with the tutorial's sample code, as all examples are tested against these versions. Check `tch` compatibility with `rust-bert` at github.com/guillaume-be/rust-bert. Then run:

```bash
cargo build
```
Library Overview
- `linfa`: Regression, clustering (Core ML).
- `linfa-datasets`: Standard datasets like Breast Cancer (Core ML).
- `tch` (tch-rs): Neural networks (Deep Learning).
- `polars`: Data preprocessing (Practical ML Skills).
- `nalgebra`: Matrix operations (Mathematical Foundations).
- `rust-bert`: NLP tasks (Advanced Topics).
- `actix-web`: Model deployment (Practical ML Skills).
- `ndarray`: Array operations for ML algorithms.
- `plotters`: Data visualization for results.
- `rand`, `rand_distr`, `statrs`: Random sampling and statistical functions.
- `linfa-linear`, `linfa-logistic`, `linfa-trees`, `linfa-svm`: Linear regression, logistic regression, decision trees, and support vector machines for linfa (Core ML).
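Once `cargo build` succeeds, you can optionally smoke-test the core ML stack by temporarily replacing src/main.rs with a tiny regression example. This is a minimal sketch assuming the linfa/linfa-linear 0.7 and ndarray 0.15 APIs pinned above; it is not part of the tutorial's labs:

```rust
use linfa::traits::{Fit, Predict};
use linfa::Dataset;
use linfa_linear::LinearRegression;
use ndarray::{array, Array1, Array2};

fn main() {
    // Tiny synthetic dataset: one feature, four samples, with y = 2x.
    let x: Array2<f64> = array![[1.0], [2.0], [3.0], [4.0]];
    let y: Array1<f64> = array![2.0, 4.0, 6.0, 8.0];
    let dataset = Dataset::new(x, y);

    // Fit an ordinary least-squares model and predict for a new input.
    let model = LinearRegression::new()
        .fit(&dataset)
        .expect("linear regression failed to fit");
    let prediction = model.predict(&array![[5.0]]);
    println!("Prediction for x = 5.0: {}", prediction);
}
```

If it prints a value close to 10.0, linfa, linfa-linear, and ndarray are working together. Restore src/main.rs afterwards.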
Step 5: Using the Example Code Repository
To deepen your understanding, write your own Rust code for each lab, following the tutorial instructions. For reference, debugging, or verification, you can use the example code in our GitHub repository: https://github.com/ravishankarkumar/aiunderthehood-code-rust.
- Clone the repository:

  ```bash
  git clone https://github.com/ravishankarkumar/aiunderthehood-code-rust.git
  cd aiunderthehood-code-rust
  ```

- Build the project:

  ```bash
  cargo build
  ```

- Run an example (e.g., SVM):

  ```bash
  cargo run --example svm
  ```

  Example files live in the examples/ directory (e.g., examples/svm/main.rs). We encourage typing the code yourself to learn Rust and AI/ML concepts, using the repository only as a fallback.

- Run the sample code from this page:

  ```bash
  cargo run --example setup
  ```

  (See examples/setup/main.rs.)
Step 6: Verify Setup
Test with a nalgebra matrix operation in src/main.rs:
```rust
use nalgebra::Matrix2;

fn main() {
    // Build a 2x2 matrix (row-major: 1 2 / 3 4) and print it.
    let matrix = Matrix2::new(1.0, 2.0, 3.0, 4.0);
    println!("Matrix:\n{}", matrix);
}
```
Run:

```bash
cargo run
```
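Optionally, you can extend the check with a few more nalgebra operations. This is a small sketch (not part of the tutorial's sample code) that exercises multiplication, transposition, and inversion:

```rust
use nalgebra::Matrix2;

fn main() {
    let matrix = Matrix2::new(1.0, 2.0, 3.0, 4.0);
    // Matrix product and transpose as a quick sanity check.
    println!("Product:\n{}", matrix * matrix);
    println!("Transpose:\n{}", matrix.transpose());
    // try_inverse returns None for singular matrices.
    if let Some(inverse) = matrix.try_inverse() {
        println!("Inverse:\n{}", inverse);
    }
}
```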
Troubleshooting
- Rust issues: Reinstall via rust-lang.org.
- Dependency errors: Run `cargo clean` and `cargo build`.
- IDE issues: Reinstall `rust-analyzer`.
- Additional help: Check the repository's README for troubleshooting tips.
Next Steps
Continue to Rust Basics or First ML Lab.
Further Reading
- Rust Installation: rust-lang.org/tools/install
- Hands-On Machine Learning by Géron (Chapter 2)
- nalgebra Documentation: nalgebra.org