Implementation:Fastai Fastbook Collab Learner NN
| Knowledge Sources | |
|---|---|
| Domains | Recommender Systems, Deep Learning |
| Last Updated | 2026-02-09 17:00 GMT |
Overview
A concrete tool for creating and training a neural-network collaborative filtering model, provided by fastai.collab and backed by fastai.tabular.model.
Description
When collab_learner is called with use_nn=True, it creates an EmbeddingNN model internally instead of the default EmbeddingDotBias. EmbeddingNN is a subclass of TabularModel that concatenates user and item embeddings and passes them through a sequence of fully connected hidden layers with ReLU activations. The layers parameter specifies the sizes of the hidden layers. Embedding sizes are determined automatically by the get_emb_sz heuristic. The final output is a single scalar passed through sigmoid_range.
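The get_emb_sz heuristic mentioned above boils down to fastai's embedding-size rule, min(600, round(1.6 * n_cat**0.56)). The sketch below is a standalone re-implementation of that rule for illustration, not an import from fastai:

```python
# Standalone sketch of fastai's emb_sz_rule heuristic:
# embedding width = min(600, round(1.6 * n_cat ** 0.56)), capped at 600
def emb_sz_rule(n_cat: int) -> int:
    return min(600, round(1.6 * n_cat ** 0.56))

# Cardinalities from the MovieLens example later in this page
for n_cat in (944, 1635):
    print(n_cat, emb_sz_rule(n_cat))
# 944 -> 74, 1635 -> 101, matching the get_emb_sz(dls) output below
```

This explains why the user and item embeddings in the usage example come out as (944, 74) and (1635, 101).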
Usage
Use collab_learner(dls, use_nn=True, ...) when you want a neural network approach to collaborative filtering, or when you plan to extend the model with additional categorical or continuous features. Pass the layers parameter to define the hidden layer architecture.
Code Reference
Source Location
- Repository: fastbook
- File: translations/cn/08_collab.md (Lines 697-716)
Signature
# collab_learner called with use_nn=True; the values shown are the ones
# used in this pattern (in the library itself, use_nn defaults to False
# and emb_szs, layers, y_range, and loss_func default to None)
collab_learner(
    dls: DataLoaders,
    use_nn: bool = True,
    emb_szs: dict = None,
    layers: list = [100, 50],
    config: dict = None,
    y_range: tuple = (0, 5.5),
    loss_func: callable = None,
    **kwargs
) -> Learner
Internal model class:
@delegates(TabularModel)
class EmbeddingNN(TabularModel):
    def __init__(self, emb_szs, layers, **kwargs):
        super().__init__(emb_szs, layers=layers, n_cont=0, out_sz=1, **kwargs)
Import
from fastai.collab import collab_learner, EmbeddingNN
from fastai.tabular.model import TabularModel
I/O Contract
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| dls | DataLoaders | Yes | DataLoaders created by CollabDataLoaders.from_df |
| use_nn | bool | Yes | Must be set to True to use the neural network architecture |
| layers | list of int | No | Sizes of hidden layers; e.g., [100, 50] creates two hidden layers of width 100 and 50 |
| y_range | tuple | No | Output range as (low, high); e.g., (0, 5.5) |
| wd | float | No | Weight decay (L2 regularization) passed to fit_one_cycle, not to collab_learner; e.g., 0.1 |
Outputs
| Name | Type | Description |
|---|---|---|
| learn | Learner | A fastai Learner wrapping the EmbeddingNN model |
| learn.model | EmbeddingNN | The trained model; a subclass of TabularModel |
| Forward pass | Tensor (bs, 1) | Predicted ratings after sigmoid_range clamping |
| Embedding layers | Embedding | User and item embeddings with sizes determined by get_emb_sz; e.g., (944, 74) and (1635, 101) |
| Hidden layers | nn.Sequential | Sequence of Linear + ReLU layers as specified by the layers parameter |
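The sigmoid_range clamping in the forward-pass row scales a sigmoid so that predictions fall strictly inside (low, high). A pure-Python sketch of the same formula (fastai's actual implementation operates on tensors):

```python
import math

def sigmoid_range(x: float, low: float, high: float) -> float:
    # Scale a sigmoid so the output falls strictly between low and high
    return 1 / (1 + math.exp(-x)) * (high - low) + low

# With y_range=(0, 5.5): a raw activation of 0 maps to the midpoint
print(sigmoid_range(0.0, 0, 5.5))   # 2.75
# Large activations saturate toward the bounds without exceeding them
print(sigmoid_range(10.0, 0, 5.5))  # just under 5.5
```

Setting the high end to 5.5 rather than 5 is deliberate: a saturating sigmoid can only approach its bounds, so a slightly larger range lets the model actually predict a full 5-star rating.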
Usage Examples
Basic Usage
from fastai.collab import *
from fastai.tabular.all import *
# Assumes 'dls' DataLoaders already created
# (see Fastai_Fastbook_CollabDataLoaders_From_Df)
# Check recommended embedding sizes
embs = get_emb_sz(dls)
print(embs)
# Output: [(944, 74), (1635, 101)]
# Create a neural network collaborative filtering learner
# with two hidden layers of sizes 100 and 50
learn = collab_learner(dls, use_nn=True, y_range=(0, 5.5), layers=[100, 50])
# Train for 5 epochs with one-cycle policy and weight decay
learn.fit_one_cycle(5, 5e-3, wd=0.1)
# Output:
# epoch train_loss valid_loss time
# 0 1.002747 0.972392 00:16
# 1 0.926903 0.922348 00:16
# 2 0.877160 0.893401 00:16
# 3 0.838334 0.865040 00:16
# 4 0.781666 0.864936 00:16
# Inspect the model architecture
print(learn.model)
# Output: EmbeddingNN(
#   (embeds): ModuleList(
#     (0): Embedding(944, 74)
#     (1): Embedding(1635, 101)
#   )
#   (emb_drop): Dropout(p=0.0)
#   (bn_cont): BatchNorm1d(0)
#   (layers): Sequential(
#     (0): LinBnDrop(...)
#     (1): LinBnDrop(...)
#     (2): LinBnDrop(...)
#   )
# )
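As a sanity check on the printed architecture: EmbeddingNN concatenates the (944, 74) and (1635, 101) embeddings into a width-175 input, so layers=[100, 50] with out_sz=1 yields Linear shapes 175→100→50→1. A dependency-free sketch of the shapes and rough parameter counts (ignoring the BatchNorm and dropout details inside LinBnDrop):

```python
emb_szs = [(944, 74), (1635, 101)]   # from get_emb_sz(dls)
layers = [100, 50]                   # hidden layer widths

# First Linear input width is the sum of the embedding widths,
# since EmbeddingNN concatenates the user and item embeddings
widths = [sum(w for _, w in emb_szs)] + layers + [1]  # out_sz=1
shapes = list(zip(widths[:-1], widths[1:]))
print(shapes)  # [(175, 100), (100, 50), (50, 1)]

# Rough parameter counts: embedding tables, then weights + biases
emb_params = sum(n * w for n, w in emb_szs)
lin_params = sum(i * o + o for i, o in shapes)
print(emb_params, lin_params)  # 234991 22701
```

Note that the embedding tables dominate the parameter count; the hidden stack is comparatively small.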
Manual Neural Model (Without collab_learner)
from fastai.collab import *
from fastai.tabular.all import *

# Define a custom neural collaborative filtering model
class CollabNN(Module):
    def __init__(self, user_sz, item_sz, y_range=(0,5.5), n_act=100):
        self.user_factors = Embedding(*user_sz)
        self.item_factors = Embedding(*item_sz)
        self.layers = nn.Sequential(
            nn.Linear(user_sz[1]+item_sz[1], n_act),
            nn.ReLU(),
            nn.Linear(n_act, 1))
        self.y_range = y_range

    def forward(self, x):
        # x[:,0] holds user indices, x[:,1] holds item indices
        embs = self.user_factors(x[:,0]), self.item_factors(x[:,1])
        # Concatenate the embeddings and pass them through the hidden layers
        x = self.layers(torch.cat(embs, dim=1))
        # Clamp predictions into y_range
        return sigmoid_range(x, *self.y_range)
# Build and train
embs = get_emb_sz(dls)
model = CollabNN(*embs)
learn = Learner(dls, model, loss_func=MSELossFlat())
learn.fit_one_cycle(5, 5e-3, wd=0.01)
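To make CollabNN's data flow concrete, this dependency-free sketch mirrors a single forward pass with plain Python lists: look up each embedding row, concatenate, apply Linear→ReLU→Linear, then clamp with sigmoid_range. The tiny sizes (4 users, 5 items, 3 factors, n_act=8) are hypothetical; fastai/PyTorch do the same thing on batched tensors:

```python
import math, random

random.seed(0)

def linear(x, w, b):
    # y = Wx + b for a single example (w has shape out x in)
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(w, b)]

def relu(x):
    return [max(0.0, v) for v in x]

def sigmoid_range(v, low, high):
    return 1 / (1 + math.exp(-v)) * (high - low) + low

# Tiny hypothetical model: 4 users x 3 factors, 5 items x 3 factors
rand = lambda r, c: [[random.uniform(-0.1, 0.1) for _ in range(c)] for _ in range(r)]
user_factors, item_factors = rand(4, 3), rand(5, 3)
w1, b1 = rand(8, 6), [0.0] * 8      # like nn.Linear(3+3, 8)
w2, b2 = rand(1, 8), [0.0]          # like nn.Linear(8, 1)

def forward(user_idx, item_idx, y_range=(0, 5.5)):
    # Embedding lookup is just row selection; list + is the concat
    x = user_factors[user_idx] + item_factors[item_idx]
    h = relu(linear(x, w1, b1))
    out = linear(h, w2, b2)[0]
    return sigmoid_range(out, *y_range)

print(forward(2, 3))  # a rating strictly inside (0, 5.5)
```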
Related Pages
Implements Principle
Requires Environment
- Environment:Fastai_Fastbook_Python_FastAI_Environment
- Environment:Fastai_Fastbook_CUDA_GPU_Environment