Distillation Can Make AI Models Smaller and Cheaper


A fundamental technique called knowledge distillation lets researchers use a big, expensive "teacher" model to train a smaller, cheaper "student" model at a fraction of the cost.
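The core idea can be sketched in a few lines. This is a minimal illustration, not the method from any particular system: it assumes the classic soft-target objective, where the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence loss. All function names and the toy logits below are invented for illustration.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing the teacher's
    # relative confidence across wrong classes ("dark knowledge").
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the softened teacher distribution (the target)
    # to the softened student distribution (the prediction).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())

# Toy example: a student whose logits roughly track the teacher's
# incurs a much lower distillation loss than one that disagrees.
teacher      = np.array([[4.0, 1.0, 0.1]])
good_student = np.array([[3.8, 1.2, 0.2]])
bad_student  = np.array([[0.1, 4.0, 1.0]])

assert distillation_loss(good_student, teacher) < distillation_loss(bad_student, teacher)
```

Minimizing this loss over many inputs pushes the student to reproduce the teacher's full output distribution, not just its top prediction, which is why a small student can recover much of the large model's behavior.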
