How to Adapt Traditional Distillation Methods to Improve Weakly Supervised Object Detection


Fermentation and distillation are the two processes essential to creating spirits. Fermentation produces the alcohol, while distillation separates it from the water and other components of the mixture.

To create spirits such as gin, the ingredients are placed into a fermenter and mixed with a special yeast that feeds on sugar, producing alcohol and carbon dioxide. Once fermentation is complete, the resulting liquid, typically 7 to 9% alcohol, is distilled into its final state: the finished spirit.

Distillation equipment such as an alembic still or a distillation column separates the volatile compounds from the non-volatile parts of the mixture. Distillation must proceed slowly, as rapid distillation can create hot spots and cause thermal degradation of the product.

Copper stills separate the volatile compounds according to their boiling points, producing fractions with different ethanol contents. The head of the distillate is concentrated in methanol, fatty acid esters and n-propanol, while the tail contains most of the isoalcohols; the esters contribute fruity aromas, whereas the isoalcohols have a strong impact on flavour.

Knowledge distillation has been proposed as one way to improve the accuracy of weakly supervised object detection. A pre-trained "teacher" model transfers its knowledge to a smaller, untrained "student" model, which can then be trained on new data more quickly than running the full teacher network for every task.
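
The sketch below illustrates the basic teacher-student idea in PyTorch, in the spirit of classic knowledge distillation: the student is trained on a weighted mix of ordinary cross-entropy and a KL term that matches the teacher's temperature-softened outputs. The network shapes, temperature, and loss weighting are illustrative assumptions, not the specific weakly supervised detection setup discussed above.

```python
# Minimal knowledge-distillation sketch (illustrative assumptions, not a WSOD pipeline).
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with a soft-target KL term."""
    # Soft targets: teacher and student logits softened by the temperature.
    soft_teacher = F.log_softmax(teacher_logits / temperature, dim=1)
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean",
                  log_target=True) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Toy teacher/student networks (placeholders for real backbones/detectors).
teacher = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(128, 32), nn.ReLU(), nn.Linear(32, 10))

optimizer = torch.optim.SGD(student.parameters(), lr=0.01)
x = torch.randn(16, 128)          # dummy batch of input features
y = torch.randint(0, 10, (16,))   # dummy ground-truth labels

teacher.eval()
with torch.no_grad():             # the pre-trained teacher stays frozen
    t_logits = teacher(x)

s_logits = student(x)
loss = distillation_loss(s_logits, t_logits, y)
loss.backward()
optimizer.step()
```

Because the student only has to mimic the teacher's soft outputs, it can be far smaller than the teacher, which is where the speed advantage at training and inference time comes from.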