This video demonstrates how to use a neural network from the FluCoMa Toolkit to control a synthesizer with 10 parameters from just the 2 dimensions of an XY pad.
0:17 demo
0:24 theory
3:45 begin coding
5:21 FluidDataSet
7:09 FluidBufToKr
8:38 adding data points to FluidDataSet
12:54 saving FluidDataSets to disk
16:41 training the neural network (FluidMLPRegressor)
21:04 saving the state of FluidMLPRegressor to disk
22:27 making predictions with FluidMLPRegressor
26:00 updating the MultiSliderView with the predicted values
28:31 next steps
32:27 triggering predictions on the server using FluidMLPRegressor's .kr method
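The workflow outlined in these chapters can be sketched in SuperCollider roughly as follows. This is a minimal sketch based on the FluCoMa SuperCollider objects named above; the hidden-layer size, training settings, point identifiers, and example values are illustrative assumptions, not the video's exact code:

```
(
// Two datasets: 2-D XY-pad inputs paired with 10-D synth parameter outputs.
~input  = FluidDataSet(s);
~output = FluidDataSet(s);

// Buffers used to stage one data point at a time.
~inBuf  = Buffer.alloc(s, 2);
~outBuf = Buffer.alloc(s, 10);

// The regressor; one hidden layer of 7 nodes is an illustrative choice.
~mlp = FluidMLPRegressor(s,
    hiddenLayers: [7],
    activation: FluidMLPRegressor.sigmoid,
    outputActivation: FluidMLPRegressor.sigmoid,
    maxIter: 1000,
    learnRate: 0.1,
    batchSize: 1,
    validation: 0
);
)

// Add one paired example (repeat for each XY position / slider pose you like).
(
~inBuf.setn(0, [0.25, 0.75]);              // current XY-pad position
~outBuf.setn(0, Array.rand(10, 0.0, 1.0)); // current 10 slider values
~input.addPoint("example-0", ~inBuf);
~output.addPoint("example-0", ~outBuf);
)

// Train, printing the loss; re-evaluate until the loss is acceptably low.
~mlp.fit(~input, ~output, { |loss| loss.postln });

// Predict 10 synth parameters from a new XY position.
(
~inBuf.setn(0, [0.5, 0.5]);
~mlp.predictPoint(~inBuf, ~outBuf, {
    ~outBuf.getn(0, 10, { |vals| vals.postln });
});
)
```

In the video the predicted values are also written back to a MultiSliderView and, in the final section, prediction is triggered on the server with FluidMLPRegressor's .kr method instead of predictPoint.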
To learn more about FluidMLPRegressor, visit:
[ Link ]
[ Link ]
[ Link ]
Starting Code: [ Link ]
Complete Code (without MLPRegressor .kr): [ Link ]
Complete Code (with MLPRegressor .kr): [ Link ]
The Fluid Corpus Manipulation Toolbox (FluCoMa) is a software extension that enables programmatic sound bank mining with machine listening and machine learning within Max, SuperCollider, and Pure Data.
Website: [ Link ]
Download: [ Link ]
Discourse: [ Link ]
Max: [ Link ]
SuperCollider: [ Link ]
Pure Data: [ Link ]