Feed-forward neural networks are, in essence, generalized multivariate nonlinear function approximators. Clever algorithms (including genetic optimization, canonical mutation, and conjugate gradient optimization) can be used to ``train'' (create) a neural network that models distribution functions with exclusive-or-like logical properties, which might serve as candidate correlated multivariate wavefunctions for certain problems. Representations that can be empirically optimized in a (relatively) efficient multivariate search have other interesting applications in physics as well. We are in the initial stages of guiding several advanced undergraduates in projects that explore applications of neural networks and genetic optimization algorithms in physics and elsewhere.
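As a minimal sketch of the idea, the following illustrates training a small feed-forward network on the exclusive-or function by a simple mutation-driven evolutionary search rather than gradient descent. The network size (2-2-1 with tanh units), the mutation scale, and the population size are all illustrative assumptions, not a description of any particular implementation used in these projects.

```python
import math
import random

def net(w, x1, x2):
    """A 2-2-1 feed-forward network with tanh activations.

    w is a flat list of 9 parameters:
    [w11, w12, b1, w21, w22, b2, v1, v2, c]
    (hidden-unit weights and biases, then output weights and bias).
    """
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

# The exclusive-or truth table: (x1, x2, target)
XOR = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def error(w):
    """Mean squared error of the network over the XOR patterns."""
    return sum((net(w, x1, x2) - t) ** 2 for x1, x2, t in XOR) / len(XOR)

def evolve(generations=2000, offspring=20, sigma=0.4, seed=1):
    """Hill-climbing evolutionary search: each generation, mutate the
    current best weight vector with Gaussian noise and keep any
    offspring that lowers the error (a (1+lambda) strategy)."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(9)]
    best_err = error(best)
    for _ in range(generations):
        for _ in range(offspring):
            cand = [wi + rng.gauss(0, sigma) for wi in best]
            e = error(cand)
            if e < best_err:
                best, best_err = cand, e
    return best, best_err

if __name__ == "__main__":
    weights, err = evolve()
    print("final mean squared error:", err)
```

A single hidden layer of two units is the smallest architecture capable of representing XOR, which is exactly the kind of logical property a linear model cannot capture; the search above typically drives the error well below the 0.25 achieved by a constant-output network.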

Robert G. Brown 2001-08-03