Striking dynamical complexity is readily accessible in networks of even the simplest analog circuits, so long as they are non-linear.
A fundamental characteristic of brains is that their capabilities are mostly not hardwired by design, but reflect self-organization: brain networks possess so-called emergent properties that cannot be easily inferred from their separate constituent elements. The ability to engineer a similar approach electronically would likely boost the capacity of neuromorphic systems to solve classification and control tasks in a highly size- and energy-efficient manner, with practical implications for both embedded and large-scale computing. However, engineering self-organization remains difficult: leaping from neuromorphic computation by design – amply explored over the past decades – to neuromorphic computation by emergence may look like an insurmountable challenge.
Yet the ability to self-organize is far from unique to networks of neurons; it can also be exhibited by elementary electronic oscillators. It is easy to build networks of oscillators that synchronize in patterns not trivially related to their connectivity, and that change dynamically depending on specific settings or inputs. Because this phenomenon can be readily observed even with elementary oscillators, it is possible to implement large networks that, at least in principle, may harbour substantial computational capabilities. To stimulate work in this direction, my recent research has explored emergent phenomena by building a diverse set of oscillators, including single-transistor circuits [references 1 to 3], CMOS inverter rings, gas discharge tube circuits and field-programmable analog arrays (FPAAs).
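The flavour of emergence at play here can be sketched in simulation. The snippet below is a minimal, illustrative stand-in (the classic Kuramoto model of coupled phase oscillators, not a model of the specific circuits above): a population of oscillators with random natural frequencies stays incoherent at weak coupling, yet spontaneously synchronizes once the coupling strength crosses a threshold, a collective state not present in any single oscillator.

```python
# Kuramoto model: N phase oscillators with Gaussian natural frequencies.
# Above a critical coupling K, a synchronized state emerges collectively.
# This is an illustrative sketch, not the author's experimental setup.
import numpy as np

def kuramoto_order(K, N=100, steps=4000, dt=0.01, seed=0):
    """Integrate dtheta_i/dt = w_i + (K/N) * sum_j sin(theta_j - theta_i)
    with forward Euler and return the final order parameter r in [0, 1]."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, N)        # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, N)   # random initial phases
    for _ in range(steps):
        # mean-field coupling: each oscillator is pulled toward the others
        coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
        theta += dt * (omega + K * coupling)
    # r near 0: incoherent phases; r near 1: fully synchronized
    return abs(np.exp(1j * theta).mean())

print(f"r at K=0: {kuramoto_order(0.0):.2f}")   # weak coupling: incoherent
print(f"r at K=4: {kuramoto_order(4.0):.2f}")   # strong coupling: synchronized
```

The order parameter r (the magnitude of the mean phase vector) jumps from near zero to near one as K grows, with no single oscillator "knowing" the global state; real circuit networks add heterogeneity and richer dynamics on top of this basic mechanism.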
Next: A strange board