-
NESTML Installation Tutorial
-
Learning function from structure in neuromorphic networks
The human brain can perform a wide variety of cognitive tasks and can flexibly learn new tasks without interfering with old ones. Whether and how this learning and computing capability is inherited from the brain connectome remains unknown. This work links learning function to the brain connectome within the framework of reservoir computing. The authors showed that connectome-based reservoirs outperform random networks in the critical dynamical regime. Furthermore, they found that functional parcellation helps regulate information flow, which might facilitate cognitive computation in the brain.
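To make the reservoir-computing framework concrete, below is a minimal echo state network sketch in Python/NumPy: the recurrent weights stay fixed and only a linear readout is trained by ridge regression. The random reservoir, network size, spectral radius, and toy delayed-copy task are illustrative assumptions; in the paper's setting the fixed recurrent matrix would instead be an empirical connectome.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed reservoir: a random matrix here; the paper's setting would plug in
# an empirical connectome instead (this choice is an assumption for illustration).
N = 500
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
W *= 0.95 / max(abs(np.linalg.eigvals(W)))   # spectral radius near 1: near-critical regime
W_in = rng.normal(0.0, 1.0, size=N)

def run_reservoir(u):
    """Drive the fixed reservoir with a scalar input sequence u; collect states."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in * u_t)
        states[t] = x
    return states

# Toy task: reproduce the input delayed by a few steps (a short-memory task).
T, delay = 2000, 5
u = rng.uniform(-1.0, 1.0, size=T)
y = np.roll(u, delay)

X = run_reservoir(u)
# Only the linear readout is trained (ridge regression); the reservoir never changes.
lam = 1e-4
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
print("train MSE:", np.mean((X @ w_out - y)[delay:] ** 2))
```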
-
One Step Back, Two Steps Forward: Interference and Learning in Recurrent Neural Networks
Catastrophic forgetting is a key issue in the continual learning paradigm. Training algorithms such as FORCE seem able to bypass it to some extent. Chen and Barak applied fixed-point analysis to show explicitly how the fixed-point structure of a network changes during training in a continual learning scenario. Their work provides intuition about how the learning algorithm and the order of the task sequence affect training in continual learning.
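For reference, here is a minimal sketch of FORCE learning in the Sussillo and Abbott formulation: recursive least squares updates a linear readout online while its output is fed back into a chaotic network, keeping the output error small at every step. The network size, gain, and sinusoidal target below are illustrative assumptions, not the setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumptions, not the paper's exact setup).
N, g, dt, tau = 300, 1.5, 0.1, 1.0
W = g * rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # strong coupling: chaotic regime
w_fb = rng.uniform(-1.0, 1.0, size=N)                   # feedback of the readout z
w_out = np.zeros(N)                                     # readout, trained online
P = np.eye(N)                                           # RLS inverse-correlation estimate

T = 5000
target = np.sin(0.3 * dt * np.arange(T))                # toy target signal

x = rng.normal(0.0, 0.5, size=N)
for i in range(T):
    r = np.tanh(x)
    z = w_out @ r
    x += (dt / tau) * (-x + W @ r + w_fb * z)
    # FORCE step: recursive least squares nudges the readout so that the
    # fed-back output tracks the target while the network keeps running.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out -= (z - target[i]) * k
print("final |error|:", abs(w_out @ np.tanh(x) - target[-1]))
```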
-
Universality and individuality in neural dynamics across large populations of recurrent networks
Solution degeneracy is a prominent feature of ANNs (DNNs/RNNs) trained to perform a given task, and whether different solutions share any common features remains an open question. This work found that the topology of a trained network's fixed points is universally shared across different network architectures and realizations when the networks are trained on the same task. Further, the authors demonstrated that the topological structure of a network's fixed points indeed reveals the computational mechanism of the trained network.
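The fixed-point analysis behind this result can be sketched numerically in the style of Sussillo and Barak's "speed" minimization: treat the network update F as a map and minimize q(x) = 0.5 * ||F(x) - x||^2 from many initial states. Below, F is a toy random RNN standing in for a trained one; all sizes and thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Toy stand-in for a trained vanilla RNN update; in practice F would come
# from the trained model under study.
N = 64
W = rng.normal(0.0, 1.2 / np.sqrt(N), size=(N, N))
b = rng.normal(0.0, 0.1, size=N)

def F(x):
    return np.tanh(W @ x + b)

def q(x):
    """Speed function: minima with q close to 0 are approximate fixed points."""
    d = F(x) - x
    return 0.5 * d @ d

# Search from many states sampled near the network's attractor.
found = []
for _ in range(50):
    x0 = rng.normal(0.0, 0.5, size=N)
    for _ in range(20):          # run forward so starting points lie on trajectories
        x0 = F(x0)
    res = minimize(q, x0, method="L-BFGS-B")
    if res.fun < 1e-8:           # keep only (near-)exact fixed points
        found.append(res.x)

# Merge nearby solutions to count distinct fixed points; their number and
# arrangement are what the topological comparison across networks rests on.
distinct = []
for x in found:
    if all(np.linalg.norm(x - u) > 1e-3 for u in distinct):
        distinct.append(x)
print(f"found {len(distinct)} distinct fixed points")
```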
-
Router Bridging Configuration