Multitask Learning over Shared Subspaces
- Source :
- PLoS Computational Biology, Vol 17, Iss 7, p e1009092 (2021)
- Publication Year :
- 2020
- Publisher :
- Cold Spring Harbor Laboratory, 2020.
Abstract
- This paper uses constructs from machine learning to define pairs of learning tasks that either share or do not share a common subspace. Human subjects then learnt these tasks using a feedback-based approach, and we hypothesised that learning would be boosted for shared subspaces. Our findings broadly supported this hypothesis, with either better performance on the second task when it shared a subspace with the first, or positive correlations between task performances when the subspace was shared. These empirical findings were compared to the behaviour of a neural network model trained using sequential Bayesian learning, and human performance was found to be consistent with a minimal capacity variant of this model. Networks with an increased representational capacity, and networks without Bayesian learning, did not show these transfer effects. We propose that the concept of shared subspaces provides a useful framework for the experimental study of human multitask and transfer learning.
- Author summary : How does knowledge gained from previous experience affect learning of new tasks? This question of “Transfer Learning” has been addressed by teachers, psychologists, and, more recently, by researchers in the fields of neural networks and machine learning. Leveraging constructs from machine learning, we designed pairs of learning tasks that either shared or did not share a common subspace. We compared the dynamics of transfer learning in humans with those of a multitask neural network model, finding that human performance was consistent with a minimal capacity variant of the model. Learning was boosted in the second task if the same subspace was shared between tasks. Additionally, accuracy between tasks was positively correlated, but only when they shared the same subspace. Our results highlight the role of subspaces, showing how they can boost learning when shared and be detrimental when not.
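- To make the shared-subspace construct concrete, here is a minimal, illustrative sketch; it is not the authors' task design or model code, and all names, dimensions, and the simple logistic learner are assumptions made for illustration. It builds two linear classification tasks whose decision directions either lie in the same one-dimensional subspace of the input or in orthogonal subspaces, fits a classifier to the first task, and checks how well it transfers to each version of the second task.

```python
# Illustrative sketch only (not the paper's code): shared vs. non-shared
# subspaces for a pair of linear classification tasks.
import numpy as np

rng = np.random.default_rng(0)
d = 6  # input dimensionality (assumed for illustration)

def make_task(w, n=200):
    """Sample a linear binary task whose labels depend only on direction w."""
    X = rng.normal(size=(n, d))
    y = (X @ w > 0).astype(int)
    return X, y

# Orthonormal directions; each column spans a different 1-D subspace.
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
w_shared = Q[:, 0]   # subspace used by task A and the "shared" task B
w_other = Q[:, 1]    # orthogonal subspace for the "non-shared" control

X_A, y_A = make_task(w_shared)
X_B_shared, y_B_shared = make_task(w_shared)   # task B, shared subspace
X_B_control, y_B_control = make_task(w_other)  # task B, different subspace

def fit_logistic(X, y, lr=0.1, steps=500):
    """Plain gradient-descent logistic regression (no intercept)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(X, y, w):
    return np.mean((X @ w > 0).astype(int) == y)

w_hat = fit_logistic(X_A, y_A)  # learn task A only
print("transfer to shared-subspace task B :", accuracy(X_B_shared, y_B_shared, w_hat))
print("transfer to non-shared control task:", accuracy(X_B_control, y_B_control, w_hat))
```

- Under these assumptions, the classifier fitted to task A transfers near-perfectly to the shared-subspace version of task B but performs at roughly chance level on the orthogonal-subspace control, mirroring the boost-versus-no-boost contrast described in the abstract.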
- Subjects :
- Male
Computer science
Social Sciences
Multi-task learning
Machine Learning
Learning and Memory
Task Performance and Analysis
Human Performance
Psychology
Biology (General)
Ecology
Artificial neural network
Applied Mathematics
Simulation and Modeling
Multitasking Behavior
Linear subspace
Computational Theory and Mathematics
Modeling and Simulation
Physical Sciences
Female
Transfer of learning
Algorithms
Subspace topology
Research Article
Adult
Optimization
Computer and Information Sciences
Adolescent
Neural Networks
QH301-705.5
Research and Analysis Methods
Bayesian inference
Young Adult
Human Learning
Machine Learning Algorithms
Cellular and Molecular Neuroscience
Artificial Intelligence
Genetics
Humans
Learning
Molecular Biology
Ecology, Evolution, Behavior and Systematics
Behavior
Cognitive Psychology
Computational Biology
Biology and Life Sciences
Bayes Theorem
Cognitive Science
Neural Networks, Computer
Mathematics
Neuroscience
Details
- Database :
- OpenAIRE
- Journal :
- PLoS Computational Biology, Vol 17, Iss 7, p e1009092 (2021)
- Accession number :
- edsair.doi.dedup.....cd4cd465d3bd4233dccf450207a5d122