Multi-task learning (MTL) is an approach to machine learning that learns a problem together with other related problems at the same time, using a shared representation.
Questions tagged [multitask-learning]
27 questions
12 votes, 3 answers
Difference between multitask learning and transfer learning
I am reading Caruana (1997), Multitask Learning (pdf). In the definition of multi-task learning, the author states that:
Usually, we do not care how well extra tasks are learned; their sole purpose is to help the main task be learned better.
This…

Rafael
10 votes, 5 answers
What is the difference between Multitask and Multiclass learning
Consider an image labeling problem, where I need to assign one or more labels to an image. The possible labels are human, moving, indoor. Human means there is a human in the picture, moving could mean whether the human is running/walking etc., and…

A.D
6 votes, 1 answer
How to define multiple losses in machine learning?
I'm using TensorFlow for training a CNN for classification. In machine learning, there are several different definitions of loss functions. In general, we may select one specific loss (e.g., binary cross-entropy loss for binary classification, hinge…

mining
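A minimal sketch of the pattern this question is about (hypothetical values, not the asker's code): two different losses can be computed separately and combined into one scalar objective that the optimizer minimizes, shown here in plain NumPy.

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """Mean binary cross-entropy over a batch."""
    p = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def hinge_loss(y_pm1, scores):
    """Mean hinge loss; labels are +/-1."""
    return np.mean(np.maximum(0.0, 1.0 - y_pm1 * scores))

# Hypothetical batch of binary labels, predicted probabilities, and margins
y = np.array([1.0, 0.0, 1.0, 1.0])
p = np.array([0.9, 0.2, 0.7, 0.6])
scores = np.array([2.0, -1.5, 0.3, 0.8])

# "Multiple losses" usually reduce to a (weighted) sum forming one scalar
total = 1.0 * binary_cross_entropy(y, p) + 0.5 * hinge_loss(2 * y - 1, scores)
```

In frameworks like TensorFlow the same idea applies: each loss is a tensor, and the training objective is their weighted sum.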
5 votes, 2 answers
Sampling from matrix-variate normal distribution with singular covariances?
The matrix-variate normal distribution can be sampled indirectly using the Cholesky decomposition of its two positive-definite covariance matrices. However, if one or both of the covariance matrices are positive semi-definite and not positive…

baf84b4c
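One workaround (a sketch, not an answer taken from the thread): replace the Cholesky factor with an eigendecomposition-based square root, which exists for singular positive semi-definite matrices as well. All matrices below are made up.

```python
import numpy as np

def psd_factor(S):
    """Return A with A @ A.T == S for symmetric PSD S.
    Works even when S is singular, unlike Cholesky."""
    w, Q = np.linalg.eigh(S)
    w = np.clip(w, 0.0, None)   # drop tiny negative eigenvalues from round-off
    return Q * np.sqrt(w)       # scale each eigenvector column by sqrt(eigenvalue)

def sample_matrix_normal(M, U, V, rng):
    """One draw X ~ MN(M, U, V) via X = M + A Z B^T with iid standard normal Z."""
    A, B = psd_factor(U), psd_factor(V)
    Z = rng.standard_normal((A.shape[1], B.shape[1]))
    return M + A @ Z @ B.T

rng = np.random.default_rng(0)
U = np.array([[1.0, 1.0], [1.0, 1.0]])  # rank-1: Cholesky would fail here
V = np.eye(3)
X = sample_matrix_normal(np.zeros((2, 3)), U, V, rng)
```

With the rank-1 row covariance above, the two rows of the sample are identical, as the degenerate distribution requires.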
4 votes, 2 answers
Multi-task learning: weight selection for combining loss functions
I am training a system that combines two sub-systems: one for classification and another for reconstruction. Can anyone suggest common practices for weight selection when combining two losses? The numerical values of the two losses for…

talk2speech
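A common heuristic (a sketch with made-up numbers; the variable names are hypothetical): rescale each loss by its initial magnitude so the two terms start at comparable size, then tune a single trade-off weight.

```python
import numpy as np

# Hypothetical loss values observed at the start of training
init_cls_loss, init_rec_loss = 0.7, 250.0

# Normalize each loss by its initial magnitude so both terms start near 1;
# lam is then the only hyperparameter governing the trade-off.
w_cls = 1.0 / init_cls_loss
w_rec = 1.0 / init_rec_loss
lam = 1.0

def total_loss(cls_loss, rec_loss):
    return w_cls * cls_loss + lam * w_rec * rec_loss

# At step 0 both terms contribute equally, so the total is 2.0
t0 = total_loss(init_cls_loss, init_rec_loss)
```

More principled schemes exist (e.g., learning the weights from per-task uncertainty), but magnitude normalization is a common first baseline.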
4 votes, 1 answer
Multi-task XGBoost
Is there a way to adapt the XGBoost algorithm to the multi-task case? Say there are related output variables and, for some samples, some of those outcomes are missing. Is there a way to train XGBoost so that it allows information sharing across the…

user5054
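One simple baseline, though not a true joint model: train one booster per output on the rows where that output is observed, sharing the feature matrix. Sketched here with scikit-learn's GradientBoostingRegressor as a stand-in for XGBoost so the example stays self-contained; the data is synthetic.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
# Two related targets; NaN marks a missing outcome for that sample
Y = np.stack([X[:, 0] + 0.1 * rng.standard_normal(200),
              X[:, 0] + X[:, 1]], axis=1)
Y[rng.random(200) < 0.3, 1] = np.nan

models = []
for t in range(Y.shape[1]):
    mask = ~np.isnan(Y[:, t])            # train each task only on labeled rows
    m = GradientBoostingRegressor(random_state=0)
    m.fit(X[mask], Y[mask, t])
    models.append(m)

preds = np.column_stack([m.predict(X) for m in models])
```

The tasks here share features but not trees; sharing structure across outputs within one booster would need a custom objective or a stacked-target formulation.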
3 votes, 1 answer
Classification followed by regression?
I have the following problem:
I have a dataset for which my observations have a bunch of features and a continuous response (regression problem). However, some of my observations (about a fourth of them) do not have a response. The features should…

Tom
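One reading of this setup (a sketch with synthetic data, not a recommendation from the thread): first classify whether an observation has a response at all, then regress only on the subset that does.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.standard_normal(300)
# Make missingness depend (noisily) on a feature; about a fourth lack a response
has_response = X[:, 2] + 0.5 * rng.standard_normal(300) > -0.7
y = np.where(has_response, y, np.nan)

# Stage 1: classify whether an observation has a response at all
clf = LogisticRegression().fit(X, has_response)
# Stage 2: regress on the observations that do
mask = ~np.isnan(y)
reg = LinearRegression().fit(X[mask], y[mask])

# Predict a value only where the classifier expects a response
pred = np.where(clf.predict(X), reg.predict(X), np.nan)
```

Whether the two stages should share parameters (a multi-task formulation) depends on whether the missingness mechanism is informative about the response.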
3 votes, 1 answer
Training N classifiers for N labels vs one classifier with N labels
I have a classification problem which is multi-label with N labels. I would like to know which method would be the better choice: training N classifiers (one for each label) or a single classifier which uses K-hot encoding. I'm using a neural network…

jvc
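A sketch of why the two options are closely related (hypothetical batch, plain NumPy): a single head with N sigmoid outputs trained with summed binary cross-entropy behaves like N per-label classifiers that happen to share their hidden representation.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def multilabel_bce(logits, khot, eps=1e-7):
    """Sum of N independent binary cross-entropies, averaged over the batch.
    Each output unit is effectively its own binary classifier."""
    p = np.clip(sigmoid(logits), eps, 1 - eps)
    per_sample = np.sum(khot * np.log(p) + (1 - khot) * np.log(1 - p), axis=1)
    return -np.mean(per_sample)

# Hypothetical batch: 2 samples, N = 3 labels, K-hot targets
khot = np.array([[1, 0, 1], [0, 1, 0]], dtype=float)
logits = np.array([[3.0, -2.0, 1.5], [-1.0, 2.5, -3.0]])
loss = multilabel_bce(logits, khot)
```

The practical difference is the shared trunk: one K-hot network lets labels reuse features, while N separate classifiers cannot, at N times the cost.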
2 votes, 2 answers
Parallel multi-task learning vs. continual learning
Assume we want to learn k tasks jointly, and the data for all tasks are available. We may either train a model with parallel multi-task learning (e.g., each batch is a mixture of samples from the k tasks), or present tasks sequentially (e.g., switch…

thinkbear
2 votes, 0 answers
How can I maximise binary cross entropy loss?
I have a multi-task learning model with two binary classification tasks. One part of the model creates a shared feature representation that is fed into two subnets in parallel. The loss function for each subnet at the moment is NLL, with a Softmax…

JM1982
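Since gradient-based optimizers only minimize, maximizing a loss just means minimizing its negation. A tiny NumPy sketch on a single hypothetical logit, with finite differences standing in for autograd:

```python
import numpy as np

def bce(y, p, eps=1e-7):
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

sigmoid = lambda z: 1 / (1 + np.exp(-z))
y = np.array([1.0])
w = 0.0  # single logit parameter, hypothetical

def objective(w):
    # Negated BCE: minimizing this *maximizes* the cross-entropy
    return -bce(y, sigmoid(np.array([w])))

# One finite-difference gradient-descent step on the negated loss
h, lr = 1e-5, 0.5
grad = (objective(w + h) - objective(w - h)) / (2 * h)
w_new = w - lr * grad
```

After the step, the BCE at `w_new` is higher than at the starting point, confirming the sign flip does what the question asks.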
2 votes, 0 answers
How to optimize the hyperparameters of a Deep Gaussian Process?
I am trying to understand an article from NIPS 2017 in which Alaa and van der Schaar create a Deep Multi-task Gaussian Process (DMGP) with competing risks. I don't grasp what they are trying to optimize.
In their Section 4 they present their inference…

Revolucion for Monica
1 vote, 1 answer
Multiple-domain adaptation vs multi-task learning
I am confused by the definitions of domain adaptation and multi-task learning.
I have K datasets, each with the same feature and label space and thus the same learning problem, but with a different domain P(x, y).
For each dataset, I would like to…

Tim D
1 vote, 0 answers
When does multi-task learning make more sense than multi-label classification?
As part of writing a book on machine learning, I am creating an extreme multi-label Stack Overflow question tagger for thousands of tags with varying numbers of training examples, and I've approached it as a multi-label classification problem after…

rjurney
1 vote, 0 answers
Covariance matrix of a hierarchical multi-task Gaussian process
I'm currently trying to develop a Gaussian process to predict different levels for different individuals over time.
So it is a time-regression problem in which we have multiple tasks, but also multiple individuals... In other words, the shape of my…

Tbertin
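One common construction for this kind of covariance (a sketch, not necessarily the one intended in the question): a separable Kronecker product of an individual kernel, a task kernel, and a time kernel. All kernel matrices below are made up.

```python
import numpy as np

def rbf(x, ls=1.0):
    """Squared-exponential kernel over a 1-D input grid."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

# Hypothetical sizes: 3 individuals, 2 tasks, 5 time points
K_indiv = 0.5 * np.eye(3) + 0.5               # similarity between individuals
K_task = np.array([[1.0, 0.6], [0.6, 1.0]])   # inter-task covariance
K_time = rbf(np.linspace(0.0, 1.0, 5))

# Separable covariance over (individual, task, time):
# a (3*2*5) x (3*2*5) matrix, PSD because each factor is PSD.
K = np.kron(K_indiv, np.kron(K_task, K_time))
```

The Kronecker structure is what makes the "hierarchy" tractable: inference can exploit the factors instead of the full 30 x 30 (or much larger) matrix.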
1 vote, 1 answer
Robust machine learning for slightly different class proportions in multiple data sets
Say we have n similar data sets with the same variables and outcome labels x and y. In these data sets, the domains differ slightly, as suggested by the proportion of the minority class x (ranging from 1%-15%).
How can we develop a robust ML algorithm…

sluijs