JMLR Special Topic on Multi-Task Learning, Domain Adaptation and Transfer Learning

The participants of the NIPS Workshop on Transfer and Multi-Task Learning are strongly encouraged to submit a paper.

In recent years there has been an increase of activity in the areas of domain adaptation, transfer learning and multi-task learning. All three were conceived as ways to better exploit the data available at training time, often driven by the need to cope with a reduced amount of information, and they have grown rapidly in several directions with multiple applications. Today, new open research questions pose the challenge. On one hand, the literature lacks a joint theoretical framework covering all of them; instead, many theoretical formulations model regimes that are rarely used in practice (e.g. adaptive methods that store all the source samples). On the other hand, in the "big data" era, existing methods should be extended to handle large amounts of data that no longer lack in size but may lack in quality or may change continuously over time.

This special topic is intended to gather contributions that indicate new directions and innovative views, and to serve as an outlet for recent advances in learning in such environments. We welcome both theoretical advances in this field and detailed reports on applications. Topics of interest include:
Submission Procedure: Authors are kindly invited to follow the standard JMLR format and submission procedure. The number of pages is limited to 30. Please include a note stating that your submission is for the special topic on Multi-Task Learning, Domain Adaptation and Transfer Learning. For further details or inquiries, please contact the guest editors: mtldatl@gmail.com