UPDATE:
With the latest advances in deep generative models, the synthesis of images, videos, and human voices has achieved impressive realism. In many domains, synthetic media are already difficult to distinguish from real ones by the human eye and ear. The potential for misuse of these technologies is seldom discussed in academic papers; instead, vocal concerns are being raised by media and security organizations as well as by governments. Researchers are starting to experiment with new ways to integrate deep learning with traditional media forensics and security techniques as part of a technological solution.
This workshop will bring together experts from the machine learning, computer security, and digital forensics communities in an attempt to highlight recent work and discuss future efforts to address these challenges.
Our agenda will alternate contributed papers with invited talks. The latter will emphasize connections among the interested scientific communities as well as the perspectives of institutions and media organizations.
The workshop will sponsor a best paper award and student travel awards.
The travel awards are not meant to cover travel expenses in full, but to meaningfully lower the barrier to attending the event. Our intention is to select students from diverse backgrounds and walks of life, so that the opportunity to learn from their distinct and complementary viewpoints is not limited by age, gender, academic level, etc.
To apply for a student travel award, please send an email to <g.patrini@deeptracelabs.com, asadhu@usc.edu> with the subject "[AVFakes19 Student Support] [ICML]" and the following:
i) a motivational letter explaining why you should receive financial support;
ii) a supporting letter from your academic supervisor.