Trust 2017

The 19th edition of this international workshop will be co-located with AAMAS 2017, the 16th International Conference on Autonomous Agents and Multiagent Systems, held in São Paulo, Brazil, on May 8-12, 2017.

Latest News

  • 22 experts in the field have joined the Program Committee!

Description

Trust is important in many kinds of interaction, including direct and computer-mediated human interaction, human-computer interaction, and interaction among social agents; it characterizes the elements that are essential to social reliability. It also informs the selection of partners for successful multiagent coordination (for example, in robotics applications). Trust is more than communication that is robust against repudiation or interference: the reliability of information about the status of a trading partner, for example, depends only partly on secure communication.

With the growing prevalence of social interaction through electronic means, trust, reputation, privacy and identity become increasingly important. Trust is not a simple, monolithic concept; it is multi-faceted, operating at many levels of interaction and playing many roles. Another growing trend is the use of reputation mechanisms, and in particular the link between trust and reputation. Many computational and theoretical models and approaches to reputation have been developed in recent years (for e-commerce, social networks, blogs, etc.). Further, identity and the trustworthiness associated with it must be ascertained for reliable interactions and transactions. Trust is foundational for the notion of agency and for its defining relation of acting "on behalf of". It is also critical for modeling and supporting groups and teams, for both organization and coordination, with the related trade-off between individual utility and collective interest. The electronic medium seems to weaken the usual bonds of social control, and the disposition to cheat grows stronger; this is yet another context where trust modeling is critical.

The aim of the workshop is to bring together researchers (ideally from different disciplines) who can contribute to a better understanding of trust and reputation in agent societies. We welcome submissions of high-quality research addressing issues that are clearly relevant to trust, deception, privacy, reputation, security and control in agent-based systems, from theoretical, applied and interdisciplinary perspectives. Submitted contributions should be original and not submitted elsewhere. Papers accepted for presentation must be relevant to the workshop and must demonstrate clear exposition, offering new ideas in suitable depth and detail.

The scope of the workshop includes (but is not limited to):

  • Trust and risk-aware decision making
  • Game-theoretic models of trust
  • Deception and fraud, and their detection and prevention
  • Intrusion resilience in trusted computing
  • Reputation mechanisms
  • Trust in socio-technical systems
  • Trust in partners and in authorities
  • Trust in agent coordination and negotiation
  • Privacy and access control in multiagent systems
  • Trust and information provenance
  • Detecting and preventing collusion
  • Trust in human-agent interaction
  • Trust and identity
  • Trust within organizations
  • Trust, security and privacy in social networks
  • Trustworthy infrastructures and services
  • Trust modeling for real-world applications