Articles

Itamar Viana's Publications

    • Learning Accurate and Interpretable Classifiers Using Optimal Multi-Criteria Rules, Journal of Information and Data Management, Vol 4, No 3 (2013)
      • Authors: Itamar Hata, Adriano Veloso, Nivio Ziviani
        • The Occam's Razor principle has become the basis for many Machine Learning algorithms, under the interpretation that the classifier should not be more complex than necessary. Recently, this principle has been shown to be well suited to associative classifiers, where the number of rules composing the classifier can be substantially reduced by using condensed representations such as maximal or closed rules. While such a decrease in the complexity of the classifier (usually) does not compromise its accuracy, the number of remaining rules is still larger than necessary, making it hard for experts to interpret the corresponding classifier. In this paper we propose a much more aggressive filtering strategy, which decreases the number of rules within the classifier dramatically without hurting its accuracy. Our strategy consists of evaluating each rule under different statistical criteria and keeping only those rules that show a positive balance between all the criteria considered. Specifically, each candidate rule is associated with a point in an n-dimensional scattergram, where each coordinate corresponds to a statistical criterion. Points that are not dominated by any other point in the scattergram compose the Pareto frontier, and correspond to rules that are optimal in the sense that no other rule is better when all the criteria are taken into account. Finally, the rules lying on the Pareto frontier are selected and compose the classifier. Our Pareto-optimal filtering strategy may receive as input either the entire set of rules or a condensed representation (i.e., closed rules). A systematic set of experiments involving benchmark data as well as recent data from actual application scenarios, followed by an extensive set of significance tests, reveals that the proposed strategy decreases the number of rules by up to two orders of magnitude and produces classifiers that are extremely readable (i.e., that allow interpretation of the classification results) without hurting accuracy.
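The Pareto-frontier selection described in this abstract can be illustrated with a short sketch (an illustrative reconstruction, not the authors' code; the rule names and the two example criteria, confidence and support, are made up):

```python
def pareto_front(points):
    """Return the points not dominated by any other point,
    where every coordinate is a criterion to be maximized."""
    front = []
    for p in points:
        # p is dominated if some other point is >= in every criterion.
        dominated = any(
            q != p and all(q[i] >= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical rules scored under two criteria: (confidence, support).
rules = {
    "r1": (0.95, 0.10),   # very confident, but rare
    "r2": (0.70, 0.40),   # less confident, but frequent
    "r3": (0.60, 0.30),   # dominated by r2 on both criteria
}
front = pareto_front(list(rules.values()))
kept = [name for name, crit in rules.items() if crit in front]
# r3 is filtered out; r1 and r2 survive as the Pareto frontier.
```

Only the non-dominated rules compose the classifier; every dominated rule is discarded because some other rule is at least as good under every criterion.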
    • Multi-Objective Pareto-Efficient Approaches for Recommender Systems, ACM Transactions on Intelligent Systems and Technology, 2014, to appear.
      • Authors: Ribeiro, M.T., Lacerda, A., Moura, E.S., Hata, I., Veloso, A. and Ziviani, N.
        • In this paper we propose new approaches for multi-objective recommender systems based on the concept of Pareto-efficiency: a state achieved when the system is devised in the most efficient manner, in the sense that there is no way to improve one of the objectives without making some other objective worse off. Given that existing multi-objective recommendation algorithms differ in their levels of accuracy, diversity and novelty, we exploit the Pareto-efficiency concept in two distinct manners: (i) the aggregation of ranked lists produced by existing algorithms into a single one, which we call Pareto-efficient ranking, and (ii) the weighted combination of existing algorithms resulting in a hybrid one, which we call Pareto-efficient hybridization. Our evaluation involves two real application scenarios: music recommendation with implicit feedback (i.e., Last.fm) and movie recommendation with explicit feedback (i.e., MovieLens). We show that the proposed Pareto-efficient approaches are effective in suggesting items that are likely to be simultaneously accurate, diverse and novel. We discuss scenarios where the system achieves high levels of diversity and novelty without compromising its accuracy. Further, comparison against multi-objective baselines reveals improvements in terms of accuracy (from 10.4% to 10.9%), novelty (from 5.7% to 7.5%), and diversity (from 1.6% to 4.2%).
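The first approach, Pareto-efficient ranking, can be sketched by repeatedly peeling Pareto fronts off the items' per-objective score vectors (a simplified illustration, not the published algorithm; the objective scores and the sum-of-scores tie-break inside a front are assumptions):

```python
def pareto_efficient_ranking(scores):
    """Rank items by successive Pareto fronts of their objective
    scores (higher is better in every objective): items in the
    first front come first, then the front of the remainder, etc."""
    remaining = dict(scores)
    ranking = []
    while remaining:
        front = [
            item for item, p in remaining.items()
            if not any(
                q != p and all(q[k] >= p[k] for k in range(len(p)))
                for q in remaining.values()
            )
        ]
        # Order inside a front by total score -- an arbitrary tie-break.
        front.sort(key=lambda item: sum(remaining[item]), reverse=True)
        ranking.extend(front)
        for item in front:
            del remaining[item]
    return ranking

# Hypothetical (accuracy, novelty) scores from two base recommenders.
scores = {"a": (1.0, 0.2), "b": (0.5, 0.9), "c": (0.4, 0.1)}
# "c" is dominated by "b", so it falls to the last position.
```

The loop always terminates because the item with the highest total score can never be dominated, so each front is non-empty.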
    • Modelagem de Desempenho de Plataformas Servidoras Multi-camadas, SBRC, March 1, 2009
      • Authors: Itamar Viana, João Palotti, Genaína Rodrigues, Jussara Almeida, Virgílio Almeida
        • Performance modeling is a central task in the capacity management of server platforms. Traditional performance models are built for transactional workloads (purely open models) or for batch and interactive workloads (purely closed models). However, many real Web applications experience session-based workloads that exhibit a partially-open behavior, which includes components of both the open and the closed models,... more
    • Analisando os Compromissos de Segurança e Energia no Gerenciamento Autonômico de Capacidade, SBRC, May 1, 2008
      • Authors: Itamar Viana, João Palotti, Ítalo Cunha, Jussara Almeida, Virgílio Almeida
        • Capacity management of a hosting infrastructure has traditionally focused on performance issues. However, the quality of service offered to the hosted applications and the provider's profit also depend on other aspects, such as security and energy constraints. This paper extends our self-adaptive capacity management solution to capture the tradeoffs... more
    • Analyzing Security and Energy Tradeoffs in Autonomic Capacity Management, IEEE/IFIP NOMS, April 1, 2008
      • Authors: Itamar Viana, Ítalo Cunha, João Palotti, Jussara Almeida, Virgílio Almeida
        • Capacity management of a hosting infrastructure has traditionally focused only on performance goals. However, the quality of service provided to the hosted applications, and ultimately the revenues achieved by the provider, depend also on other aspects, such as security and energy constraints. This paper extends our self-adaptive SLA-driven capacity management solution to capture, in an... more
    • Monografia de Projeto Orientado em Computação I - Desenvolvimento e Avaliação de Modelos Analíticos para Servidores Web Multi-Camadas com Diferentes Níveis de Multiprogramação por Camada
    • Analytical performance modeling techniques are important for carrying out the capacity management and capacity planning of corporations with large computational infrastructures. In this monograph we designed and evaluated a performance model for multilayer Web systems. While these systems are usually modeled under purely open or purely closed workloads, certain Web applications, such as e-commerce, are believed to fit better a semi-open workload model based on user sessions. Moreover, our models also took into account the multiprogramming-level limit of each layer.
      • Keywords: queuing theory, performance analysis, Web multilayer platform.
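The queueing-theoretic core behind such multilayer models can be illustrated with exact Mean Value Analysis for a single-class closed network (a textbook building block, not the monograph's model, which additionally handles per-layer multiprogramming limits; the service demands below are made-up numbers):

```python
def mva(demands, n_users, think_time=0.0):
    """Exact Mean Value Analysis for a single-class closed
    product-form queueing network.
    demands: mean service demand D_k (seconds) at each station.
    Returns (throughput, per-station mean queue lengths)."""
    queues = [0.0] * len(demands)          # Q_k(0) = 0
    throughput = 0.0
    for n in range(1, n_users + 1):
        # Residence time at station k: R_k(n) = D_k * (1 + Q_k(n-1)).
        resid = [d * (1.0 + q) for d, q in zip(demands, queues)]
        throughput = n / (think_time + sum(resid))
        queues = [throughput * r for r in resid]   # Little's law
    return throughput, queues

# Two layers (e.g. Web tier and database tier) with made-up demands.
x, q = mva(demands=[0.05, 0.02], n_users=50, think_time=1.0)
# Throughput is bounded above by 1 / max(demands) = 20 requests/s.
```

Iterating the recurrence from 1 to N users gives the exact throughput and queue lengths; the bottleneck station (largest demand) caps the achievable throughput.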
    • Monografia de Projeto Orientado em Computação II - Caracterização da Carga de Trabalho Gerada por Usuários de Um Portal Web 2.0
    • The Web 2.0 can be divided into three parts: content producers (e.g., media groups, companies, users), content aggregators (e.g., content management platforms, content delivery networks - CDNs) and content consumers (e.g., users). Content aggregators are responsible for managing, storing, capturing and delivering content in the best possible way. Aggregators allow content producers to focus on their own activity, leaving the digital logistics to specialized companies. In this work we presented the characterization of the workload generated by the users of a Web portal from the perspective of the content aggregators, in this case a logistics company and a CDN. In the results we identified three user profiles. The most common profile was a user who enters the portal, requests a video, watches all or part of it, and ends the session. The second profile was characterized by browsing activity. The third profile showed a high transfer rate, generated the most traffic, and corresponded to the users who remained the longest on the site.