Parra Arnau, Javier
Total activity: 23
Department
Department of Telematics Engineering
E-mail
javier.parra@estudiant.upc.edu
Contact details
UPC directory

Scientific and technological production
  • Privacy-preserving enhanced collaborative tagging

     Parra Arnau, Javier; Perego, Andrea; Ferrari, Elena; Forné, Jordi; Rebollo Monedero, David
    IEEE transactions on knowledge and data engineering
    Date of publication: 2014-01-01
    Journal article

    Collaborative tagging is one of the most popular services available online, and it allows end users to loosely classify either online or offline resources based on their feedback, expressed in the form of free-text labels (i.e., tags). Although tags are not per se sensitive information, the wide use of collaborative tagging services increases the risk of cross-referencing, thereby seriously compromising user privacy. In this paper, we make a first contribution towards mitigating this risk by showing how a specific privacy-enhancing technology, namely tag suppression, can be used to protect end-user privacy. Moreover, we analyze how our approach can affect the effectiveness of a policy-based collaborative tagging system which supports enhanced Web access functionalities, like content filtering and discovery, based on preferences specified by end users.
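
    A minimal sketch of the tag-suppression idea, not the authors' implementation: given a user's tag histogram over a handful of interest categories and the population's distribution, greedily drop tags from the most over-represented categories until a target suppression rate is reached, then check that the divergence to the population decreases. All names and the toy figures below are illustrative assumptions.

    import math

    def kl_divergence(p, q):
        # KL divergence D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0
        return sum(pi * math.log2(pi / q[i]) for i, pi in enumerate(p) if pi > 0)

    def suppress_tags(user_counts, population_dist, suppression_rate):
        # Greedily drop tags from the category most over-represented with
        # respect to the population until `suppression_rate` of the user's
        # tags have been removed; returns the resulting (apparent) profile.
        counts = list(user_counts)
        budget = int(suppression_rate * sum(counts))
        for _ in range(budget):
            total = sum(counts)
            ratios = [(counts[i] / total) / population_dist[i] if counts[i] else 0.0
                      for i in range(len(counts))]
            counts[ratios.index(max(ratios))] -= 1
        total = sum(counts)
        return [c / total for c in counts]

    # toy example with four tag categories
    user = [40, 30, 20, 10]
    population = [0.25, 0.25, 0.25, 0.25]
    before = [c / sum(user) for c in user]
    after = suppress_tags(user, population, 0.2)
    print(kl_divergence(before, population), kl_divergence(after, population))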

  • Measuring the privacy of user profiles in personalized information systems

     Parra Arnau, Javier; Rebollo Monedero, David; Forné, Jordi
    Future generation computer systems
    Date of publication: 2014-04-01
    Journal article

    Personalized information systems are information-filtering systems that endeavor to tailor information-exchange functionality to the specific interests of their users. The ability of these systems to profile users is, on the one hand, what enables such intelligent functionality, but on the other, the source of innumerable privacy risks. In this paper, we justify and interpret KL divergence as a criterion for quantifying the privacy of user profiles. Our criterion, which emerged from previous work in the domain of information retrieval, is here thoroughly examined by adopting the perspective of the method of types and large deviation theory, and under the assumption of two distinct adversary models. In particular, we first elaborate on the intimate connection between Jaynes' celebrated method of entropy maximization and the use of entropies and divergences as measures of privacy; and secondly, we interpret our privacy metric in terms of false positives and negatives in a binary hypothesis test.
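
    For reference, the divergence criterion mentioned above can be written as follows, with p the user's (apparent) profile over n interest categories and q the population's profile; this is a standard rendering of KL divergence, not necessarily the paper's exact notation:

    % Kullback-Leibler divergence between user profile p and population profile q
    D(p \,\|\, q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
    % When q is uniform, D(p || q) = \log n - H(p), so minimizing the divergence
    % (i.e., maximizing privacy under this criterion) amounts to maximizing the
    % Shannon entropy H(p) of the apparent profile, which is the link to Jaynes'
    % rationale cited in the abstract.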

  • A collaborative protocol for anonymous reporting in vehicular ad hoc networks

     Tripp Barba, Carolina; Urquiza Aguiar, Luis; Aguilar Igartua, Mónica; Parra Arnau, Javier; Rebollo Monedero, David; Forné, Jordi; Pallares Segarra, Esteve
    Computer Standards & Interfaces
    Date of publication: 2013-11-01
    Journal article

    Vehicular ad hoc networks (VANETs) have emerged to leverage the power of modern communication technologies, applied to both vehicles and infrastructure. Allowing drivers to report traffic accidents and violations through the VANET may lead to substantial improvements in road safety. However, the ability to do so anonymously, in order to avoid personal and professional repercussions, will undoubtedly be key to user acceptance. The main goal of this work is to propose a new collaborative protocol for enforcing anonymity in multi-hop VANETs, closely inspired by the well-known Crowds protocol. In a nutshell, our anonymous-reporting protocol depends on a forwarding probability that determines whether the next forwarding step in message routing is random, for better anonymity, or in accordance with the routing protocol on which our approach builds, for better quality of service (QoS). Unlike Crowds, our protocol is specifically conceived for multi-hop lossy wireless networks. Simulations for residential and downtown areas support and quantify the usefulness of our collaborative strategy, which achieves better anonymity when users are willing to pay an eminently reasonable price in QoS.
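
    A minimal sketch of the forwarding rule described above (the function name and parameters are assumptions, not the authors' implementation): with probability p_forward the message is relayed to a random neighbour, favouring anonymity; otherwise it follows the hop suggested by the underlying routing protocol, favouring QoS.

    import random

    def next_hop(neighbors, routing_hop, p_forward):
        # One forwarding decision of a Crowds-like protocol:
        # random relay with probability p_forward (anonymity),
        # routing-protocol hop otherwise (quality of service).
        if random.random() < p_forward:
            return random.choice(neighbors)
        return routing_hop

    # toy usage: three reachable neighbours, routing protocol suggests 'B'
    print(next_hop(['A', 'B', 'C'], 'B', p_forward=0.3))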

  • Medición de la privacidad de perfiles de usuario mediante un add-on de navegador

     Estrada, Jose; Rodríguez, Ana; Parra Arnau, Javier; Forné, Jordi; Rebollo Monedero, David
    Jornadas de Ingeniería Telemática
    Presentation's date: 2013-10-29
    Presentation of work at congresses

    Nowadays, users are permanently monitored on the Internet, and the information gathered in this process is of enormous interest to large advertising companies and even governments. Moreover, the vast amount of data that personalized information systems can collect poses a serious risk to user privacy on the Internet. Perhaps even more critical is that many users are unaware of this risk, since it is not as evident as in the physical world. In this paper we present a browser add-on that estimates the privacy risk of the profile of a user who, owing to their browsing habits, is exposed to profiling mechanisms on the Internet. The risk level is displayed in a clear, accessible way in the browser's graphical interface, and it is computed taking several adversary models into account.

  • On collaborative anonymous communications in lossy networks

     Rebollo Monedero, David; Forné, Jordi; Pallares Segarra, Esteve; Parra Arnau, Javier; Tripp Barba, Carolina; Urquiza Aguiar, Luis; Aguilar Igartua, Mónica
    Security and Communication Networks
    Date of publication: 2013-05-17
    Journal article

    Message encryption does not prevent eavesdroppers from unveiling who is communicating with whom, when, or how frequently, a privacy risk to which wireless networks are particularly vulnerable. The Crowds protocol, a well-established anonymous communication system, capitalizes on user collaboration to enforce sender anonymity. This work formulates a mathematical model of a Crowds-like protocol for anonymous communication in a lossy network, establishes quantifiable metrics of anonymity and quality of service (QoS), and theoretically characterizes the trade-off between them. The anonymity metric chosen follows the principle of measuring privacy as an attacker's estimation error. By introducing losses, we extend the applicability of the protocol beyond its original proposal. We quantify the intuition that anonymity comes at the expense of both delay and end-to-end losses. Aside from introducing losses in our model, another main difference with respect to the traditional Crowds is the focus on networks with stringent QoS requirements, for best-effort anonymity, and the consequent elimination of the initial forwarding step. Beyond the mathematical solution, we illustrate a systematic methodology in our analysis of the protocol. This methodology includes a series of formal steps, from the establishment of quantifiable metrics all the way to the theoretical study of the privacy-QoS trade-off.

  • A modification of the Lloyd algorithm for k-anonymous quantization

     Rebollo Monedero, David; Forné, Jordi; Pallares Segarra, Esteve; Parra Arnau, Javier
    Information sciences
    Date of publication: 2013
    Journal article

  • Privacy protection of user profiles in personalized information systems.

     Parra Arnau, Javier
    Defense's date: 2013-12-02
    Universitat Politècnica de Catalunya
    Theses

    We are currently witnessing the emergence of a wide variety of information systems that tailor their information-exchange functionality to the specific interests of their users. Many of these personalized information systems rely on user profiles, either specified by the users themselves or inferred from their past activity. The ability of these systems to profile users is therefore what enables such intelligent functionality, but at the same time it is the source of serious privacy problems.
    Although a wide range of technologies is available to mitigate some of these problems, their use is far from widespread. The main reason is the uncertainty surrounding these technologies and their effectiveness in terms of privacy protection. Moreover, since these technologies normally come at the expense of system functionality and utility, it is crucial to assess whether the gain in privacy compensates for the cost in utility. Measuring the privacy provided by a given technology is therefore essential to determine its overall benefit, to compare its effectiveness with that of other technologies, and to optimize it in terms of the privacy-utility trade-off it poses.
    Considerable effort has been devoted to investigating both privacy and utility metrics. However, many of these metrics are tied to specific scenarios and adversary models, and are therefore difficult to generalize or translate to other contexts. Furthermore, in applications that lend themselves to user profiling, there are few proposals for evaluating privacy, and the existing ones are either not properly justified or simply fail to justify the choice made.
    The first part of this thesis addresses the problem of quantifying user privacy. First, we present a theoretical framework for privacy-preserving systems, endowed with a unifying view of privacy in terms of the estimation error incurred by an attacker whose goal is to disclose the private information that the system is designed to conceal. Our theoretical analysis shows that a large number of privacy metrics arising in diverse applications are bijectively related to this estimation error, which allows these metrics to be interpreted and compared under a common perspective.
    Secondly, we study how to measure privacy in personalized information systems. Specifically, we propose two profile-privacy metrics and justify them on the basis of, on the one hand, Jaynes' rationale behind entropy-maximization methods and, on the other, fundamental results from the method of types and hypothesis testing.
    Equipped with quantifiable measures of privacy and utility, the second part of this thesis investigates data-perturbation mechanisms for privacy enhancement in two kinds of personalized information systems. In particular, we study tag suppression in semantic Web applications, and the combination of rating forgery and rating suppression in recommendation systems. We design these mechanisms to attain the optimal privacy-utility trade-off, in the sense of maximizing privacy for a desired utility level, or vice versa. We proceed systematically, drawing on the methodology of multiobjective optimization. Our theoretical analysis finds closed-form solutions to the problem of optimal tag suppression and to the problem of optimal rating forgery and suppression. In addition, we provide an extensive theoretical characterization of the privacy-utility trade-off. Experiments carried out on real-world applications show the effectiveness of our mechanisms in terms of privacy protection, system functionality and data utility.

  • Privacy protection of user profiles in personalized information systems

     Parra Arnau, Javier; Forné, Jordi; Rebollo Monedero, David
    Forum PhD Research Information and Communication
    Presentation's date: 2012-10-15
    Presentation of work at congresses

  • Optimal tag suppression for privacy protection in the semantic Web

     Parra Arnau, Javier; Rebollo Monedero, David; Forné, Jordi; Muñoz Tapia, Jose Luis; Esparza Martin, Oscar
    Data and knowledge engineering
    Date of publication: 2012
    Journal article

  • A privacy-protecting architecture for collaborative filtering via forgery and suppression of ratings

     Parra Arnau, Javier; Rebollo Monedero, David; Forné, Jordi
    Lecture notes in computer science
    Date of publication: 2012
    Journal article

    Recommendation systems are information-filtering systems that help users deal with information overload. Unfortunately, current recommendation systems prompt serious privacy concerns. In this work, we propose an architecture that protects user privacy in such collaborative-filtering systems, in which users are profiled on the basis of their ratings. Our approach capitalizes on the combination of two perturbative techniques, namely the forgery and the suppression of ratings. In our scenario, users rate those items they have an opinion on. However, in order to avoid privacy risks, they may want to refrain from rating some of those items, and/or rate some items that do not reflect their actual preferences. On the other hand, forgery and suppression may degrade the quality of the recommendation system. Motivated by this, we describe the implementation details of the proposed architecture and present a formulation of the optimal trade-off among privacy, forgery rate and suppression rate. Finally, we provide a numerical example that illustrates our formulation.
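
    A rough illustration of how forgery and suppression reshape the profile observed by the recommender (a sketch under assumed per-category strategies; the paper formulates the optimal trade-off, which this does not solve):

    def apparent_profile(user_dist, forgery, suppression):
        # `user_dist` is the genuine rating profile across categories (sums to 1);
        # `forgery` and `suppression` are per-category amounts whose totals are the
        # forgery rate rho and the suppression rate sigma, respectively.
        raw = [u + f - s for u, f, s in zip(user_dist, forgery, suppression)]
        clamped = [max(r, 0.0) for r in raw]
        total = sum(clamped)  # roughly 1 + rho - sigma
        return [c / total for c in clamped]

    # toy example: forge 20% extra ratings in category 3, suppress 10% from category 1
    user = [0.6, 0.3, 0.1]
    print(apparent_profile(user, forgery=[0.0, 0.0, 0.2], suppression=[0.1, 0.0, 0.0]))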

  • A privacy-protecting architecture for recommendation systems via the suppression of ratings

     Parra Arnau, Javier; Rebollo Monedero, David; Forné, Jordi
    International journal of security and its applications
    Date of publication: 2012-04
    Journal article

    Recommendation systems are information-filtering systems that help users deal with information overload. Unfortunately, current recommendation systems prompt serious privacy concerns. In this work, we propose an architecture that enables users to enhance their privacy in those systems that profile users on the basis of the items rated. Our approach capitalizes on a conceptually simple perturbative technique, namely the suppression of ratings. In our scenario, users rate those items they have an opinion on. However, in order to avoid being accurately profiled, they may want to refrain from rating certain items. Consequently, this technique protects user privacy to a certain extent, but at the cost of a degradation in the accuracy of the recommendation. We measure privacy risk as the Kullback-Leibler divergence between the user's and the population's rating distribution, a privacy criterion that we proposed in previous work. The justification of such a criterion is our second contribution. Concretely, we thoroughly interpret it by elaborating on the intimate connection between the celebrated method of entropy maximization and the use of entropies and divergences as measures of privacy. The ultimate purpose of this justification is to attempt to bridge the gap between the privacy and the information-theoretic communities by substantially adapting some technicalities of our original work to reach a wider audience, not intimately familiar with information theory and the method of types. Lastly, we present a formulation of the optimal trade-off between privacy and suppression rate, which allows us to formally specify one of the functional blocks of the proposed architecture.

  • Hierarchical categorisation of tags for delicious  Open access

     Parra Arnau, Javier; Perego, Andrea; Ferrari, Elena; Forné, Jordi; Rebollo Monedero, David
    Date: 2012
    Report

    In the scenario of social bookmarking, a user browsing the Web bookmarks web pages and assigns free-text labels (i.e., tags) to them according to their personal preferences. In this technical report, we address one of the practical aspects of representing users' interests from their tagging activity, namely the categorisation of tags into high-level categories of interest. The reason is that representing user profiles on the basis of the myriad of tags available on the Web is unfeasible from several practical perspectives, mainly the unavailability of data to reliably and accurately measure interests across such a fine-grained categorisation and, should the data be available, the overwhelming computational intractability involved. Motivated by this, our study presents the results of a categorisation process whereby a collection of tags posted at Delicious (http://delicious.com) are classified into 200 subcategories of interest.
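
    A toy sketch of the categorisation step described above, with an entirely hypothetical tag-to-category mapping (the report's actual mapping covers 200 subcategories of interest):

    from collections import Counter

    # hypothetical fragment of a tag-to-category mapping
    TAG_TO_CATEGORY = {
        "python": "programming",
        "recipes": "cooking",
        "marathon": "sports",
        "privacy": "security",
        "crypto": "security",
    }

    def profile_from_tags(tags):
        # Build a normalised category histogram from raw tags,
        # ignoring tags not covered by the mapping.
        counts = Counter(TAG_TO_CATEGORY[t] for t in tags if t in TAG_TO_CATEGORY)
        total = sum(counts.values())
        return {cat: c / total for cat, c in counts.items()}

    print(profile_from_tags(["python", "crypto", "privacy", "recipes", "tagnotmapped"]))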

  • Categorization of bibsonomy tags to apply privacy-preserving mechanisms.  Open access

     Parra Arnau, Javier; Forné, Jordi; Rebollo Monedero, David
    Date: 2012-07-24
    Report

    In this technical report, we address one of the practical aspects of representing users' interests from their tagging activity, namely the categorization of tags into high-level categories of interest. The reason is that representing user profiles on the basis of the myriad of tags available on the Web is unfeasible from several practical perspectives, mainly the unavailability of data to reliably and accurately measure interests across such a fine-grained categorization and, should the data be available, the overwhelming computational intractability involved. Motivated by this, our study presents the results of a categorization process whereby a collection of tags posted at BibSonomy (http://www.bibsonomy.org) are classified into 5 categories of interest. The methodology used to conduct this categorization is in line with other works in the field.

  • On the measurement of privacy as an attacker's estimation error  Open access

     Rebollo Monedero, David; Parra Arnau, Javier; Diaz, Claudia; Forné, Jordi
    International journal of information security
    Date of publication: 2012
    Journal article

    A wide variety of privacy metrics have been proposed in the literature to evaluate the level of protection offered by privacy-enhancing technologies. Most of these metrics are specific to concrete systems and adversarial models, and are difficult to generalize or translate to other contexts. Furthermore, a better understanding of the relationships between the different privacy metrics is needed to enable a more grounded and systematic approach to measuring privacy, as well as to assist system designers in selecting the most appropriate metric for a given application. In this work we propose a theoretical framework for privacy-preserving systems, endowed with a general definition of privacy in terms of the estimation error incurred by an attacker who aims to disclose the private information that the system is designed to conceal. We show that our framework permits interpreting and comparing a number of well-known metrics under a common perspective. The arguments behind these interpretations are based on fundamental results related to the theories of information, probability and Bayes decision.
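
    As an assumption-laden illustration of the kind of definition referred to above (the paper's exact formulation may differ), privacy can be expressed as the estimation error of the best attacker, with X the private value, Y the attacker's observation and d a distortion function:

    % privacy as the (minimum) estimation error incurred by the attacker
    \mathcal{P} = \min_{\hat{x}(\cdot)} \; \mathbb{E}\, d\big(X, \hat{x}(Y)\big)
    % choosing d as the Hamming loss yields the attacker's probability of error;
    % choosing d as the squared error yields the minimum mean-squared estimation error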

  • An information-theoretic privacy criterion for query forgery in information retrieval

     Rebollo Monedero, David; Parra Arnau, Javier; Forné, Jordi
    International Conference on Security Technology
    Presentation's date: 2011-12
    Presentation of work at congresses

  • BEST PAPER AWARD OF SECTECH 2011

     Rebollo Monedero, David; Parra Arnau, Javier; Forné, Jordi
    Award or recognition

  • Continuity of Service, Security and QoS for Transportation Systems

     Rebollo Monedero, David; Pallares Segarra, Esteve; Aguilar Igartua, Mónica; Parra Arnau, Javier; Tripp Barba, Carolina; Forné, Jordi
    Participation in a competitive project

  • Hierarchical categorisation of web tags for Delicious  Open access

     Parra Arnau, Javier; Perego, Andrea; Ferrari, Elena; Forné, Jordi; Rebollo Monedero, David
    Date: 2011-11-11
    Report

    In the scenario of social bookmarking, a user browsing the Web bookmarks web pages and assigns free-text labels (i.e., tags) to them according to their personal preferences. The benefits of social tagging are clear – tags enhance Web content browsing and search. However, since these tags may be publicly available to any Internet user, a privacy attacker may collect this information and extract an accurate snapshot of users’ interests or user profiles, containing sensitive information, such as health-related information, political preferences, salary or religion. In order to hinder attackers in their efforts to profile users, this report focuses on the practical aspects of capturing user interests from their tagging activity. More accurately, we study how to categorise a collection of tags posted by users in one of the most popular bookmarking services, Delicious (http://delicious.com).

  • Un Criterio de Privacidad Basado en Teoría de la Información para la Generación de Consultas Falsas  Open access

     Parra Arnau, Javier; Rebollo Monedero, David; Forné, Jordi
    Reunión Española sobre Criptología y Seguridad de la Información
    Presentation's date: 2010-09
    Presentation of work at congresses

    In this paper we present an information-theoretic privacy criterion for query forgery in the context of private information retrieval. We measure privacy risk as the Kullback-Leibler divergence between the user's query distribution and the population's, which includes the entropy of the user's distribution as a special case. In addition, we provide a thorough justification of our metric by interpreting it from several information-theoretic perspectives, ranging from the asymptotic equipartition property, through the rationale behind the methods of entropy maximization, divergence minimization and information-gain minimization, to Stein's lemma.
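
    A minimal sketch of the effect of query forgery (function and variable names are assumptions): when rho forged queries drawn from the population's distribution accompany each genuine query, the apparent query distribution moves toward the population's and the divergence drops.

    import math

    def kl(p, q):
        # KL divergence D(p || q) in bits
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    def apparent_queries(user_dist, population_dist, rho):
        # apparent distribution when rho forged queries (drawn from the
        # population's distribution) accompany each genuine query
        return [(u + rho * q) / (1 + rho) for u, q in zip(user_dist, population_dist)]

    user = [0.7, 0.2, 0.1]
    population = [0.4, 0.4, 0.2]
    for rho in (0.0, 0.5, 1.0):
        print(rho, round(kl(apparent_queries(user, population, rho), population), 4))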

  • Mecanismo para evitar ataques por confabulación basados en code passing

     Jaimez, Marc; Esparza Martin, Oscar; Hernández Gañan, Carlos; Parra Arnau, Javier
    Jornadas de Ingeniería Telemática
    Presentation's date: 2009-09-20
    Presentation of work at congresses

  • GRUP SEGURETAT DE LA INFORMACIÓ (ISG)

     Pallares Segarra, Esteve; Fernandez Muñoz, Marcel; León Abarca, Olga; Hernández Serrano, Juan Bautista; Forné, Jordi; Pegueroles Valles, Josep Rafel; Esparza Martin, Oscar; Muñoz Tapia, Jose Luis; Parra Arnau, Javier; Soriano Ibáñez, Miguel
    Participation in a competitive project

  • PKIX Certificate Status in Hybrid MANETs

     Muñoz Tapia, Jose Luis; Esparza Martin, Oscar; Hernández Gañan, Carlos; Parra Arnau, Javier
    Workshop in Information Security Theory and Practice
    Presentation of work at congresses

  • ITACA. Inquiry-based, with a Trustness model support, semantic Annotations and Communities formation Assistance

     Aguilar Igartua, Mónica; Forné, Jordi; Pallares Segarra, Esteve; Muñoz Tapia, Jose Luis; Hinarejos Campos, M. Francisca; Parra Arnau, Javier
    Participation in a competitive project
