Knowledge-based systems

Vol. 174, p. 144-159

DOI: 10.1016/j.knosys.2019.03.005

Date of publication: 2019-06-15

Abstract:

We introduce collective decision-making models associated with influence spread under the linear threshold model in social networks. We define the oblivious and the non-oblivious influence models. We also introduce the generalized opinion leader–follower model (gOLF) as an extension of the opinion leader–follower model (OLF) proposed by van den Brink et al. (2011). In our model we allow rules for the final decision different from the simple majority used in OLF. We show that gOLF models are non-oblivious influence models on a two-layered bipartite influence digraph. Together with OLF models, the satisfaction and the power measures were introduced and studied. We analyze the computational complexity of those measures for the decision models introduced in the paper. We show that the problem of computing the satisfaction or the power measure is #P-hard in all the introduced models, even when the underlying social network is a bipartite graph. Complementing this result, we provide two subfamilies of decision models in which both measures can be computed in polynomial time. We show that the collective decision functions are monotone and therefore define an associated simple game. We relate the satisfaction and the power measures to the Rae index and the Banzhaf value of the associated simple game. This allows known approximation methods for the Banzhaf value or the Rae index to be used for their practical computation.
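The linear threshold dynamics underlying these decision models can be sketched as follows; the graph encoding and function names below are illustrative, not the paper's notation.

```python
def lt_spread(in_edges, thresholds, seed):
    """Linear threshold spread: starting from the seed set, a node
    activates once the total weight of its active in-neighbours
    reaches its threshold; repeat until nothing changes."""
    active = set(seed)
    changed = True
    while changed:
        changed = False
        for v in thresholds:
            if v in active:
                continue
            pressure = sum(w for u, w in in_edges.get(v, ()) if u in active)
            if pressure >= thresholds[v]:
                active.add(v)
                changed = True
    return active

# Toy two-layer digraph: a influences b, b influences c.
in_edges = {"b": [("a", 1.0)], "c": [("b", 1.0)]}
thresholds = {"a": 1.0, "b": 1.0, "c": 1.0}
```

Seeding with {"a"} activates the whole chain, whereas seeding with {"c"} activates nothing further.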

Date: 2019-01-01

Abstract:

We call a domain any arbitrary subset of a Cartesian power of the set {0,1} when we think of it as reflecting abstract rationality restrictions on vectors of two-valued judgments on a number of issues. In Computational Social Choice Theory, and in particular in the theory of judgment aggregation, a domain is called a possibility domain if it admits a non-dictatorial aggregator, i.e. if for some k there exists a unanimous (idempotent) function F : D^k → D which is not a projection function. We prove that a domain is a possibility domain if and only if there is a propositional formula of a certain syntactic form whose set of satisfying truth assignments, or models, comprises the domain. A formula whose set of satisfying truth assignments is equal to a given arbitrary domain D is sometimes referred to as an integrity constraint for D. So we call possibility integrity constraints the formulas of the specific syntactic type we define. Given a possibility domain D, we show how to construct a possibility integrity constraint for D efficiently, i.e., in polynomial time in the size of the domain. We also show how to recognize formulas that are possibility integrity constraints in linear time in the size of the input formula. Our result falls in the realm of classical results that give syntactic characterizations of logical relations with certain closure properties, e.g. the result that logical relations closed component-wise under logical AND are precisely the models of Horn formulas. However, our techniques draw from results in judgment aggregation theory as well as from results about propositional formulas and logical relations.
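The classical Horn characterization cited above can be checked directly: a 0/1 domain is the model set of some Horn formula exactly when it is closed under component-wise AND. A minimal sketch (the function name is ours):

```python
from itertools import product

def closed_under_and(domain):
    """True iff the set of 0/1 vectors is closed under component-wise AND,
    i.e. (by the classical characterization) iff it is the model set of
    some Horn formula."""
    models = {tuple(x) for x in domain}
    return all(
        tuple(a & b for a, b in zip(x, y)) in models
        for x, y in product(models, repeat=2)
    )
```

For instance, {(0,0), (1,0), (1,1)} is Horn-definable, while {(1,0), (0,1)} is not, since (1,0) AND (0,1) = (0,0) lies outside the domain.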

International Symposium on Algorithms and Computation

p. 1-13

DOI: 10.4230/LIPIcs.ISAAC.2018.20

Presentation's date: 2018-12-18

Abstract:

We study the concept of compactor, which may be seen as a counting analogue of kernelization in counting parameterized complexity. For a function F : Σ* → N and a parameterization κ : Σ* → N, a compactor (P, M) consists of a polynomial-time computable function P, called the condenser, and a computable function M, called the extractor, such that F = M ∘ P, and the condensing P(x) of x has length at most s(κ(x)), for any input x ∈ Σ*. If s is a polynomial function, then the compactor is said to be of polynomial size. Although the study of counting analogues of kernelization is not unprecedented, it has received little attention so far. We study a family of vertex-certified counting problems on graphs that are MSOL-expressible; that is, for an MSOL formula φ with one free set variable to be interpreted as a vertex subset, we want to count all A ⊆ V(G) where |A| = k and (G, A) ⊨ φ. In this paper, we prove that every vertex-certified counting problem on graphs that is MSOL-expressible and treewidth modulable, when parameterized by k, admits a polynomial-size compactor on H-topological-minor-free graphs with condensing time O(k²n²) and decoding time 2^O(k). This implies the existence of an FPT algorithm of running time O(n²k²) + 2^O(k). All aforementioned complexities are under the Uniform Cost Measure (UCM) model, where numbers can be stored in constant space and arithmetic operations can be done in constant time.

Date: 2018-09-26

Abstract:

Suppose that there is a family of n random points X_v for v ∈ V, independently and uniformly distributed in the square [-√n/2, √n/2]². We do not see these points, but learn about them in one of the following two ways. Suppose first that we are given the corresponding random geometric graph G, where distinct vertices u and v are adjacent when the Euclidean distance d_E(X_u, X_v) is at most r. Assume that the threshold distance r satisfies n^(3/14) ≪ r ≪ n^(1/2). We shall see that the following holds with high probability. Given the graph G (without any geometric information), in polynomial time we can approximately reconstruct the hidden embedding, in the sense that, 'up to symmetries', for each vertex v we find a point within distance about r of X_v; that is, we find an embedding with 'displacement' at most about r. Now suppose that, instead of being given the graph G, we are given, for each vertex v, the ordering of the other vertices by increasing Euclidean distance from v. Then, with high probability, in polynomial time we can find an embedding with the much smaller displacement error O(√(log n)).
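The random geometric graph model described above can be sampled directly. A minimal sketch (for readability the square is [0, √n]² rather than the centred version, which is immaterial for the resulting graph):

```python
import math
import random

def random_geometric_graph(n, r, seed=0):
    """Drop n points uniformly in a square of area n and join every pair
    at Euclidean distance at most r."""
    rng = random.Random(seed)
    side = math.sqrt(n)
    pts = {v: (rng.uniform(0, side), rng.uniform(0, side)) for v in range(n)}
    edges = {
        (u, v)
        for u in range(n)
        for v in range(u + 1, n)
        if math.dist(pts[u], pts[v]) <= r
    }
    return pts, edges
```

The reconstruction question is then: given only `edges` (or only the distance orderings), how closely can `pts` be recovered up to symmetries of the square?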

AlgoUK workshop

p. 4

Presentation's date: 2018-09-20

Materials Discovery Research Symposium

p. 12

Presentation's date: 2018-09-04

International Conference on Higher Education Advances

p. 425-432

DOI: 10.4995/HEAd18.2018.8007

Presentation's date: 2018-07-20

Abstract:

University education is facing strategic changes that will lead to deep structural changes. Course organization is evolving, and the organizational decisions have an economic impact. We propose a method to measure the present value of a pedagogical asset under a return rate. We apply the method to three courses in the Computer Science curricula taught at the Facultat d'Informàtica de Barcelona of the Universitat Politècnica de Catalunya, Barcelona Tech: a large, compulsory, first-year course (PRO1), a medium-size undergraduate course (ALG) and a small specialized master course (AGT). Our results highlight that the present value increases with the size of the course and decreases with the level of computer support involved in its teaching.
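As a toy illustration of discounting a stream of returns at a fixed rate (the paper's valuation of pedagogical assets is more elaborate than this textbook formula):

```python
def present_value(cash_flows, rate):
    """Present value of yearly cash flows discounted at a constant rate,
    with the first flow arriving one period from now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))
```

For example, a three-year stream paying 10, 10 and 110 discounted at 10% is worth exactly 100 today.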

Date: 2018-07-18

Abstract:

The maximal points in a set S are those that are not dominated by any other point in S. Such points arise in multiple application settings, in which they are called by a variety of different names, e.g., maxima, Pareto optima, skylines. Because of their ubiquity, there is a large literature on the expected number of maxima in a set S of n points chosen IID from some distribution. Most such results assume that the underlying distribution is uniform over some spatial region and strongly use this uniformity in their analysis. This work was initially motivated by the question of how this expected number changes if the input distribution is perturbed by random noise. More specifically, let Ball_p denote the uniform distribution over the 2-d unit L_p ball, δBall_q denote the uniform distribution over the 2-d L_q ball of radius δ, and Ball_{p,q} be the convolution of the two distributions, i.e., a point v drawn from Ball_p is reported with an error drawn from δBall_q. The question is how the expected number of maxima changes as a function of δ. Although the original motivation is for small δ, the problem is well defined for any δ and our analysis treats the general case. More specifically, we study, as a function of n and δ, the expected number of maximal points when the n points in S are chosen IID from distributions of the type Ball_{p,q} where p, q ∈ {1, 2, ∞} for δ > 0, and also of the type Ball_{p,∞-q}, where q ∈ [1, ∞) for δ > 0.
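For intuition, the 2-d maximal points themselves can be computed by a standard sort-and-sweep; this sketch is ours and is independent of the probabilistic analysis in the paper.

```python
def maximal_points(points):
    """2-d maximal (Pareto/skyline) points: those not dominated in both
    coordinates by another point. Sweep by decreasing x, keeping points
    whose y beats every y seen so far."""
    best_y = float("-inf")
    maxima = []
    for x, y in sorted(points, key=lambda p: (-p[0], -p[1])):
        if y > best_y:
            maxima.append((x, y))
            best_y = y
    return maxima
```

On {(1,3), (2,2), (3,1), (1,1)} the sweep keeps the staircase (3,1), (2,2), (1,3) and discards the dominated point (1,1).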

Workshop on pensions and insurance

p. 3

Presentation's date: 2018-07-13

European Conference on Operational Research

p. 110

Presentation's date: 2018-07-02

Abstract:

An influence game is a cooperative simple game in which a coalition of players wins if it is able to convince enough agents to participate in the task (to vote in favor of a decision). Here the linear threshold model is taken as the influence model. It is known that influence games capture the class of simple games, and influence games (as simple games) are monotonic. We present some new classes of influence games under connectivity restrictions on winning coalitions. These classes of influence games are not necessarily monotonic. We characterize the computational complexity of problems on influence games over these new classes, including measures (length and width), values (Shapley-Shubik and Banzhaf) and properties (of teams and players). We also analyze those problems for some particular extremal cases with respect to the propagation of influence.

IEEE transactions on learning technologies

Vol. 11, num. 3, p. 321-333

DOI: 10.1109/TLT.2017.2723389

Date of publication: 2018-07

Abstract:

Jutge.org is an open educational online programming judge designed for students and instructors, featuring a repository of problems that is well organized by courses, topics and difficulty. Internally, Jutge.org uses a secure and efficient architecture and integrates modern verification techniques, formal methods, static code analysis and data mining. Jutge.org has been used extensively over the last decade at the Universitat Politecnica de Catalunya to strengthen the learn-by-doing approach in several courses. This paper presents the main characteristics of Jutge.org and shows its use and impact in a wide range of courses covering basic programming, data structures, algorithms, artificial intelligence, functional programming and circuit design.

Electronic notes in discrete mathematics

Vol. 68, p. 197-202

DOI: 10.1016/j.endm.2018.06.034

Date of publication: 2018-07

Abstract:

We consider decision models associated with cooperative influence games: the oblivious and the non-oblivious influence models. In those models the satisfaction and the power measures were introduced and studied. We analyze the computational complexity of those measures when the influence level is set to unanimity and the decision rule is simple majority. We show that computing the satisfaction and the power measures in those systems is #P-hard.

Jornadas sobre la Enseñanza Universitaria de la Informática

p. 247-254

Presentation's date: 2018-07

Abstract:

Since Autumn Term 2017, the Department of Computer Science of the Universitat Politecnica de Catalunya (UPC-BarcelonaTech) has been in charge of teaching "Fundamentals of Programming" at the EEBE School on the new Diagonal-Besòs Campus. This new endeavour faced two particular challenges. First, due to organizing constraints, the course had to be organized at the same time it was first being taught. Second, all the numbers involved were large: in Autumn Term 2017, 686 students enrolled, with a teaching staff of 18 instructors, and 108 laboratory tests were prepared. To deal with these challenges, we agreed to coordinate ourselves in a particular way that we name ordocoordination: a flexible and quick form of coordination in which teachers generate and agree on a minimum set of rules. It is a bottom-up procedure that requires quick decision-making; as a consequence, the number of emails exchanged was large (in Autumn Term 2017, 350 × 18 = 6300 emails). We believe that this approach deserves to be reported, and that it can be applied to other courses.

International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems

p. 226-238

DOI: 10.1007/978-3-319-91476-3_19

Presentation's date: 2018-06-13

Abstract:

We propose a model for the behaviour of Web apps in the unreliable WWW. Web apps are described by orchestrations. An orchestration mimics the personal use of the Web by defining the way in which Web services are invoked. The WWW is unreliable, as poorly maintained Web sites are prone to fail. We model this source of unreliability through a probabilistic approach: we assume that each site has a probability of failure. Another source of uncertainty is traffic congestion, which can be observed as non-deterministic behaviour induced by the variability in response times. We model non-determinism by imprecise probabilities. We develop an ex-ante semantics to characterize the behaviour of finite orchestrations in the unreliable Web, and show the existence of a normal form under this semantics for orchestrations using asymmetric parallelism.

International Colloquium on Automata, Languages, and Programming

p. 1-14

DOI: 10.4230/LIPIcs.ICALP.2018.116

Presentation's date: 2018-06-09

Abstract:

We show that for no surface except the plane does monadic second-order logic (MSO) have a zero-one law, and not even a convergence law, on the class of (connected) graphs embeddable on the surface. In addition, we show that every rational in [0,1] is the limiting probability of some MSO formula. This strongly refutes a conjecture by Heinig et al. (2014), who proved a convergence law for planar graphs and a zero-one law for connected planar graphs, and also identified the so-called gaps of [0,1]: the subintervals that are not limiting probabilities of any MSO formula. The proof relies on a combination of methods from structural graph theory, especially large face-width embeddings of graphs on surfaces, analytic combinatorics, and finite model theory, and several parts of the proof may be of independent interest. In particular, we identify precisely the properties that make the zero-one law work on planar graphs but fail for every other surface.

Balkan Conference on Operational Research

p. B6

Presentation's date: 2018-05-27

Abstract:

Given n players, a simple game is a 0/1-valued TU cooperative game satisfying unanimity and monotonicity. In such a game, a coalition, i.e., a subset of players, always wins (winning coalition) or loses (losing coalition). It is known that each simple game can be expressed as the intersection (or the union) of weighted voting games. A simple game is called a weighted voting (majority) game if there is a quota q and an assignment of a non-negative integer weight to each player such that a coalition is winning if and only if the sum of the weights of its players is greater than or equal to the quota q. Two important and well-studied concepts relating simple games to weighted games are the dimension and the codimension. The dimension is the minimum number of weighted games whose intersection generates the considered simple game; likewise, the codimension is the minimum number of weighted games whose union generates it. There are some previous studies of the dimension and the codimension of simple games. Nevertheless, no complete classification of the dimension or codimension of simple games is known, even for small numbers of players. In this work we initiate such a systematic classification with respect to the dimension and codimension parameters. We introduce the concept of the multidimension of a simple game: the minimum number of intersections and unions of weighted games needed to generate the considered simple game. A similar concept was introduced in another topic (Boolean functions) by Goldberg. We also classify some simple games with respect to their multidimension. Moreover, we present some particular results on the dimension, the codimension and the multidimension for specific simple games depending on properties of the (minimal) winning coalitions. Finally, we study possible relations between k-trade robustness and k-invariant-trade robustness (concepts that characterize the subclass of simple games called complete simple games) and the dimension, the codimension and the multidimension.
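The dimension concept can be illustrated with a toy construction (names and encoding are ours): the 4-player game that wins iff the coalition meets both {0, 1} and {2, 3} is not itself weighted, but it is the intersection of two weighted games, so its dimension is 2.

```python
def weighted_game(weights, quota):
    """Winning predicate of the weighted voting game [quota; weights]."""
    return lambda coalition: sum(weights[i] for i in coalition) >= quota

def intersection(*games):
    """A coalition wins the intersection iff it wins every component game."""
    return lambda coalition: all(g(coalition) for g in games)

# Win iff the coalition meets both {0, 1} and {2, 3}:
g = intersection(weighted_game([1, 1, 0, 0], 1), weighted_game([0, 0, 1, 1], 1))
```

The trade {0, 2}, {1, 3} → {0, 1}, {2, 3} turns two winning coalitions into two losing ones, which witnesses that this game admits no single weighted representation.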

Risk 2018- 7th Workshop on Risk Management and Insurance

p. 135-142

Presentation's date: 2018-04-27

Abstract:

In this post-recession time it is important to measure the possibilities a society offers in relation to investments. To do so, we consider an investment schema I = (R; R_1, ..., R_n), where R is a lower bound on the desired return and the R_i's are the returns of the assets (to invest in). We introduce "the power to invest", denoted by Power(I), a measure of the capability of the schema to fulfil the requirement R. The power to invest is inspired by Coleman's power of a collectivity to act. We consider the angel-daemon approach to uncertainty and extend it to investment schemas. The approach tunes cases in between the worst and the best scenarios and analyzes them through game theory. We show how to use the power to invest to assess uncertainty in such situations and develop several examples.

Knowledge-based systems

Vol. 140, p. 92-102

DOI: 10.1016/j.knosys.2017.10.029

Date of publication: 2018-01-15

Abstract:

Centrality and influence spread are two of the most studied concepts in social network analysis. In recent years, centrality measures have attracted the attention of many researchers, generating a large and varied number of new studies about social network analysis and its applications. However, as far as we know, traditional models of influence spread have not yet been exhaustively used to define centrality measures according to the influence criteria. Most of the existing work on this topic is based on the independent cascade model. In this paper we explore the possibilities of the linear threshold model for the definition of centrality measures to be used on weighted and labeled social networks. We propose a new centrality measure to rank the users of the network, the Linear Threshold Rank (LTR), and a centralization measure to determine to what extent the entire network has a centralized structure, the Linear Threshold Centralization (LTC). We appraise the viability of the approach through several case studies. We consider four different social networks to compare our new measures with two centrality measures based on relevance criteria and another centrality measure based on the independent cascade model. Our results show that our measures are useful for ranking actors and networks in a distinguishable way.

Abstract:

The Internet is producing important changes, not only in society but also in the computing discipline. For example, the web has made possible new forms of computing such as cloud computing. On the other hand, the web also allows us to access large amounts of data that can be used for very diverse purposes. In parallel, the progressive increase in the power of computers has made it possible to deal with problems that previously could not be addressed. A consequence of all this is that many traditional methods of searching for or obtaining information, or of designing, modeling, analyzing or verifying algorithms or systems, have become partially obsolete. This project aims to address the study of some problems in this new scenario. In particular, the project will focus on the study of large-scale computer systems, using graphs as the unifying element of this study. More specifically, the project is structured on two levels. At the most fundamental level, we propose the study of several issues that will serve as a basis for the design and analysis of large-scale information processing systems in the network. In particular, we will study the modeling of the network and how information is disseminated in it, some logical methods to reason about these systems, and some algorithmic methods that can help to design them. At the second level, we will study some specific problems related to the type of systems we want to deal with: in particular, the search for information in the network using the paradigm of graph databases, the processing of large volumes of data, and the design and construction of these systems.

ICT Express

Vol. 3, num. 4, p. 155-159

DOI: 10.1016/j.icte.2017.11.014

Date of publication: 2017-12-01

Abstract:

In this study, we describe a new coordination mechanism for non-atomic congestion games that leads to a (selfish) social cost which is arbitrarily close to the non-selfish optimal. This mechanism incurs no additional cost, in contrast to tolls that typically differ from the social cost as expressed in terms of delays.

Journal of combinatorial optimization

Vol. 34, num. 4, p. 1265-1301

DOI: 10.1007/s10878-017-0146-9

Date of publication: 2017-11-01

Abstract:

We consider Web services defined by orchestrations in the Orc language and two natural quality-of-service measures: the number of outputs and a discrete version of the first response time. We first analyse those subfamilies of finite orchestrations in which the measures are well defined and consider their evaluation in both reliable and probabilistic unreliable environments. On those subfamilies in which the QoS measures are well defined, we consider a set of natural related problems and analyse their computational complexity. In general, our results give a clear picture of the difficulty of computing the proposed QoS measures with respect to the expressiveness of the subfamilies of Orc. Only in a few cases are the problems solvable in polynomial time, which points out the computational difficulty of evaluating QoS measures even in simplified models.

European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty

p. 318-328

DOI: 10.1007/978-3-319-61581-3_29

Presentation's date: 2017-07-03

Abstract:

We propose the use of the angel-daemon framework to assess Coleman's power of a collectivity to act under uncertainty in weighted voting games. In this framework, uncertainty profiles describe the potential changes in the weights of a weighted game and fix the spread of the weights' change. For each uncertainty profile a strategic angel-daemon game can be considered. This game has two selfish players, the angel and the daemon: the angel selects its action so as to maximize the effect on the measure under consideration, while the daemon acts oppositely. Together, the angel and the daemon give a balance between the best and the worst case. The angel-daemon games associated to Coleman's power are zero-sum games, and therefore the expected utilities of all Nash equilibria coincide. In this way we can assess Coleman's power under uncertainty. Besides introducing the framework for this particular setting, we analyse basic properties and make some computational complexity considerations. We provide several examples based on the evolution of the voting rules of the EU Council of Ministers.
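Coleman's power of a collectivity to act, the quantity being assessed under uncertainty here, is the proportion of winning coalitions. A brute-force sketch for a weighted voting game (exponential in n, for illustration only; names are ours):

```python
from itertools import combinations

def coleman_power_to_act(weights, quota):
    """Coleman's power of a collectivity to act: the fraction of the 2^n
    coalitions that are winning in the weighted voting game [quota; weights]."""
    n = len(weights)
    winning = sum(
        1
        for k in range(n + 1)
        for coalition in combinations(range(n), k)
        if sum(weights[i] for i in coalition) >= quota
    )
    return winning / 2 ** n
```

For the symmetric majority game [2; 1, 1, 1], four of the eight coalitions win, so the power to act is 0.5; an angel or daemon perturbing the weights shifts this fraction up or down.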

Information and computation

Vol. 256, p. 83-92

DOI: 10.1016/j.icte.2017.11.014

Date of publication: 2017-05-18

Abstract:

In this paper we prove that the Min-Bisection problem is NP-hard on unit disk graphs, thus solving a longstanding open question.

Journal of computer and system sciences

Vol. 83, num. 1, p. 132-158

DOI: 10.1016/j.jcss.2016.06.006

Date of publication: 2017-04-03

Abstract:

The metric dimension of a graph G is the size of a smallest subset L ⊆ V(G) such that for any x, y ∈ V(G) with x ≠ y there is a z ∈ L such that the graph distance between x and z differs from the graph distance between y and z. Even though this notion has been part of the literature for almost 40 years, prior to our work the computational complexity of determining the metric dimension of a graph was still very unclear. In this paper, we show tight complexity boundaries for the Metric Dimension problem. We achieve this by giving two complementary results. First, we show that the Metric Dimension problem on planar graphs of maximum degree 6 is NP-complete. Then, we give a polynomial-time algorithm for determining the metric dimension of outerplanar graphs.
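For small graphs the metric dimension can be computed by brute force over candidate landmark sets; the NP-completeness result above says this exponential behaviour is unavoidable in general. A sketch for connected graphs (names are ours):

```python
from itertools import combinations

def bfs_distances(adj, src):
    """Graph distances from src via breadth-first search."""
    dist = {src: 0}
    frontier = [src]
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    nxt.append(v)
        frontier = nxt
    return dist

def metric_dimension(adj):
    """Smallest k such that some set of k landmarks gives every vertex a
    distinct vector of distances (assumes a connected graph)."""
    vertices = list(adj)
    dist = {v: bfs_distances(adj, v) for v in vertices}
    for k in range(1, len(vertices) + 1):
        for landmarks in combinations(vertices, k):
            codes = {tuple(dist[z][v] for z in landmarks) for v in vertices}
            if len(codes) == len(vertices):
                return k
    return 0
```

A path needs a single landmark (an endpoint resolves everything), while a 4-cycle needs two, since any single landmark leaves two vertices at equal distance.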

ReVisión

Vol. 10, num. 1, p. 55-70

Date of publication: 2017

Abstract:

In this paper we propose a method to quantitatively analyze the effectiveness of the different measures taken to improve a course's continuous-assessment process. Our proposal uses simple tools from economics and the social sciences. First, we analyze the effect of these measures on the different categories of grades obtained by the students in relation to the increase in the faculty's workload, by means of a marginal cost-benefit approach. Second, we analyze inequality using different types of data disaggregation over several subpopulations related to the students' grades. Finally, we study the evolution of statistical indicators such as the mean and satisfaction. The method is applied to the study of the CS1 course (Programación-1) at the Facultat d'Informàtica de Barcelona of the Universitat Politècnica de Catalunya during the first 5 years after the introduction of the new degree in Computer Engineering. The proposed methodology aims to introduce new techniques that allow an objective analysis of the impact of educational measures in any university course.

Lecture notes in computer science

Vol. 10369, p. 318-328

DOI: 10.1007/978-3-319-61581-3_29

Date of publication: 2017

Abstract:

We propose the use of the angel-daemon framework to assess Coleman's power of a collectivity to act under uncertainty in weighted voting games. In this framework, uncertainty profiles describe the potential changes in the weights of a weighted game and fix the spread of those changes. For each uncertainty profile a strategic angel-daemon game can be considered. This game has two selfish players: the angel selects its action so as to maximize the effect on the measure under consideration, while the daemon acts oppositely. Together, the angel and the daemon provide a balance between the best and the worst cases. The angel-daemon games associated with Coleman's power are zero-sum games, and therefore the expected utilities of all their Nash equilibria coincide. In this way we can assess Coleman's power under uncertainty. Besides introducing the framework for this particular setting, we analyse basic properties and make some computational complexity considerations. We provide several examples based on the evolution of the voting rules of the EU Council of Ministers.


Theory and practice of logic programming

Vol. 17, num. 1, p. 1-48

DOI: 10.1017/S1471068416000016

Date of publication: 2017-01-01

Abstract:

Machine clients are increasingly making use of the Web to perform tasks. While Web services traditionally mimic remote procedure call interfaces, a new generation of so-called hypermedia APIs works through hyperlinks and forms, in a way similar to how people browse the Web. This means that existing composition techniques, which determine a procedural plan upfront, are not sufficient to consume hypermedia APIs, which need to be navigated at runtime. Clients instead need a more dynamic plan that allows them to follow hyperlinks and use forms with a preset goal. Therefore, in this paper, we show how compositions of hypermedia APIs can be created by generic Semantic Web reasoners. This is achieved through the generation of a proof based on semantic descriptions of the APIs' functionality. To pragmatically verify the applicability of compositions, we introduce the notion of pre-execution and post-execution proofs. The runtime interaction between a client and a server is guided by proofs but driven by hypermedia, allowing the client to react to the application's actual state indicated by the server's response. We describe how to generate compositions from descriptions, discuss a computer-assisted process to generate descriptions, and verify reasoner performance on various composition tasks using a benchmark suite. The experimental results lead to the conclusion that proof-based consumption of hypermedia APIs is a feasible strategy at Web scale.

International journal of data analysis techniques and strategies

Vol. 9, num. 4, p. 314-330

DOI: 10.1504/IJDATS.2017.088359

Date of publication: 2017

Abstract:

We propose the use of an angel-daemon framework to perform an uncertainty analysis of short-term macroeconomic models. The angel-daemon framework defines a strategic game where two agents, the angel and the daemon, act selfishly. These games are defined over an uncertainty profile which presents a short and macroscopic description of a perturbed situation. The Nash equilibria on these games provide stable strategies in perturbed situations, giving a natural estimation of uncertainty. We apply the framework to the uncertainty analysis of linear versions of the IS-LM and the IS-MP models.
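
Because the angel-daemon games are zero-sum, every Nash equilibrium yields the same expected utility, so the game value itself serves as the uncertainty estimate. For a 2×2 game this value has a textbook closed form; the sketch below is illustrative, and the payoff matrices are hypothetical rather than taken from the paper:

```python
def zero_sum_value_2x2(A):
    """Value of a 2x2 zero-sum game (row player maximizes A, column player minimizes)."""
    (a, b), (c, d) = A
    # Pure saddle point exists when the maximin equals the minimax.
    lower = max(min(a, b), min(c, d))
    upper = min(max(a, c), max(b, d))
    if lower == upper:
        return lower
    # Otherwise both players mix; standard closed-form value.
    return (a * d - b * c) / (a + d - b - c)

# Matching pennies: no pure equilibrium, value 0 under mixing.
print(zero_sum_value_2x2([[1, -1], [-1, 1]]))  # 0.0
# A game with a pure saddle point at the top-left entry.
print(zero_sum_value_2x2([[2, 3], [1, 0]]))    # 2
```

In the angel-daemon setting, the rows would correspond to the angel's choices and the columns to the daemon's; larger games need linear programming, but the equilibrium value remains unique.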

Electronic notes in discrete mathematics

Vol. 55, p. 147-150

DOI: 10.1016/j.endm.2016.10.037

Date of publication: 2016-11-01

Abstract:

This paper studies the complexity of computing a representation of a simple game as the intersection (union) of weighted majority games, as well as the dimension or the codimension. We also present some examples with linear dimension and exponential codimension with respect to the number of players.

Theoretical computer science

Vol. 648, p. 56-71

DOI: 10.1016/j.tcs.2016.08.005

Date of publication: 2016-10-04

Abstract:

We introduce Celebrity games, a new model of network creation games. In this model players have weights (W being the sum of all the players' weights) and there is a critical distance β as well as a link cost α. The cost incurred by a player depends on the cost of establishing links to other players and on the sum of the weights of those players that remain farther than the critical distance. Intuitively, the aim of any player is to be relatively close (at a distance less than β) to the rest of the players, mainly those having high weights. The main features of celebrity games are the following: computing the best response of a player is NP-hard if β > 1 and polynomial-time solvable otherwise; they always have a pure Nash equilibrium; the family of celebrity games having a connected Nash equilibrium is characterized (the so-called star celebrity games) and bounds on the diameter of the resulting equilibrium graphs are given; a special case of star celebrity games shares its set of Nash equilibrium profiles with the MaxBD games with uniform bounded distance β introduced in Bilò et al. [6]. Moreover, we analyze the Price of Anarchy (PoA) and the Price of Stability (PoS) of celebrity games and give several bounds: for non-star celebrity games PoA = PoS = max{1, W/α}; for star celebrity games PoS = 1 and PoA = O(min{n/β, W/α}), but if the Nash equilibrium is a tree then the PoA is O(1); finally, when β = 1 the PoA is at most 2. The upper bounds on the PoA are complemented with some lower bounds for β = 2.
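
A player's cost in celebrity games combines the price of the links it buys with the weights of the players left farther than the critical distance. The following sketch computes that cost under illustrative assumptions (adjacency-list graph representation and hypothetical parameter values, not the paper's notation):

```python
from collections import deque

def player_cost(adj, bought, weights, player, alpha, beta):
    """Cost of `player`: alpha per link it bought, plus the weight of
    every other player at graph distance greater than beta."""
    # BFS distances from `player` in the undirected graph `adj`.
    dist = {player: 0}
    queue = deque([player])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    far_weight = sum(w for p, w in weights.items()
                     if p != player and dist.get(p, float("inf")) > beta)
    return alpha * len(bought[player]) + far_weight

# Path a-b-c with beta = 1: player a pays for its link to b and for c's weight.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
bought = {"a": ["b"], "b": ["c"], "c": []}
weights = {"a": 1, "b": 2, "c": 5}
print(player_cost(adj, bought, weights, "a", alpha=3, beta=1))  # 3 + 5 = 8
```

The NP-hardness of the best response for β > 1 comes from choosing which links to buy, not from evaluating this cost, which is polynomial.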

Electronic notes in discrete mathematics

Vol. 54, p. 205-210

DOI: 10.1016/j.endm.2016.09.036

Date of publication: 2016-10

Abstract:

We analyze the computational complexity of the power measure in models of collective decision: the generalized opinion leader-follower model and the oblivious and non-oblivious influence models. We show that computing the power measure is #P-hard in all these models, and provide two subfamilies in which the power measure can be computed in polynomial time.

Theory of computing systems

Vol. 59, num. 3, p. 397-415

DOI: 10.1007/s00224-015-9640-6

Date of publication: 2016-10-01

Abstract:

We study a network formation game where players wish to send traffic to other players. Players can be seen as nodes of an undirected graph whose edges are defined by contracts between the corresponding players. Each player can contract bilaterally with others to form bidirectional links, or break contracts unilaterally to eliminate the corresponding links. Our model is an extension of the traffic routing model considered in Arcaute, E., Johari, R., Mannor, S., (IEEE Trans. Automat. Contr. 54(8), 1765–1778 2009), in which we do not require the traffic to be uniform and all-to-all. Player i specifies the amount of traffic tij ≥ 0 that it wants to send to player j. Our notion of stability is network pairwise Nash stability, under which no node wishes to deviate unilaterally and no pair of nodes can obtain benefit from deviating bilaterally. We give a characterization of the topologies that are pairwise Nash stable for a given traffic matrix. We prove that the best response problem is NP-hard and devise a myopic dynamics so that the deviation of the active node can be computed in polynomial time. We show the convergence of the dynamics to pairwise Nash configurations when the contracting functions are anti-symmetric and affine, and that the expected convergence time is polynomial in the number of nodes when the node activation process is uniform.

Date: 2016-09-19

Abstract:

We propose the use of an angel-daemon framework to perform an uncertainty analysis of short-term macroeconomic models with exogenous components. An uncertainty profile is a short and macroscopic description of a potentially perturbed situation. The angel-daemon framework uses it to define a strategic game where two agents, the angel and the daemon, act selfishly, having different goals. The Nash equilibria of those games provide the stable strategies in perturbed situations, giving a natural estimation of uncertainty.

Advances in applied probability

Vol. 48, num. 3, p. 848-864

DOI: 10.1017/apr.2016.31

Date of publication: 2016-09-01

Abstract:

Given any two vertices u, v of a random geometric graph G(n, r), denote by dE(u, v) their Euclidean distance and by dG(u, v) their graph distance. The problem of finding upper bounds on dG(u, v) conditional on dE(u, v) that hold asymptotically almost surely has received quite a bit of attention in the literature. In this paper we improve the known upper bounds for values of r = ω(√log n) (that is, for r above the connectivity threshold). Our result also improves the best known estimates on the diameter of random geometric graphs. We also provide a lower bound on dG(u, v) conditional on dE(u, v).
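
The bounds discussed here relate the graph distance dG to the Euclidean distance dE; in one direction, dG(u, v) ≥ dE(u, v)/r always holds, since each hop covers Euclidean distance at most r. A small simulation sketch of this relation (the unit-square construction below is the standard model, an assumption about the paper's exact setting):

```python
import math
import random
from collections import deque

def random_geometric_graph(n, r, seed=0):
    """n points uniform in the unit square; edge iff Euclidean distance <= r."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(pts[i], pts[j]) <= r:
                adj[i].append(j)
                adj[j].append(i)
    return pts, adj

def graph_distance(adj, u, v):
    """BFS hop distance from u to v; None if they are disconnected."""
    dist = {u: 0}
    queue = deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return dist[x]
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                queue.append(y)
    return None

# Each hop covers Euclidean distance at most r, so dG >= dE / r.
pts, adj = random_geometric_graph(200, 0.2)
dG, dE = graph_distance(adj, 0, 1), math.dist(pts[0], pts[1])
if dG is not None:
    assert dG >= dE / 0.2 - 1e-9
```

The paper's contribution is in the harder direction: upper bounds on dG in terms of dE that hold asymptotically almost surely.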

Random structures and algorithms

Vol. 49, num. 1, p. 137-159

DOI: 10.1002/rsa.20617

Date of publication: 2016-08-01

Abstract:

The Moran process models the spread of mutations in populations on graphs. We investigate the absorption time of the process, which is the time taken for a mutation introduced at a randomly chosen vertex to either spread to the whole population or to become extinct. It is known that the expected absorption time for an advantageous mutation is O(n^4) on an n-vertex undirected graph, which allows the behaviour of the process on undirected graphs to be analysed using the Markov chain Monte Carlo method. We show that this does not extend to directed graphs, by exhibiting an infinite family of directed graphs for which the expected absorption time is exponential in the number of vertices. However, for regular directed graphs, we show that the expected absorption time is Ω(n log n) and O(n^2). We exhibit families of graphs matching these bounds and give improved bounds for other families of graphs, based on the isoperimetric number. Our results are obtained via stochastic dominations, which we demonstrate by establishing a coupling in a related continuous-time model. The coupling also implies several natural domination results regarding the fixation probability of the original (discrete-time) process, resolving a conjecture of Shakarian, Roos and Johnson.
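
The discrete-time Moran process itself is straightforward to simulate, which is what makes polynomial absorption-time bounds algorithmically useful. A minimal sketch (the graph, fitness value, and representation details are illustrative assumptions, not the paper's experimental setup):

```python
import random

def moran_absorption_time(adj, fitness, start, seed=0):
    """Simulate the discrete-time Moran process on a graph given as an
    adjacency dict. Mutants (fitness `fitness`) start at `start`; residents
    have fitness 1. Returns (steps until absorption, whether the mutation fixed)."""
    rng = random.Random(seed)
    mutant = {start}
    nodes = list(adj)
    steps = 0
    while 0 < len(mutant) < len(nodes):
        # A node reproduces with probability proportional to its fitness...
        weights = [fitness if v in mutant else 1.0 for v in nodes]
        u = rng.choices(nodes, weights=weights)[0]
        # ...and its offspring replaces a uniformly random neighbour.
        v = rng.choice(adj[u])
        if u in mutant:
            mutant.add(v)
        else:
            mutant.discard(v)
        steps += 1
    return steps, len(mutant) == len(nodes)

# Complete graph on 6 vertices, advantageous mutation (fitness 1.5).
K6 = {i: [j for j in range(6) if j != i] for i in range(6)}
steps, fixed = moran_absorption_time(K6, 1.5, 0)
print(steps, fixed)
```

On undirected graphs the O(n^4) expected absorption time makes such Monte Carlo estimation feasible; the paper's exponential-time directed families are precisely the cases where it is not.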

Jornadas de Enseñanza Universitaria de la Informática

p. 161-168

Presentation's date: 2016-07-07

Abstract:


This paper analyzes quantitatively the different measures taken in the continuous assessment of the CS1 course at the Facultat d’Informàtica de Barcelona of the Universitat Politècnica de Catalunya since 2010, when the new degree in Informatics Engineering began. The effect of those measures on the marks obtained by the students is analyzed with respect to the increment in the course load for the faculty members. These two values are compared over time through a classical marginal cost-benefit approach. Using tools commonly accepted in the social sciences, this study is complemented with a preliminary analysis of inequality. Both approaches aim at introducing new techniques for an objective analysis of the impact of teaching measures.

European Conference on Operational Research

p. 113

Presentation's date: 2016-07-04

Theory of computing systems

Vol. 59, num. 1, p. 1-23

DOI: 10.1007/s00224-015-9634-4

Date of publication: 2016-07

Abstract:

The Generalized Second Price (GSP) auction used typically to model sponsored search auctions does not include the notion of budget constraints, which is present in practice. Motivated by this, we introduce the different variants of GSP auctions that take budgets into account in natural ways. We examine their stability by focusing on the existence of Nash equilibria and envy-free assignments. We highlight the differences between these mechanisms and find that only some of them exhibit both notions of stability. This shows the importance of carefully picking the right mechanism to ensure stable outcomes in the presence of budgets.
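
For reference, the basic GSP mechanism without budgets assigns slots by bid order and charges each winner the next-highest bid; the budgeted variants studied in the paper extend this baseline. A minimal sketch of the baseline (per-click prices and the absence of a reserve price are simplifying assumptions):

```python
def gsp(bids, slots):
    """Generalized Second Price: the i-th highest bidder wins slot i and
    pays the (i+1)-th highest bid per click (0 if there is no lower bid).
    `bids` maps bidder -> bid; `slots` is the number of available slots."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    outcome = []
    for i in range(min(slots, len(ranked))):
        bidder, _ = ranked[i]
        price = ranked[i + 1][1] if i + 1 < len(ranked) else 0.0
        outcome.append((bidder, price))
    return outcome

print(gsp({"a": 5, "b": 3, "c": 1}, slots=2))  # [('a', 3), ('b', 1)]
```

A budget constraint would cap each bidder's total payment, which is exactly what breaks the existence of stable outcomes in some of the variants the paper examines.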

European Conference on Operational Research

p. 160-

Presentation's date: 2016-07

Abstract:

We consider two collective decision-making models associated with influence games [2]: the oblivious and the non-oblivious influence decision models. The difference is that in oblivious influence models the initial decision of the actors that are neither leaders nor independent is never taken into account, while in non-oblivious models it is taken into account when the leaders cannot exert enough influence to deviate from it. Following [1], we consider the power measure for an actor in a decision-making model. Power is ascribed to an actor when, by changing its inclination, it can change the collective decision. We show that computing the power measure is #P-hard in both oblivious and non-oblivious influence decision models. We present two subfamilies of (oblivious and non-oblivious) influence decision models in which the power measure can be computed in polynomial time.
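
The power measure described above can always be computed by brute force over all inclination vectors, which takes exponential time, consistent with the #P-hardness result. A sketch (the majority rule below is a hypothetical decision function for illustration, not one of the paper's models):

```python
from itertools import product

def power_measure(decision, n, actor):
    """Fraction of inclination vectors in which flipping `actor`'s
    inclination changes the collective decision. Brute force over all
    2^n vectors, consistent with the #P-hardness of the general problem."""
    pivotal = 0
    for vec in product([0, 1], repeat=n):
        flipped = list(vec)
        flipped[actor] ^= 1
        if decision(vec) != decision(tuple(flipped)):
            pivotal += 1
    return pivotal / 2 ** n

# Hypothetical decision rule: simple majority among 3 actors.
majority = lambda v: int(sum(v) * 2 > len(v))
print(power_measure(majority, 3, 0))  # 0.5: pivotal exactly when the others split
```

The polynomial-time subfamilies in the paper are exactly those whose structure lets this count be obtained without enumerating all 2^n vectors.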

Discrete Mathematics Days

p. 21-

Presentation's date: 2016-07

Abstract:

We analyze the computational complexity of the power measure in models of collective decision: the generalized opinion leader-follower model and the oblivious and non-oblivious influence models. We show that computing the power measure is #P-hard in all these models, and provide two subfamilies in which the power measure can be computed in polynomial time.

Jornadas de Enseñanza Universitaria de la Informática

p. 253-260

Presentation's date: 2016-07

Abstract:

Teaching computer science in degrees that are not computer-science related presents an important challenge: to motivate the students and to achieve good average grades. The students' complaint is always based on their lack of motivation ("What is this subject useful for?"), and this is especially relevant when the subject is not easy to learn. In this paper we present the case of the computer science courses in the Statistics degree taught at the Universitat de Barcelona (UB) and the Universitat Politècnica de Catalunya (UPC), two Catalan universities. We initially tried to reduce the complexity of the contents in order to obtain better average grades. Yet, it did not work out as expected. Therefore, we changed our strategy: instead of making the contents easier (less complex), we changed the tools used to teach and tried to adapt them to the students' interests. In this particular case, we decided to use the R programming language, a language widely used by statisticians, to explain the basics of programming. In this way, our strategy moved from less (simpler contents) to different (more elaborate and nontrivial contents adapted to meet the students' expectations).

Date of publication: 2016-06-07


Date of publication: 2016-06-07


Cologne Twente Workshop

p. 153-156

Presentation's date: 2016-06

Abstract:

This paper studies the complexity of computing a representation of a simple game as the intersection (union) of weighted majority games, as well as the dimension or the codimension. We also present some examples with linear dimension and exponential codimension with respect to the number of players.

Date: 2016-05-14

Abstract:

An opinion leader-follower model (OLF) is a two-action collective decision-making model for societies in which three kinds of actors are considered: "opinion leaders", "followers", and "independent actors". In OLF the initial decision of the followers can be modified by the influence of the leaders. Once the final decisions are set, a collective decision is taken by applying the simple majority rule. We consider a generalization of OLF, the gOLF models, which allow the collective decision to be taken by rules different from the simple majority rule. Inspired by this model, we define two new families of collective decision-making models associated with cooperative influence games: the "oblivious" and the "non-oblivious" influence models. We show that gOLF models are non-oblivious influence models played on a two-layered bipartite influence graph. Together with OLF models, the satisfaction measure was introduced and studied. We analyze the computational complexity of the satisfaction measure for gOLF models and the other collective decision-making models introduced in the paper. We show that computing the satisfaction measure is #P-hard in all the considered models except for the basic OLF, for which the complexity remains open. On the other hand, we provide two subfamilies of decision models in which the satisfaction measure can be computed in polynomial time. Exploiting the relationship with influence games, we can relate the satisfaction measure to the Rae index of an associated simple game. The Rae index is closely related to the Banzhaf value. Thus, our results also extend the families of simple games for which computing the Rae index and the Banzhaf value is computationally hard.
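
One simplified reading of the follower dynamics (not the paper's exact rule, which admits general influence structures and decision rules) is that a follower adopts its leaders' opinion only when they are unanimous, and the collective decision is then taken by a quota rule; a sketch under that assumption:

```python
def gOLF_decision(inclination, leaders_of, quota):
    """Simplified opinion leader-follower dynamics (an illustrative sketch,
    not the paper's exact rule): a follower adopts its leaders' common opinion
    when they are unanimous, otherwise keeps its own inclination. The
    collective decision is 1 iff at least `quota` final opinions are 1."""
    final = {}
    for actor, incl in inclination.items():
        leaders = leaders_of.get(actor, [])
        opinions = {inclination[l] for l in leaders}
        final[actor] = opinions.pop() if len(opinions) == 1 else incl
    return int(sum(final.values()) >= quota)

# Two leaders (l1, l2) and three followers; quota 3 of 5.
incl = {"l1": 1, "l2": 1, "f1": 0, "f2": 0, "f3": 1}
leaders = {"f1": ["l1", "l2"], "f2": ["l1", "l2"], "f3": ["l1", "l2"]}
print(gOLF_decision(incl, leaders, quota=3))  # 1: unanimous leaders sway f1 and f2
```

Taking quota = ⌈(n+1)/2⌉ recovers a simple-majority variant; gOLF's point is precisely that other quotas and rules are allowed.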


Probabilistic Combinatorics

p. 4

Presentation's date: 2016-04-08

Abstract:

Consider the following geometric infection model: individuals are placed at random points according to a Poisson point process in some appropriate metric space. Each individual has two states, infected or healthy. Any infected individual passes the infection to any other individual at distance d according to a Poisson process whose rate is a function f(d) of d that decays as d increases. Any infected individual heals at rate 1. An epidemic is said to occur when, starting from one infected individual placed at some point, the infection has positive probability of lasting forever. Otherwise, we say extinction occurs. We investigate conditions on α under which the function f(d) = (d + 1)^(-α) gives rise to an epidemic. Joint work with Xavier Pérez and Nick Wormald.

Theoretical computer science

Vol. 616, p. 126-140

DOI: 10.1016/j.tcs.2015.12.030

Date of publication: 2016-02-22

Abstract:

We introduce the quad-kd tree: a general-purpose hierarchical data structure for the storage of multidimensional points. Quad-kd trees include point quad trees and kd trees as particular cases, and therefore they could constitute a general framework for the study of fundamental properties of similar trees. Moreover, quad-kd trees can be tuned by means of insertion heuristics and bucketing techniques to obtain trade-offs between their costs in time and space. We propose three such heuristics and we show analytically and experimentally their competitive performance. Our analytical results back the experimental outcomes and suggest that the quad-kd tree is a flexible data structure that can be tailored to the resource requirements of a given application.
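
The kd tree is one of the two classic structures the quad-kd tree subsumes; its insertion routine, sketched below, illustrates the kind of per-node splitting decision that quad-kd trees generalize (the paper's heuristics choose between kd-style and quad-style splits, which this sketch does not implement):

```python
class KdNode:
    """kd-tree node: one of the two classic structures (point quad trees
    and kd trees) that the quad-kd tree generalizes."""
    def __init__(self, point, axis):
        self.point = point   # k-dimensional tuple stored at this node
        self.axis = axis     # the single coordinate compared at this node
        self.left = None
        self.right = None

def insert(root, point, depth=0):
    """Standard kd-tree insertion, cycling through the coordinates by depth."""
    k = len(point)
    if root is None:
        return KdNode(point, depth % k)
    if point[root.axis] < root.point[root.axis]:
        root.left = insert(root.left, point, depth + 1)
    else:
        root.right = insert(root.right, point, depth + 1)
    return root

root = None
for p in [(5, 3), (2, 7), (8, 1), (6, 4)]:
    root = insert(root, p)
print(root.point, root.left.point, root.right.point)  # (5, 3) (2, 7) (8, 1)
```

A quad-style split would instead compare all k coordinates at once, producing up to 2^k children; a quad-kd tree may make either choice at each node.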