Conference Papers

Dias J., Cardote A., Neves F., Sargento S., Oliveira A.
IEEE Vehicular Networking Conference, VNC
2012
Abstract:
In the near future, vehicles will be equipped with WAVE-compliant communication technology, enabling not only safety message exchange, but also infotainment and Internet access. However, without complete market penetration, other technologies must still be used, such as IEEE 802.11g/n to connect to public Wi-Fi hotspots in the city, or even 3G and 4G cellular networks. Due to the high mobility of nodes in Vehicular Ad-hoc NETworks (VANETs), the connectivity time between nodes is very short; it is therefore essential to ensure the lowest possible handover time when moving between Road Side Units (RSUs) and other vehicles. In this paper, we implement a multi-technology seamless handover mechanism for vehicular networks that integrates extended mobility protocols based on MIPv6 and PMIPv6 with a mobility manager that provides seamless communication between vehicles and the infrastructure, selecting the best technology to keep the vehicle connected without breaking any active sessions. To validate and evaluate the proposed handover approaches, a real-world vehicular testbed was set up, combining three technologies: IEEE 802.11p, IEEE 802.11g and 3G; handover metrics were obtained for all combinations of these technologies. The results show that, if IEEE 802.11p is used in both vehicles and RSUs, the proposed approach performs seamless handover with very low delay and no packet loss. The same conclusions apply to handovers between different technologies.

Vavala B., Neves N., Steenkiste P.
Proceedings - 46th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2016
2016
Abstract:
Code identity is a fundamental concept for authenticated operations in Trusted Computing. In today’s approach, the overhead of assigning an identity to a protected service grows linearly with the service code size. In addition, service code size continues to grow to accommodate richer services. This trend negatively impacts either the security or the efficiency of current protocols for trusted execution. We present an execution protocol that breaks the dependency between the code size of the service and the identification overhead, without affecting security, and that works on different trusted components. This is achieved by computing an identity for each of the code modules that are actually executed, and then building a robust chain of trust that links them together for efficient verification. We implemented and applied our protocol to a widely deployed database engine, improving query-processing time by up to 2× compared to the monolithic execution of the engine.

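The chain-of-trust construction described above can be sketched with a short hash chain (an illustrative simplification in the style of TPM PCR extension; the measurement function, module names, and initial value are assumptions, not the paper's implementation):

```python
import hashlib

def module_identity(code: bytes) -> bytes:
    # measurement (identity) of a single code module
    return hashlib.sha256(code).digest()

def extend_chain(chain: bytes, code: bytes) -> bytes:
    # link the executed module into the chain of trust:
    # new_chain = H(old_chain || H(module))
    return hashlib.sha256(chain + module_identity(code)).digest()

# Only the modules that actually execute are measured, so the
# identification overhead tracks the executed code, not the whole service.
chain = b"\x00" * 32
for module in [b"query-parser", b"query-planner"]:
    chain = extend_chain(chain, module)
```

Because each link hashes the previous chain value, a verifier that replays the same module measurements in the same order reproduces the final value, while any substituted or reordered module changes it.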
Brandao L.T.A.N.
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
2013
Abstract:
A secure two-party computation (S2PC) protocol allows two parties to compute over their combined private inputs, as if intermediated by a trusted third party. In the malicious model, this can be achieved with a cut-and-choose of garbled circuits (C&C-GCs), where some GCs are verified for correctness and the remaining are evaluated to determine the circuit output. This paper presents a new C&C-GCs-based S2PC protocol, with significant advantages in efficiency and applicability. First, in contrast with prior protocols that require a majority of evaluated GCs to be correct, the new protocol only requires that at least one evaluated GC is correct. In practice this reduces the total number of GCs to approximately one third, for the same statistical security goal. This is accomplished by augmenting the C&C with a new forge-and-lose technique based on bit commitments with trapdoor. Second, the output of the new protocol includes reusable XOR-homomorphic bit commitments of all circuit input and output bits, thereby enabling efficient linkage of several S2PCs in a reactive manner. The protocol has additional interesting characteristics (which may allow new comparison tradeoffs), such as needing a low number of exponentiations, using a 2-out-of-1 type of oblivious transfer, and using the C&C structure to statistically verify the consistency of input wire keys.

Vavala B., Neves N., Steenkiste P.
Proceedings of the IEEE Symposium on Reliable Distributed Systems
2016
Abstract:
We show how to leverage trusted computing technology to design an efficient fully-passive replicated system tolerant to arbitrary failures. The system dramatically reduces the complexity of a fault-tolerant service, in terms of protocols, messages, data processing and non-deterministic operations. Our replication protocol enables the execution of a single protected service, replicating only its state, while allowing the backup replicas to check the correctness of the results. We implemented our protocol on Trusted Computing (TC) technology and compared it with two recent replication systems.

Boban M., Vinhoza T.T.V., Tonguz O.K., Barros J.
IEEE Communications Letters
2012
Abstract:
One of the stumbling blocks for implementation of Vehicular Ad Hoc Networks is the penetration rate: the percentage of vehicles that have the communication equipment installed. As the equipment deployment is unlikely to happen instantaneously, it is important to explore the performance gains achievable at low penetration rates. This especially pertains to safety applications, which are expected to provide life-saving information to all drivers on the road within a given region. We propose a technique that can be employed by safety applications to address the low penetration issue. By using visual cues on the equipped vehicles, such as specific patterns of hazard warning lights, we show that for all but the lowest vehicle densities, a radio penetration rate of 30% is sufficient to inform more than 95% of drivers in the region of interest in a timely manner.

Condessa F., Bioucas-Dias J.
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
2012
Abstract:
In this paper we introduce a new methodology to segment and detect colorectal polyps in endoscopic images obtained by a wireless capsule endoscopic device. The cornerstone of our approach is the fact that polyps are protrusions emerging from the colonic walls; they can therefore be segmented with simple curvature descriptors. Curvature is based on derivatives and is thus very sensitive to noise and image artifacts; furthermore, the acquired images are sampled on a grid, which further complicates the computation of derivatives. To cope with these degradation mechanisms, we use Local Polynomial Approximation, which simultaneously denoises the observed images and provides a continuous representation suitable for computing derivatives. On top of the image segmentation, we build a support vector machine to classify the segmented regions as polyps or non-polyps. The features used in the classifier are selected with a wrapper selection algorithm (a greedy forward feature selection algorithm with support vector machines). The proposed segmentation and detection methodology is tested in several scenarios, yielding very good results both when the same video sequences are used as training and testing data (cross-feature validation) and when different video sequences are used for training and testing.

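The curvature-from-derivatives step can be illustrated with a minimal sketch: a least-squares quadratic fit over a pixel window stands in for the paper's Local Polynomial Approximation, and mean and Gaussian curvature of the intensity surface follow from the fitted derivatives (function names and the window size are assumptions for illustration):

```python
import numpy as np

def local_quadratic_derivatives(patch):
    """Fit z = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f to a square
    intensity patch by least squares, and return the first and second
    derivatives of the fitted surface at the patch centre."""
    h = patch.shape[0] // 2
    ys, xs = np.mgrid[-h:h + 1, -h:h + 1]
    x, y, z = xs.ravel(), ys.ravel(), patch.ravel().astype(float)
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    a, b, c, d, e, _ = np.linalg.lstsq(A, z, rcond=None)[0]
    fx, fy = d, e                       # first derivatives at (0, 0)
    fxx, fxy, fyy = 2 * a, b, 2 * c     # second derivatives at (0, 0)
    return fx, fy, fxx, fxy, fyy

def mean_gaussian_curvature(patch):
    # standard curvature formulas for a graph surface z = f(x, y)
    fx, fy, fxx, fxy, fyy = local_quadratic_derivatives(patch)
    g = 1.0 + fx * fx + fy * fy
    K = (fxx * fyy - fxy * fxy) / (g * g)                 # Gaussian curvature
    H = ((1 + fy * fy) * fxx - 2 * fx * fy * fxy
         + (1 + fx * fx) * fyy) / (2 * g ** 1.5)          # mean curvature
    return H, K
```

For a patch sampled from the paraboloid z = x² + y² the quadratic fit is exact, giving H = 2 and K = 4 at the centre; protrusions such as polyps would show up as regions of consistently positive curvature.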
Zejnilovic S., Xavier J., Gomes J., Sinopoli B.
IEEE International Symposium on Information Theory - Proceedings
2015
Abstract:
In today’s large social and technological networks it is infeasible to observe all the nodes, so the source of a diffusion must be determined from observations at a subset of nodes. The probability of source localization error depends on the particular choice of observer nodes. We propose a criterion for observer node selection based on the minimal pairwise Chernoff distance between the observation distributions of different source candidates. The proposed approach is optimal for the fastest error decay with vanishing noise. Although suboptimal for non-negligible noise, we demonstrate through simulation that it achieves low error probability. We also analyze the effect of network topology on the resulting error by bounding the smallest Chernoff distance for some specific networks.

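A minimal sketch of the selection criterion, under a simple equal-variance Gaussian observation model (the model, the exhaustive search, and all function names are assumptions for illustration, not the paper's algorithm): for two Gaussians with the same covariance, the Chernoff distance is attained at s = 1/2 and reduces to ||μi − μj||² / (8σ²), and observers are chosen to maximize the minimal pairwise distance between source hypotheses.

```python
import itertools
import numpy as np

def chernoff_gauss_equal_cov(mu_i, mu_j, sigma2):
    # Chernoff distance between N(mu_i, sigma2*I) and N(mu_j, sigma2*I):
    # ||mu_i - mu_j||^2 / (8 * sigma2), the optimum being at s = 1/2.
    d = np.asarray(mu_i) - np.asarray(mu_j)
    return float(d @ d) / (8.0 * sigma2)

def select_observers(means, k, sigma2=1.0):
    """means: (n_sources, n_nodes) array of expected observations at each
    node under each candidate source. Exhaustively pick the k observer
    nodes maximizing the minimal pairwise Chernoff distance."""
    n_sources, n_nodes = means.shape
    best, best_score = None, -1.0
    for subset in itertools.combinations(range(n_nodes), k):
        cols = means[:, subset]
        score = min(
            chernoff_gauss_equal_cov(cols[i], cols[j], sigma2)
            for i, j in itertools.combinations(range(n_sources), 2)
        )
        if score > best_score:
            best, best_score = subset, score
    return best, best_score
```

Exhaustive enumeration is only feasible for small networks; the sketch is meant to make the criterion concrete, not efficient.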
Ribeiro R., Marujo L., De Matos D.M., Neto J.P., Gershman A., Carbonell J.
SIGIR 2013 - Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval
2013
Abstract:
In general, centrality-based retrieval models treat all elements of the retrieval space equally, which may reduce their effectiveness. In the specific context of extractive summarization (or important passage retrieval), this means that these models do not take into account that information sources often contain lateral issues, which are rarely as important as the description of the main topic, or are composed of mixtures of topics. We present a new two-stage method that starts by extracting a collection of key phrases, which are then used to support a centrality-as-relevance retrieval model. We explore several approaches to integrating the key phrases into the centrality model. The proposed method is evaluated using different datasets that vary in noise (noisy vs. clean) and language (Portuguese vs. English). Results show that the best variant achieves relative performance improvements of about 31% on clean data and 18% on noisy data.

Ferreira M., Damas L., Conceicao H., D'Orey P.M., Fernandes R., Steenkiste P., Gomes P.
IEEE Intelligent Vehicles Symposium, Proceedings
2014
Abstract:
Parking is a major problem of car transportation, with important implications for traffic congestion and the urban landscape. Reducing the space needed to park cars has led to the development of fully automated, mechanical parking systems. These systems are, however, rarely deployed because of their construction and maintenance costs. Leveraging semi- and fully-autonomous vehicle technology, as well as electric propulsion and vehicular ad hoc networking, we propose a new parking concept in which the mobility of parked vehicles is managed by a parking lot controller that collaboratively creates space for cars entering or exiting the parking lot. We show that the space needed to park such vehicles can be reduced to half of that required by conventional parking lot designs. We also show that the total travelled distance of vehicles in this new parking lot paradigm can be 30% less than in conventional parking lots. Our proposal can have important consequences for parking costs and the urban landscape.