Technical Program
Thursday, September 7, 2017
Registration - held at DoubleTree by Hilton Hotel Cluj – City Plaza, 9-13 Sindicatelor Street, Cluj-Napoca, Romania.
Welcome Party - held at DoubleTree by Hilton Hotel Cluj – City Plaza, Marco Polo Restaurant, Cluj-Napoca, Romania.
Friday, September 8, 2017
8:00: Registration
DoubleTree by Hilton Hotel Cluj – City Plaza, 9-13 Sindicatelor Street, Cluj-Napoca, Romania.
Banquet - held at DoubleTree by Hilton Hotel Cluj – City Plaza, Marco Polo Restaurant.
Saturday, September 9, 2017
8:00: Registration
DoubleTree by Hilton Hotel Cluj – City Plaza, 9-13 Sindicatelor Street, Cluj-Napoca, Romania.
Detailed Technical Program
Workshops
Bosch Student Workshop - Path Planning (Thursday, September 7, 14:00 - 15:50)
Location: Beijing Room, 5th floor
Deep Learning in Automated Driving (Thursday, September 7, 16:10 - 18:00)
Location: Beijing Room, 5th floor
Keynote Lectures
Plenary Presentation 1 (Friday, September 8, 09:00 - 09:50)
Location: Ballroom, 1st floor
Chair: Sergiu Nedevschi (Technical University of Cluj-Napoca, Romania)
Co-Chair: Rodica Potolea (Technical University of Cluj-Napoca, Romania)
Plenary Presentation 2 (Friday, September 8, 10:00 - 10:50)
Location: Ballroom, 1st floor
Chair: Sergiu Nedevschi (Technical University of Cluj-Napoca, Romania)
Co-Chair: Rodica Potolea (Technical University of Cluj-Napoca, Romania)
Intelligent Systems
Intelligent Systems 1 (Friday, September 8, 11:10 - 12:30)
Location: Venezia Room, 5th Floor
Chair: Gabriela Czibula (Babeș-Bolyai University, Romania)
Co-Chair: Rodica Potolea (Technical University of Cluj-Napoca, Romania)
11:10 - 11:30
HACGA: An artifacts-based clustering approach for malware classification
Oliviu-Bogdan Boțocan, Gabriela Czibula (Babeș-Bolyai University, Romania)
Abstract: More and more sophisticated malware attacks are developed nowadays, and new variants of existing malicious software are released daily. Malware clustering is often applied to identify patterns of malicious software, with similar samples being grouped together and considered variants of the same malware family. In this paper we propose an automated technique based on agglomerative hierarchical clustering, combined with a supervised learning method for parameter optimization, which helps determine samples that exhibit the same behavior, allowing malware analysts to uncover new and interesting threats. The proposed method relies on behavioral and attack pattern analysis. Despite the complexity of today's malicious software, the attacks of the same malware family are very similar in terms of actions performed on the infected system. The experimental evaluation is carried out on a real case study and the results are analyzed, interpreted and compared to those of similar existing approaches. Our experiments demonstrate the capability of the proposed clustering method to accurately identify groups of similar malware samples, which are very likely to represent malware families.
11:30 - 11:50
Ensemble classifiers for supervised anomaly based network intrusion detection
Valentina Timčenko, Slavko Gajin (University of Belgrade, Serbia)
Abstract: This paper focuses on the problem of machine learning classifier choice for network intrusion detection, taking into consideration several ensemble classifiers from the supervised learning category. We have evaluated the Bagged trees, AdaBoost, RUSBoost, LogitBoost and GentleBoost algorithms, provided an analysis of the performance of the classifiers and compared their learning capabilities, taking the UNSW-NB15 dataset as reference. The obtained results indicate that, in the defined environment and under the analyzed conditions, Bagged trees and GentleBoost perform with the highest accuracy and ROC values, while RUSBoost has the lowest performance.
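As an illustration of how such a comparison can be set up (not the authors' code: scikit-learn stand-ins replace two of the ensembles named above, and a synthetic imbalanced dataset replaces UNSW-NB15; RUSBoost, LogitBoost and GentleBoost have no direct scikit-learn equivalents):

    # Illustrative only: compare two ensemble classifiers by accuracy and ROC AUC,
    # on a synthetic imbalanced binary dataset standing in for intrusion data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.metrics import accuracy_score, roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=5000, n_features=40, weights=[0.9, 0.1],
                               random_state=0)          # imbalanced classes
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    models = {
        "Bagged trees": BaggingClassifier(DecisionTreeClassifier(), n_estimators=100),
        "AdaBoost": AdaBoostClassifier(n_estimators=100),
    }
    for name, clf in models.items():
        clf.fit(X_tr, y_tr)
        scores = clf.predict_proba(X_te)[:, 1]
        print(name,
              "accuracy=%.3f" % accuracy_score(y_te, clf.predict(X_te)),
              "ROC AUC=%.3f" % roc_auc_score(y_te, scores))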
11:50 - 12:10
Semantic Data Factory: A Framework for Using Domain Knowledge in Software Application Development
Claudia Pop, Alexandra Craciun, Carla Knoblau, Marcel Antal, Dorin Moldovan, Tudor Cioara, Ionut Anghel, Ioan Salomie (Technical University of Cluj-Napoca, Romania)
Abstract: This paper addresses the semantic gap between the domain knowledge and software application engineering by proposing a framework for mapping and integrating multiple heterogeneous data sources with application business logic by means of data semantic enrichment, aggregation, filtering and processing. Based on the main drawbacks identified in the current knowledge enhanced software application architectures, a generic framework for automating their development process is proposed. The framework reduces the implementation stage of complex applications to a simple task of editing a configuration file which can be performed even by the domain expert himself. Two use cases of the proposed framework are presented for developing semantically-enhanced applications. The first use case is described within the FP7 GEYSER research project in the context of energy-efficiency, while the second use-case is presented in the e-health context for generating applications over the Fitbit platform.
12:10 - 12:30
Building a cleanset. An artificial intelligence perspective of identifying network installers.
Mihai Leonte, Dragoș Teodor Gavriluț, Nicolae Postolachi ("Al. I. Cuza" University of Iași and Bitdefender Lab, Romania)
Abstract: With known malware growing to more than 600 million samples this year, according to AV-TEST, every security solution has developed different methods for detecting malicious content. Whether the method consists of signature-based detection, emulation, heuristics or different machine learning techniques, one thing remains constant in this procedure: the need for a cleanset - a large collection of clean files that will be used to test the above-mentioned detection methods. Initially, gathering files for a cleanset was merely a hardware resources problem: the more hardware and bandwidth you had, the more files you could download. However, in the last couple of years, a simple download is not enough. Now, a large portion of the clean files are obtained via a network installer. This means that downloading just the network installer is not enough; one needs to understand whether a specific installer is a network installer and, in that case, run it in a virtual machine to extract the new files. This paper aims to analyze how different machine learning mechanisms can be used to solve this problem. We used more than 1,000,000 non-downloader kits and more than 250,000 network installers and trained different algorithms to learn how to automatically differentiate between those two categories.
Intelligent Systems 2 (Friday, September 8, 13:40 - 15:00)
Location: Venezia Room, 5th Floor
Chair: Ioan Alfred Leția (Technical University of Cluj-Napoca, Romania)
Co-Chair: Adrian Groza (Technical University of Cluj-Napoca, Romania)
13:40 - 14:00
Analysing debates on climate change with textual entailment and ontologies
Roxana Szabo, Adrian Groza (Technical University of Cluj-Napoca, Romania)
Abstract: The difficult task of recognising textual entailment aims to check whether a natural language text T entails a smaller statement H. Current methods rely on machine learning and various lexical resources. Our aim is to include domain knowledge when searching for entailment or non-entailment. As most available knowledge comes in the form of ontologies, we focused on translating description logic axioms into lexical rules suitable for existing textual entailment algorithms. We apply the developed system in the climate change domain, where many pro and counter arguments exist. The performed experiments indicate an increase in performance when domain knowledge is included in the existing textual entailment algorithms.
14:00 - 14:20
On Explaining Inconsistencies in Multi-Context Streams as Support for the Human User
Octavian Pop, Ioan Alfred Letia (Technical University of Cluj-Napoca, Romania)
Abstract: Multi-Context Systems (MCSs) provide a framework for reasoning over continuous input streams with heterogeneous knowledge sources. To perform reasoning on multiple knowledge sources that need to be integrated, we have to tackle the problem of potential inconsistencies. By applying transformations on the knowledge base of each context we generate a Main Knowledge Base (MKB) in Description Logics (DL) and use a reasoner to find the causes of inconsistency, when that is the case. The justifications from the reasoner are used to find repairs in the MKB, making it consistent, to help the user in the diagnosis of the error and in the explanation of the inter-relationships between the different concepts in play.
14:20 - 14:40
Conceptual Graph driven modeling and querying methods for RDBMS and XML databases
Andrea Eva Molnar, Viorica Varga, Christian Săcărea, Dan Cîmpan, Bogdan Mocian (Babeș-Bolyai University, Romania)
Abstract: Relational Database Management Systems (RDBMS) still hold most of the market share, almost 40 years after their first release. Nevertheless, the variety of data has imposed the enrichment of traditional RDBMS with a large variety of systems known as NoSQL. Visual query systems (VQSs) are query systems for databases that use visual representations of the structure of databases and of the queries. VQSs are designed to improve the effectiveness of human-computer communication, offering valuable graphical support for communication and further analysis. Conceptual Graphs are a particular system of logic based on the existential graphs of Ch. S. Peirce and the semantic networks of Artificial Intelligence. They express meaning in a logically precise form which is computer tractable and human readable. In this article we present relational database and semi-structured data modeling based on Conceptual Graphs. We also discuss a Conceptual Graphs based query designer for the relational data model and for XQuery. The expressive power of Conceptual Graphs gives a natural and intuitive tool for database structure design and database querying. The novelty of this work is an intuitive graphical web application which represents the XML data structure in the form of Conceptual Graphs and gives the possibility of constructing queries, also in the form of Conceptual Graphs, on the selected data structure.
14:40 - 15:00
Climebot: an argumentative agent for climate change
Daniel Toniuc, Adrian Groza (Technical University of Cluj-Napoca, Romania)
Abstract: While climate experts have agreed that global warming is real, this consensus has not reached all levels of society. Our aim is to develop a conversational agent able to explain issues related to global warming. The developed chatbot relies on textual entailment to identify the best answer for a statement conveyed by a human agent. To enhance the conversational capabilities we employed the technical instrumentation provided by the API.AI framework. To exploit domain knowledge, the agent uses climate change ontologies converted into an adequate format for the API.AI model. Hence, we developed Climebot, an argumentative agent for climate change based on ontologies and textual entailment.
Intelligent Systems 3 (Friday, September 8, 15:15 - 16:35)
Location: Venezia Room, 5th Floor
Chair: Mariana Mocanu (University Politehnica of Bucharest, Romania)
Co-Chair: Mihaela Dînșoreanu (Technical University of Cluj-Napoca, Romania)
15:15 - 15:35
Detection and Classification of Pilots Cognitive State using EEG
Qasim Ali Khan, Ali Hassan (National University of Sciences and Technology, Pakistan)
Abstract: Electroencephalogram (EEG) data is a set of brain signals recorded by special EEG headsets. These signals reflect the cortical electrical activity. The use of EEG data has emerged as a safe and portable non-invasive Brain Computer Interface (BCI) that can easily be used for studying human cognitive states. In this paper we focus on studying the pilot's cortical potentials in a simulated flight environment in order to classify his mental state into three categories: rest mode, navigation flying mode, and dogfight mode. A 14-channel Emotiv EEG headset was used by the subjects while playing a fighter aircraft game which could simulate all the required scenarios. The subject was screened in a dark room with a large projector screen along with audio stimuli. Several sessions of EEG data were recorded and feature extraction was carried out. The Random Forest classification algorithm proved to produce the best results. The pilot's cognitive state was classified according to the labeled recordings by taking one second of data at a time and classifying it. As a result, 81.7% accuracy was achieved. The decent accuracy of the results proves that the pilot's cognitive state can be decoded effectively in real time and, if transmitted live to the ground command and control room, it can be utilized for ensuring the pilot's safety as well as for training and monitoring of the pilot's on-board activities.
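A minimal sketch of the one-second-window classification setup described above; only the 14-channel / 128 Hz figures match the Emotiv headset, while the synthetic signal, the simple per-channel statistics and the hyperparameters are placeholders:

    # Illustrative only: split multi-channel EEG into 1 s windows, build a feature
    # vector per window, and classify the windows with a random forest.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    fs, n_channels = 128, 14
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((n_channels, fs * 600))      # 10 minutes of fake EEG
    labels_per_second = rng.integers(0, 3, size=600)        # rest / navigation / dogfight

    def window_features(signal, fs):
        """Split into 1 s windows and compute simple per-channel statistics."""
        n_win = signal.shape[1] // fs
        wins = signal[:, :n_win * fs].reshape(signal.shape[0], n_win, fs)
        return np.concatenate([wins.mean(axis=2), wins.std(axis=2)], axis=0).T

    X = window_features(eeg, fs)                            # shape (600, 2 * n_channels)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, labels_per_second, cv=5).mean())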
15:35 - 15:55
Artifact detection in EEG using machine learning
Elena Nedelcu, Raluca Portase, Ramona Tolas (Technical University of Cluj-Napoca, Romania), Raul Cristian Muresan (Center for Cognitive and Neuronal Studies, Romania), Mihaela Dinsoreanu, Rodica Potolea (Technical University of Cluj-Napoca, Romania)
Abstract: Electroencephalography (EEG) records vast amounts of human cerebral activity, yet the data is still reviewed primarily by human readers. Most of the time, the data is contaminated with signals of non-cerebral origin, called artifacts, which can be very difficult to detect visually and which, if undiscovered, can damage the analysis of the neural information. The purpose of our work is to detect the artifacts by identifying the most relevant features, both in the temporal and frequency domains, and training various supervised learning algorithms (Decision Trees, SVM and KNN) to distinguish between clean and contaminated signals. The performance of our method exceeds the ones reported in the literature, with a detection accuracy of 98.78%, a precision of 98.30% and a recall of 98.40% for the best settings we found.
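A small illustration of mixed temporal and frequency-domain EEG features of the kind mentioned above, using Welch's method from SciPy; the chosen statistics and band limits are assumptions, not the authors' exact feature set:

    # Illustrative only: a few temporal statistics plus band power in classical EEG bands.
    import numpy as np
    from scipy.signal import welch

    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def segment_features(x, fs):
        """Feature vector for one single-channel EEG segment."""
        feats = [x.mean(), x.std(), np.abs(np.diff(x)).mean()]     # temporal features
        f, pxx = welch(x, fs=fs, nperseg=min(len(x), fs * 2))       # power spectral density
        for lo, hi in BANDS.values():
            feats.append(pxx[(f >= lo) & (f < hi)].sum())           # band power
        return np.array(feats)

    fs = 250
    segment = np.random.default_rng(0).standard_normal(fs * 2)      # 2 s of fake EEG
    print(segment_features(segment, fs))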
15:55 - 16:15
Detecting depression from fMRI using Relational Association Rules and Artificial Neural Networks
Diana-Lucia Miholca, Adrian Onicaș (Babeș-Bolyai University, Romania)
Abstract: Functional Magnetic Resonance Imaging (fMRI) methods are used to identify brain abnormalities associated with psychiatric disorders. One of the most prevalent psychiatric disorders is depression, which is marked by blunted reward sensitivity, brain function becoming different from that of healthy individuals in dopamine-irrigated fronto-striatal regions. In the current paper we approach, from a machine learning perspective, depression detection using fMRI data of 19 depressed and 20 healthy participants during a music listening sequence, a stimulus that has been shown to elicit reward-like responses in the brain. To this end, and in a context in which machine learning approaches to the problem are very limited, we propose novel models based on Relational Association Rules and Artificial Neural Networks. Using the present data set, we performed leave-one-out cross-validation for the methods we propose. The experimental results are promising, the comparison with related work favoring our solutions.
16:15 - 16:35
Particle Swarm Optimization for facility layout problems FLP – A comprehensive study
Mostafa Abdulghafoor Mohammed (University Politehnica of Bucharest, Romania), Raed Abdulkareem Hasan (Northern Technical University, Iraq)
Abstract: Regarding the facility layout problem (FLP) and its combinatorial problems, a research gap is detected, since none of the reviewed papers gives a deeper insight into the potential of Particle Swarm Optimization (PSO) in that particular field. In this paper, a brief introduction to the most promising approaches so far to facility layout related topics is provided. The paper then illustrates some of those in more detail. Moreover, we examine PSO modifications and extensions that could contribute to optimization methods in FLP, which mostly conform to NP-hard combinatorial problems. Future research areas are identified in Construction Site Facility Layout Problems, Multi-Criteria Facility Layout Problems and Dynamic Facility Layout Problems.
Intelligent Systems 4 (Friday, September 8, 16:50 - 18:30)
Location: Venezia Room, 5th Floor
Chair: Nicolae Țăpuș (University Politehnica of Bucharest, Romania)
Co-Chair: Camelia Lemnaru (Technical University of Cluj-Napoca, Romania)
16:50 - 17:10
A Transition-based Approach for AMR Parsing using LSTM Networks
Silviana Cimpian, Andreea Lazar, Florin Macicasan, Camelia Lemnaru (Technical University of Cluj-Napoca, Romania)
Abstract: We present a procedure for generating Abstract Meaning Representation (AMR) structures from English sentences based on a transition-based system. Our proposed solution makes use of Long Short Term Memory networks to learn the action sequence that needs to be applied on the sentence in order to obtain the AMR graph. The action set is an extension of the arc-standard dependency parser, with several actions being altered to support action labels - necessary in the AMR graph. We apply several pre-processing steps on the sentences, to handle special language constructs - such as named entities, dates or monetary values - as well as a series of post-processing actions to restore the appropriate AMR nodes.
17:10 - 17:30
Formal Concept Analysis of a Romanian Emotion Lexicon
Mihaiela Lupea, Anamaria Briciu (Babeș-Bolyai University, Romania)
Abstract: Semantic lexicons are widely used resources in the implementation of solutions for a number of Natural Language Processing tasks. However, availability of these lexicons is limited for most languages other than English, including Romanian. The aim of this paper is to bridge this gap by processing and improving a translated lexicon of Romanian words tagged with Plutchik’s eight emotions and polarity data. We present the development of RoEmoLex (Romanian Emotion Lexicon) from early stages to an in depth analysis of its content using Formal Concept Analysis. Results show that the lexicon could represent a stepping stone in the creation of efficient emotion analysis systems for the Romanian language.
17:30 - 17:50
On a Diacritics Due Part-of-Speech Tagging Ambiguity in Romanian
Radu Razvan Slavescu, Melania Roxana Raduly, Calin Cenan (Technical University of Cluj-Napoca, Romania)
Abstract: In this paper we investigate the ambiguity caused by the lack of diacritics when doing Part-of-Speech tagging in Romanian. This means that some words, if written without diacritics, could have more than one associated Part-of-Speech tag, even if this is not the case when the diacritics are employed. A method for dealing with this problem is proposed. By developing an existing Hidden Markov Model and employing a large dictionary and trigram set, the solution allows Part-of-Speech tagging for Romanian even if the words are written without diacritics. The performance is evaluated on a set of sentences manually built to illustrate the situation.
17:50 - 18:10
Comparison of FPNNs Models Approximation Capabilities and FPGA Resources Utilization
Martin Krčma, Zdenek Kotasek, Jakub Lojda (Brno University of Technology, Czech Republic)
Abstract: This paper presents the concepts of FPNA and FPNN, used for the approximation of artificial neural networks in FPGAs, and introduces derived types of these concepts used by the authors. The process of transforming a trained artificial neural network into an FPNN is described. The diagram of the FPGA implementation is presented. The results of experiments determining the approximation capabilities of FPNNs are presented and the FPGA resource utilization is compared.
18:10 - 18:30
Cloudifier Virtual Apps - Virtual desktop predictive analytics apps environment based on GPU computing framework
Andrei Ionut Damian, Alexandru Purdila (Cloudifier SRL, Romania), Nicolae Tapus (University Politehnica of Bucharest, Romania)
Abstract: The need for systems capable of conducting inferential analysis and predictive analytics is ubiquitous in a global information society. With the recent advances in the areas of predictive machine learning models and massive parallel computing a new set of resources is now potentially available for the computer science community in order to research and develop new truly intelligent and innovative applications. In the current paper we present the principles, architecture and current experimentation results for an online platform capable of both hosting and generating intelligent applications -- applications with predictive analytics capabilities.
Intelligent Systems 5 (Saturday, September 9, 09:00 - 10:40)
Location: Venezia Room, 5th Floor
Chair: Călin Cenan (Technical University of Cluj-Napoca, Romania)
Co-Chair: Camelia Chira (Technical University of Cluj-Napoca, Romania)
09:00 - 09:20
Using Self-Organising Maps for Pareto-Ranking in Genetic Multiobjective Optimisations
Lavinia Ferariu ("Gheorghe Asachi" Technical University of Iași, Romania)
Abstract: Numerous genetic algorithms with Pareto-ranking have been proposed for solving multiobjective optimisations (MOOs). Mainly, these algorithms compute the fitness values of the solutions via dominance analysis. For few conflicting objectives, dominance analysis is suitable for managing the partial sorting; however, this technique is not capable of handling other common requirements of MOOs, such as preserving the diversity of the population or intensifying the exploration in some preferred areas. Preferably, any complementary ranking techniques should be configured in relation to the objective layout, although investigating the map of solutions in the objective space is very difficult. This paper presents an alternative for analysing the distribution of the solutions during the evolutionary loop. In this regard, the population is clustered in the objective space via a SOM and the dominance of each solution is analysed relative to its neighbours. Based on this analysis, two measures are proposed for detecting the cases when the diversity should be enhanced and/or the exploration needs to be intensified around the middle of the best Pareto fronts. The approach permits triggering runtime alerts, for which specific ranking policies are introduced. The suggested adaptive ranking procedure is experimentally illustrated for bi-objective optimisations defined for both strongly and weakly conflicting objectives.
09:20 - 09:40
Machine Learning for Sensor-Based Manufacturing Processes
Dorin Moldovan, Tudor Cioara, Ionut Anghel, Ioan Salomie (Technical University of Cluj-Napoca, Romania)
Abstract: The increasing availability of relevant information, events and constraints in the environment of modern factories, due to the deployment of IoT sensor technologies on the production line, has led to an "explosion" in contextual big data. At the same time, the advancements in the machine learning field in recent years have opened new approaches for the analysis of manufacturing process datasets, which are characterized by noisy data, a large number of features and an imbalanced classification of the samples. In this paper we investigate the applicability and the impact of machine learning techniques for managing production processes, considering the data from a semiconductor manufacturing process (SECOM dataset). We have applied algorithms such as Boruta and MARS for the selection of the most relevant features, and Random Forest and Gradient Boosted Trees for sample classification. The results show better values for precision when the features are selected using Boruta and MARS rather than PCA, and better values for accuracy when the data is unsampled and classified using Random Forest and Logistic Regression rather than Gradient Boosted Trees.
09:40 - 10:00
Optimizing the Generation of Personalized Healthy Menus for Elderly People Using a Crab Breeding Inspired Method
Viorica Rozina Chifu, Emil St. Chifu, Cristina Bianca Pop, Ioan Salomie, Alexandru Niculici (Technical University of Cluj-Napoca, Romania)
Abstract: In this paper we present a method for generating healthy diets for elderly people. The method proposed is based on the Crab Mating Optimization Algorithm, which is inspired from the breeding behavior of crabs in nature. In our case the generated diet is composed of several meals per day and it can be created for a number of 7 days. In generating a healthy diet we have considered the elder’s food preferences as well as the dietary restrictions associated with the diseases the elder suffers from. The method proposed has been integrated into an experimental prototype and evaluated on a set of profiles describing older people suffering from different diseases like diabetes and heart problems.
10:00 - 10:20
Chebyshev-Based Iterated Local Search for Multi-Objective Optimization
Imen Ben Mansour, Ines Alaya, Moncef Tagina (University of Manouba, Tunisia)
Abstract: Recently, scalarizing-function based approaches have received renewed interest in multi-objective optimization. In fact, the simplicity of these functions allows them to be easily adapted to any resolution method. In this contribution, we propose an iterated multi-objective local search algorithm, called MoLSAugWT, based on a well-known and frequently used scalarizing function: the augmented weighted Chebyshev function. A neighborhood structure employing a ranking algorithm based on a weighted addition ratio is introduced in this work. The proposed ranking algorithm is integrated in MoLSAugWT in order to try to establish an order between the candidate items. Its goal is not only to focus the search on the most promising areas of the search space, thus improving the quality of the obtained solutions, but also to help the population converge while diminishing the computational time during the optimization process. To evaluate the performance of the proposed algorithm, MoLSAugWT is applied to the multi-objective multidimensional knapsack problem (MOMKP). Experimental results show the efficiency of MoLSAugWT compared to state-of-the-art algorithms on nine well-known benchmark instances of MOMKP.
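For reference, the augmented weighted Chebyshev scalarizing function referred to above is commonly written in the multi-objective optimization literature as (the paper's exact notation may differ):

    g_{aug}(x \mid \lambda, z^{*}) \;=\; \max_{1 \le i \le m} \lambda_i \left| f_i(x) - z_i^{*} \right| \;+\; \rho \sum_{i=1}^{m} \left| f_i(x) - z_i^{*} \right|

where z^{*} is a reference (ideal) point, \lambda a non-negative weight vector and \rho > 0 a small augmentation coefficient.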
10:20 - 10:40
A proposed Whale Search Algorithm with Adaptive Random Walk
Eid Emary (Cairo University, Egypt), Hossam M. Zawbaa (Babeș-Bolyai University, Romania), Mustafa Abdul Salam (Benha University, Egypt)
Abstract: In this paper, a variant of the recently introduced whale optimization algorithm (WOA) is proposed, based on adaptive switching of the random walk per individual search agent. WOA is a recently proposed bio-inspired optimizer that employs two different random walks. The original optimizer stochastically switches between the two random walks at each iteration, regardless of the search agent's performance and of the fitness terrain around it. In the proposed adaptive walk whale optimization algorithm (AWOA), an adaptive switching between the two random walks is recommended, based on the agent's performance. Moreover, a random explorative switch of the walk is applied to allow search agents to try different walks. The proposed AWOA was benchmarked using 29 standard test functions with uni-modal, multi-modal, and composite test functions. Performance over such functions proves the capability of the proposed variant to outperform the original WOA.
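The sketch below illustrates the idea of switching between WOA's two random walks per agent; the position updates follow the standard WOA equations, while the performance-based adaptation of the switching probability is only a guess at the mechanism summarized above, not the paper's actual rule:

    # Illustrative sketch: whale optimization with a per-agent adaptive choice between the
    # spiral (bubble-net) walk and the encircling/exploratory walk. The adaptation rule
    # and the greedy replacement are assumptions made for illustration.
    import numpy as np

    def awoa(fitness, dim=10, n_agents=20, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(-5.0, 5.0, (n_agents, dim))
        fit = np.array([fitness(x) for x in X])
        p_spiral = np.full(n_agents, 0.5)              # per-agent switching probability
        best_idx = fit.argmin()
        best, best_fit = X[best_idx].copy(), fit[best_idx]
        for t in range(iters):
            a = 2.0 * (1.0 - t / iters)                # decreases linearly from 2 to 0
            for i in range(n_agents):
                A = 2.0 * a * rng.random(dim) - a
                C = 2.0 * rng.random(dim)
                use_spiral = rng.random() < p_spiral[i]
                if use_spiral:                         # spiral walk around the best agent
                    l = rng.uniform(-1.0, 1.0, dim)
                    x_new = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
                else:                                  # encircling or exploratory walk
                    ref = best if np.all(np.abs(A) < 1.0) else X[rng.integers(n_agents)]
                    x_new = ref - A * np.abs(C * ref - X[i])
                f_new = fitness(x_new)
                improved = f_new < fit[i]
                # adapt the switch: favour the walk that just improved this agent
                step = 0.05 if improved else -0.05
                p_spiral[i] = np.clip(p_spiral[i] + (step if use_spiral else -step), 0.1, 0.9)
                if improved:
                    X[i], fit[i] = x_new, f_new
                    if f_new < best_fit:
                        best, best_fit = x_new.copy(), f_new
        return best, best_fit

    print(awoa(lambda x: float(np.sum(x ** 2)))[1])    # sphere function, should approach 0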
Intelligent Systems 6 (Saturday, September 9, 10:50 - 12:30)
Location: Venezia Room, 5th Floor
Chair: Ioan Salomie (Technical University of Cluj-Napoca, Romania)
Co-Chair: Anca Mărginean (Technical University of Cluj-Napoca, Romania)
10:50 - 11:10
Requirement Dependencies–based Formal Approach for Test Case Prioritization in Regression Testing
Andreea Vescan, Camelia Șerban, Camelia Chisăliță-Crețu, Laura Dioșan (Babeș-Bolyai University, Romania)
Abstract: Regression testing is the testing activity performed after changes occurred on software. Its aim is to increase confidence that achieved software adjustments have no negative impact on the already functional parts of the software. Test case prioritization is one technique that could be applied in regression testing with the aim to find faults early, resulting in reduced cost and shorten time of testing activities. Thus, prioritizing in the context of regression testing means to re-order test cases such that high priority ones are run first. The current paper addresses the test case prioritization as a consistent part of a larger approach on regression testing, which combines both test case prioritization and test case selection in order to overcome the limitations of each of them. A comprehensive formalization of test case prioritization is provided, incorporating beside the well known ingredients (test case, test requirement, fault, cost) also elements relating to the functional requirements and dependencies between requirements. An evolutionary algorithm is used to construct the re-ordering of test cases, considering as optimization objectives fault detection and cost. A synthetic case study was used to empirically prove our perspective for test case prioritization approach.
11:10 - 11:30
A System for Detecting Professional Skills from Resumes Written in Natural Language
Emil St. Chifu, Viorica Rozina Chifu, Iulia Popa, Ioan Salomie (Technical University of Cluj-Napoca, Romania)
Abstract: In this paper, we present a new method for detecting professional skills (as noun phrases) from resumes written in natural language. The proposed method uses an ontology of skills, the Wikipedia encyclopedia, and a set of standard multi word part-of-speech patterns in order to detect the professional skills. First, the method checks to see if there are, in the text of the resumes, skills that are concepts in our ontology. The method also tries to identify possible new skills, which are not present in our ontology. This is done with the help of some specific, lexicalized, multi-word expression patterns (i.e. specific contexts) that could surround new, unknown skills. The specific expression patterns (specific contexts) are induced by training from a corpus of resumes. This induction of the possible specific contexts for new skills is based on a set of standard, generic part-of-speech patterns (found by hand) that usually contain the skills already present in the ontology. Hence our skill extraction method is based on a bootstrapping approach. The newly detected skills are validated by a human expert and then inserted automatically into the skill ontology. Populating the ontology with the new skills is performed with the help of the Wikipedia encyclopedia. The method proposed has been tested on a set of resumes written by users as well as on a corpus collected by automatically extracting resumes from specific Web sites.
11:30 - 11:50
An approach to software development effort estimation using machine learning
Vlad-Sebastian Ionescu (Babeș-Bolyai University, Romania)
Abstract: We introduce a machine learning approach for real life software development effort estimation. Our method uses state of the art developments such as distributed word embeddings in order to create a system that can estimate effort given only basic project management metrics and, most importantly, textual descriptions of tasks. We use an artificial neural network for automating the effort estimation task. We evaluate our method on genuine software project data from a software company, obtaining results that surpass some of the related literature and a system that promises much easier integration into any software management tool that stores textual descriptions of tasks.
11:50 - 12:10
Ontology-Based Skill Matching Algorithms
Teodor Petrican, Ciprian Adrian Stan, Marcel Antal, Ioan Salomie, Tudor Cioara, Ionut Anghel (Technical University of Cluj-Napoca, Romania)
Abstract: Automatic recommendations based on skill matching techniques can prove to be an important component of an online recruitment platform, being able to lower the costs for employers, ease the process for candidates and increase the hiring quality overall. This is important nowadays, when online recruitment plays a major role in the hiring process. The main challenges in this area consist in providing relevant and computationally inexpensive results. In this paper we propose a semantic approach to the skill matching problem in the context of online recruitment. We present a metric of similarity based on a skills ontology and three algorithms on top of this metric, with the intent of obtaining a ranked skill-based matching between candidates and job offers. We also provide an analysis of those algorithms in terms of advantages, disadvantages and complexity.
12:10 - 12:30
Predicting Political Opinions in Social Networks with User Embeddings
Ciprian Tălmăcel, Florin Leon ("Gheorghe Asachi" Technical University of Iași, Romania)
Abstract: As social media becomes the prevalent way of expressing one’s opinions, the capacity to analyze user behavioral patterns and trends becomes increasingly important. In this work we propose a method for embedding the opinions of social media users in low-dimensional vectors and explore some potential applications. Our proposal is general, in the sense that it can be easily adapted to a variety of use cases and social networks.
Intelligent Systems 7 (Saturday, September 9, 10:50 - 12:30)
Location: Beijing Room, 5th Floor
Chair: Tiberiu Leția (Technical University of Cluj-Napoca, Romania)
Co-Chair: Radu Răzvan Slăvescu (Technical University of Cluj-Napoca, Romania)
10:50 - 11:10
Platform for Unified Enhanced Time Petri Net Models
Attila Ors Kilyen, Tiberiu S. Letia (Technical University of Cluj-Napoca, Romania)
Abstract: Unified Enhanced Time Petri Nets (UETPNs) are a new type of modeling platform conceived to control cyber-physical systems. They combine the features of regular Petri-nets and timed Petri-nets, describing fuzzy logic rules and arithmetical operators. They are capable of reacting or generating asynchronous events and continuous signals. In order to make the definition of UETPN models convenient, a domain specific language called UnifiedPLang was developed. The use of the UETPN models and UnifiedPLang is exemplified with an on/off controller. The execution algorithm of UETPN models is described with emphasis on speed improvements measured by various benchmark problems.
11:10 - 11:30
Time series - a taxonomy based survey
Octavian Lucian Hasna, Rodica Potolea (Technical University of Cluj-Napoca, Romania)
Abstract: The interest in time-series classification has increased in the last decade. Most of the proposals introduced new algorithms for classification, clustering and prediction. Most of the algorithms deal with two major aspects: dimensionality reduction techniques (e.g., Piecewise Aggregate Approximation, Symbolic Aggregate Approximation) and similarity measures (e.g., Euclidean distance, Dynamic Time Warping). In this article, we give an overview of the advantages and disadvantages of these algorithms.
11:30 - 11:50
GolfEngine: Network Management System for Software Defined Networking
QianQian Li (University of Padova, Italy), Reza Mohammadi (Shiraz University of Technology, Iran), Mauro Conti (University of Padova, Italy), ChuanHuang Li (Zhejiang GongShang University, China), XiaoLin Li (University of Florida, USA)
Abstract: Software Defined Networking (SDN) is a new networking paradigm which provides better decoupling between the control plane and the data plane. The separation not only allows OpenFlow (OF) switches in the data plane to simply forward data, but also enables the centralized programmable controller to control the behavior of the entire network. SDN makes it possible to manage the network in a more flexible and simple way. However, while promising, current SDN frameworks also face new security challenges regarding network management. In this paper, we propose an innovative framework named GolfEngine, based on the OpenDaylight controller, to simplify the development and deployment of security applications for SDN networks. GolfEngine provides better visibility over the tasks for handling network anomalies on demand. In addition, we propose two important components that assure the robustness, the correctness and the efficiency of the GolfEngine framework. The first component is Policy Conflict Detection, which is efficient and robust in discovering and reconciling contradictory flow rules. The second one is the Network Status Coordinator, which focuses on simplifying and improving the efficiency of the communication between the controller and OF-enabled switches. Moreover, we evaluate the performance and execution efficiency of GolfEngine through a use case implementation. The results of our simulation underline that these two components contribute significantly to improving the efficiency of GolfEngine.
11:50 - 12:10
A Parallel Evolutionary Approach to Community Detection in Complex Networks
Marius Joldos, Camelia Chira (Technical University of Cluj-Napoca, Romania)
Abstract: The problem of community detection in complex networks is of high interest in many application domains including sociology, biology, mathematics and economy. Given a set of nodes and links between them, the aim of the problem is to find a grouping of nodes such that a strong community has dense intra-connections and sparse outside community links. In this paper, a coarse-grained evolutionary algorithm (EA) is developed to address this challenging problem. Several populations of potential solutions are evolved in parallel in an island model and periodically exchange certain individuals. Each population can be evolved by a different fitness function and several approaches to evaluate the community structure are considered in the current paper. Experiments are performed for real-world complex networks and results are analysed based on the normalized mutual information between the detected and the known community structure. Comparisons with the standard version of the EA based on different fitness functions are performed and the results confirm a good performance of the parallel EA in terms of solution quality and computational time.
12:10 - 12:30
OpToGen - A Genetic Algorithm based Framework for Optimal Topology Generation for Linear Networks
Adil A. Sheikh, Emad Felemban, Ahmad Alhindi, Atif Naseer (Umm Al-Qura University, Saudi Arabia), Mukhtar Ghaleb (Bisha University, Saudi Arabia), Ahmed Lbath (University of Grenoble, France)
Abstract: Smart transportation is one of the essential components of smart cities that involves sensing traffic and pedestrians. Wireless Sensor Networks (WSN) have extensively been utilized over the years for sensing and data transfer in diverse structural deployments including mesh, ad hoc and hierarchical layouts. Several applications of WSN may involve placing the nodes in a linear topology, constituting a special class of networks called Linear Networks. Such networks are being used in smart cities to collect data from roads and highways. Additionally, in a densely deployed linear network case, issues related to optimal resource allocation and networking may persist because the standard network protocols attempt to manage the network as a mesh or an ad hoc infrastructure. In this paper, we present an optimal topology generation (OpToGen) framework that uses Genetic Algorithm (GA) to configure and deploy a heterogeneous wireless network for linear infrastructures. OpToGen framework is scalable to multiple tiers and the use of GA results in less computational overhead and fast convergence to optimal topologies that are verified by a discrete event simulator.
Computer Vision
Computer Vision 1 (Friday, September 8, 11:10 - 12:30)
Location: Ballroom, 1st Floor
Chair: Radu Dănescu (Technical University of Cluj-Napoca, Romania)
Co-Chair: Robert Varga (Technical University of Cluj-Napoca, Romania)
11:10 - 11:30
Vehicle taillight detection and tracking using deep learning and thresholding for candidate generation
Flaviu Ionut Vancea, Arthur Daniel Costea, Sergiu Nedevschi (Technical University of Cluj-Napoca, Romania)
Abstract: Vehicle taillights detection is an important topic in collision avoidance and in the field of autonomous vehicles. Analyzing the behavior of the front vehicle can prevent possible accidents. In this paper, a method for detecting vehicle taillights is presented. First, the system detects vehicles and then searches for candidate taillight pairs inside the obtained vehicles. Two methods for detecting candidate regions are presented. The first method uses explicit thresholds to extract red regions and the second method uses deep learning to segment taillights. Extracted candidates are then paired by comparing their sizes and centroid heights. Bhattacharyya coefficient is also used to validate taillight pairs by comparing their histograms. The system uses Kalman filtering to track detected taillights over time and to compensate for false negatives. The proposed solution is evaluated using the KITTI dataset.
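The Bhattacharyya coefficient used above to validate a taillight pair measures the overlap of two normalized histograms; a generic numpy version (not the authors' implementation) looks like this:

    # Illustrative only: Bhattacharyya coefficient between two normalized histograms.
    # A value close to 1 means the two taillight candidates have very similar content.
    import numpy as np

    def bhattacharyya_coefficient(h1, h2):
        h1 = np.asarray(h1, dtype=float); h1 /= h1.sum()
        h2 = np.asarray(h2, dtype=float); h2 /= h2.sum()
        return float(np.sum(np.sqrt(h1 * h2)))

    rng0, rng1 = np.random.default_rng(0), np.random.default_rng(1)
    left = np.histogram(rng0.normal(0.8, 0.1, 500), bins=32, range=(0, 1))[0]
    right = np.histogram(rng1.normal(0.8, 0.1, 500), bins=32, range=(0, 1))[0]
    print(bhattacharyya_coefficient(left, right))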
11:30 - 11:50
Automatic extrinsic camera parameters calibration using Convolutional Neural Networks
Razvan Itu, Diana Borza, Radu Danescu (Technical University of Cluj-Napoca, Romania)
Abstract: Camera calibration is essential for accurate computer vision, and automatic calibration of some extrinsic parameters is needed in case the camera is placed on a mobile platform. The pitch and yaw angles, which are the most likely ones to change as the vehicle moves, can be inferred from the image coordinates of the vanishing point (VP). In this paper we present an artificial neural network approach for detecting the vanishing point position in road traffic scenarios. The network is trained using 2500 images which are first automatically annotated using a classical vanishing point detection algorithm, and then manually validated. The training and test datasets are made publicly available. The trained network was tested on more than 250 images not previously seen by the network, locating the VP accurately in more than 90% of the cases.
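For context, under a pinhole camera model with zero roll, the pitch and yaw mentioned above are commonly recovered from the vanishing point coordinates (u_{vp}, v_{vp}) and the intrinsics (f_x, f_y, c_x, c_y) as follows (a standard approximation, up to sign conventions; not necessarily the exact formulation used in the paper):

    \mathrm{yaw} = \arctan\!\left( \frac{u_{vp} - c_x}{f_x} \right), \qquad
    \mathrm{pitch} = \arctan\!\left( \frac{c_y - v_{vp}}{f_y} \right)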
11:50 - 12:10
A Facial Recognition Application Based on Incremental Supervised Learning
Elena-Roxana Buhuș, Lacrimioara Grama (Technical University of Cluj-Napoca, Romania), Catalina Șerbu (Babeș-Bolyai University, Romania)
Abstract: Facial recognition applications are of great interest in the area of computer vision, with various methods and approaches providing impressive performance. However, not all studies investigate the possibility of using proper feature extraction methods with efficient classifiers for applications where facial expression is not required for detection. In this sense, we propose a facial recognition application based on Local Binary Patterns, or the fusion of Local Binary Patterns and the Discrete Cosine Transform, for feature extraction, with a classifier based on a Simplified Fuzzy Adaptive Resonance Theory Map neural network. Experimental results on two open source face databases (AT&T, Extended Yale B) show that the new approach achieves promising results.
12:10 - 12:30
Automatic gender recognition for “in the wild” facial images using convolutional neural networks
Sergiu Cosmin Nistor, Alexandra-Cristina Marina, Adrian Sergiu Darabant (Babeș-Bolyai University, Romania), Diana Borza (Technical University of Cluj-Napoca, Romania)
Abstract: Automatic recognition of human demographic attributes has implications in a variety of domains, such as surveillance systems, human computer interaction, marketing etc. In this paper, we present an automatic gender recognition method from facial images based on convolutional neural networks. In order to train the network, we merged together several face databases and also gathered and annotated ~70,000 facial images from the internet. We trained, evaluated and compared several network architectures that have achieved impressive results on other computer vision tasks. The best accuracy is obtained using the Inception-v4 network: 98.2% on our dataset, and 84% on the Adience dataset.
Special Session: Automated Driving (Friday, September 8, 13:40 - 15:00)
Location: Ballroom, 1st Floor
Chair: Pasi Pyykönen (VTT Technical Research Centre of Finland, Finland)
Co-Chair: Ion Giosan (Technical University of Cluj-Napoca, Romania)
13:40 - 14:00
Online Cross-Calibration of Camera and LIDAR
Bianca-Cerasela-Zelia Blaga, Sergiu Nedevschi (Technical University of Cluj-Napoca, Romania)
Abstract: In an autonomous driving system, drift can affect the sensor’s position, introducing errors in the extrinsic calibration. For this reason, we have developed a method which continuously monitors two sensors, camera, and LIDAR with 16 beams, and adjusts the value of their cross-calibration. Our algorithm, starting from correct values of the extrinsic cross-calibration parameters, can detect small sensor drift during vehicle driving, by overlapping the edges from the LIDAR over the edges from the image. The novelty of our method is that in order to obtain edges, we create a range image and filter the data from the 3D point cloud, and we use distance transform on 2D images to find edges. Another improvement we bring is applying motion correction on laser scanner data to remove distortions that appear during vehicle motion. An optimization problem on the 6 calibration parameters is defined, from which we are able to obtain the best value of the cross-calibration, and readjust it automatically. Our system performs successfully in real time, in a wide variety of scenarios, and is not affected by the speed of the car.
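The edge-alignment idea above can be illustrated with a small OpenCV sketch: the distance transform of the inverted image edge map gives, for each projected LIDAR edge point, its distance to the nearest image edge, and the summed distance is the cost minimized over the six extrinsic parameters (projection and optimizer details omitted; the function and variable names are ours, not the authors'):

    # Illustrative only: misalignment cost between image edges and projected LIDAR edges.
    import cv2
    import numpy as np

    def misalignment_cost(gray_image, projected_lidar_edges_uv):
        edges = cv2.Canny(gray_image, 50, 150)
        dist = cv2.distanceTransform(255 - edges, cv2.DIST_L2, 3)  # distance to nearest edge
        u = np.clip(projected_lidar_edges_uv[:, 0].astype(int), 0, dist.shape[1] - 1)
        v = np.clip(projected_lidar_edges_uv[:, 1].astype(int), 0, dist.shape[0] - 1)
        return float(dist[v, u].sum())

    img = np.zeros((120, 160), np.uint8)
    cv2.rectangle(img, (40, 30), (120, 90), 255, -1)               # fake scene with one box
    pts = np.array([[40, 30], [120, 90], [80, 30]])                # fake projected LIDAR edges
    print(misalignment_cost(img, pts))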
14:00 - 14:20
Spectral attenuation in low visibility artificial fog - Experimental study and comparison to literature models
Aki Mäyrä, Eero Hietala, Matti Kutila, Pasi Pyykönen (VTT Technical Research Centre of Finland, Finland)
Abstract: The ECSEL joint undertaking RobustSENSE focuses on technologies and solutions for automated driving in adverse weather conditions. One of the main technology challenges is to improve laser scanner performance in fog where the existing 905 nm LIDAR reliability degrades below tolerances. This report briefly summarizes the results of experimental fog absorbance measurements, which were conducted in test laboratories located in VTT’s premises. The content of the presentation will focus on spectral absorbance at low visibility artificial fog in near infrared band.
14:20 - 14:40
An approach for segmenting 3D LiDAR data using multi-volume grid structures
Selma Goga, Sergiu Nedevschi (Technical University of Cluj-Napoca, Romania)
Abstract: This paper proposes a novel approach for segmenting and space partitioning data of sparse 3D LiDAR point clouds for autonomous driving tasks in urban environments. Our main focus is building a compact data representation which provides enough information for an accurate segmentation algorithm. We propose the use of an extension of elevation maps for automotive driving perception tasks which is capable of dealing with both protruding and hanging objects found in urban scenes like bridges, hanging road barrier, traffic tunnels, tree branches over road surface, and so on. For this we use a Multi-Volume grid representation of the environment. We apply a fast primary classifier in order to label the surface volumes as being part of the ground segment or of an object segment. Segmentation is performed on the object labeled data which is previously connected in a spatial graph structure using a height overlapping criterion. A comparison between the proposed method and the popular connected-components based segmentation method applied on an Elevation Map is performed in the end.
14:40 - 15:00
Real-Time Object Detection Using a Sparse 4-Layer LIDAR
Mircea Paul Muresan, Sergiu Nedevschi, Ion Giosan (Technical University of Cluj-Napoca, Romania)
Abstract: The robust detection of obstacles, on a given road path by vehicles equipped with range measurement devices represents a requirement for many research fields including autonomous driving and advanced driving assistance systems. One particular sensor system used for measurement tasks, due to its known accuracy, is the LIDAR (Light Detection and Ranging). The commercial price and computational intensiveness of such systems generally increase with the number of scanning layers. For this reason, in this paper, a novel six step based obstacle detection approach using a 4-layer LIDAR is presented. In the proposed pipeline we tackle the problem of data correction and temporal point cloud fusion and we present an original method for detecting obstacles using a combination between a polar histogram and an elevation grid. The results have been validated by using objects provided from other range measurement sensors.
Computer Vision 3 (Friday, September 8, 15:15 - 16:35)
Location: Ballroom, 1st Floor
Chair: Tiberiu Marița (Technical University of Cluj-Napoca, Romania)
Co-Chair: Mihai Negru (Technical University of Cluj-Napoca, Romania)
15:15 - 15:35
Lazy Feature Extraction and Boosted Classifiers for Object Detection
Robert Varga, Sergiu Nedevschi (Technical University of Cluj-Napoca, Romania)
Abstract: We present an optimization technique for general object detection and an algorithm for training decision trees. By delaying the calculation of the features as late as possible we drastically reduce the execution time. At detection we alternate between evaluating the necessary features and eliminating candidates. This enables us to have both a rich pool of features and a powerful classifier while keeping the execution time low. We test our approach on popular pedestrian detection benchmarks and obtain good results while improving on the baseline method. Our approach improves on the original method and is capable of running at more than 30 fps on 640x480 resolution images using only one CPU core.
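A toy illustration of the lazy-evaluation idea described above: a detection window computes a feature only when a classifier stage first requests it and caches the value for later stages (the feature functions, thresholds and rejection logic are placeholders, not the paper's channel features or trained cascade):

    # Illustrative only: features are computed on demand and cached per detection window,
    # so candidates rejected early never pay for the remaining features.
    class LazyWindow:
        def __init__(self, pixels, feature_bank):
            self.pixels = pixels
            self.feature_bank = feature_bank      # dict: feature id -> function(pixels)
            self.cache = {}

        def feature(self, fid):
            if fid not in self.cache:             # compute on first use only
                self.cache[fid] = self.feature_bank[fid](self.pixels)
            return self.cache[fid]

    def cascade_score(window, stages):
        """Each stage: (list of (feature id, threshold, weight), rejection threshold)."""
        score = 0.0
        for stage, reject_below in stages:
            score += sum(w for fid, thr, w in stage if window.feature(fid) > thr)
            if score < reject_below:
                return None                       # rejected before touching later features
        return score

    bank = {"mean": lambda p: sum(p) / len(p), "maximum": max}
    stages = [([("mean", 0.2, 1.0)], 0.5), ([("maximum", 0.9, 2.0)], 1.5)]
    print(cascade_score(LazyWindow([0.1, 0.4, 0.95], bank), stages))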
15:35 - 15:55
Insatiate Boosted Forest: Towards Data Exploitation in Object Detection
Farzin Ghorban (Bergischen Universität Wuppertal, Germany), Yu Su (Delphi Deutschland GmbH, Germany), Mirko Meuter (Delphi Deutschland GmbH, Germany), Anton Kummert (Bergischen Universität Wuppertal, Germany)
Abstract: Boosted forest (BF) is a commonly used method for object detection. With the help of cascade strategy, it can efficiently reject non-object windows and finally, combined with sliding window paradigm, give the locations of target objects in an image. In the literature, many aspects of cascaded boosted forest (CBF) have been well studied, such as image representation, tree split and cascade structure. Although it has been extensively investigated, CBF is still not saturated. In this work, we demonstrate by a series of experiments that the performance of a CBF-based object detector can be significantly improved by careful data exploitation. Specifically, we use a simple yet efficient approach to collect more training samples with high quality. We show that the trained CBF-based detector can be included in the data collection loop to provide better training samples. Our experiments are conducted on two challenging pedestrian detection benchmarks: Caltech and Kitti. Taking Caltech for example, we get the best balance between performance and run-time: 17.2% miss rate (MR) while running at 11 frames per second (FPS) on a moderate CPU. Compared with state-of-the-art CBF-based detectors our method gives better or similar performance while running significantly faster.
15:55 - 16:15
Experimentation of Vision Algorithm Performance using Custom OpenCL™ Vector Language Extensions for a Graphical Accelerator with Vector Architecture
Bogdan Ditu (NXP Semiconductors, Romania), Fred Peterson (NXP Semiconductors, USA), Ciprian Arbone (NXP Semiconductors, Romania)
Abstract: OpenCL is a standard that supports a parallel programming paradigm which enables heterogeneous multi-core systems and also offers a high level of portability for the application. Some of the systems that are used with OpenCL may have vector capabilities at the device compute unit level. There are several ways the vector capabilities could be exploited by the OpenCL device application, the most common one being to automatically enhance vector execution by using auto-vectorization of the aggregated device code. In most cases this is probably the ideal situation, but there are certain situations which require the device application to be written in a vector fashion. These situations can be related to the increased control that the user might require over the execution of the code in a vector manner, including increased efficiency of vector memory handling. To fulfill this need, we came up with the idea of defining a language extension which we call the OpenCL Vector Language Extension. This extension allows writing OpenCL applications in a vector manner without concern for the vector details of the targeted device, by simply specifying to the compiler which pieces of code can benefit from vector execution, as well as explicitly controlling the vector memory handling. The purpose of this paper is to briefly present the OpenCL Vector Language Extension (a non-standard extension we bring to the OpenCL paradigm), the implications of this extension for the OpenCL application, and the steps to define or translate an existing OpenCL application toward the vector version. More importantly, the paper presents the experimentation we conducted for the performance evaluation of different configurations of an OpenCL vision application using the vector extensions, running on a graphical accelerator with vector architecture.
|
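The extension itself is non-standard and its syntax is not reproduced here; the sketch below only illustrates the baseline it builds on, namely writing a kernel explicitly against OpenCL vector types (float4) so that one work-item processes four elements. It assumes PyOpenCL and a device that supports float4; kernel and variable names are illustrative:

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

src = """
__kernel void saxpy4(__global const float4 *x,
                     __global const float4 *y,
                     __global float4 *out,
                     const float a)
{
    int gid = get_global_id(0);
    out[gid] = a * x[gid] + y[gid];   /* one work-item handles 4 floats */
}
"""

n = 1024  # total number of floats, multiple of 4
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)

x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=y)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, x.nbytes)

prg = cl.Program(ctx, src).build()
prg.saxpy4(queue, (n // 4,), None, x_buf, y_buf, out_buf, np.float32(2.0))

out = np.empty_like(x)
cl.enqueue_copy(queue, out, out_buf)
```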
16:15 - 16:35 |
GPU memory leveraged for accelerated training using Tensorflow View Paper |
Dan-Georgian Marculeț Dragoș Teodor Gavriluț Razvan Benchea |
"Al. I. Cuza" University of Iași and Bitdefender Lab, Romania "Al. I. Cuza" University of Iași and Bitdefender Lab, Romania "Al. I. Cuza" University of Iași and Bitdefender Lab, Romania |
Abstract: Machine learning has been a detection technique used by many security vendors for some time now. With the enhancement brought by GPUs, many security products can now use different deep learning methods and forms of neural networks for malware classification. However, these new methods, as powerful as they are, are also limited by the amount of memory a GPU has or by the constant need to transfer data from CPU to GPU. As training the models used in the security industry requires very large databases, consisting of millions of malicious and benign samples, security vendors had to look for ways to overcome memory constraints. This paper addresses this problem and presents some approaches that can be used when dealing with deep learning algorithms in conjunction with large databases, approaches that are adapted to different known machine learning frameworks such as Theano or Tensorflow. The results obtained show that training time can be reduced by a factor of 30 if memory is used efficiently.
|
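A common way to keep a training set far larger than GPU memory flowing to the device is to stream it from disk in batches while the GPU computes. The sketch below is a present-day TensorFlow 2 / tf.data version of that streaming idea, not the authors' implementation; the generator synthesizes samples in place of the on-disk reads a real pipeline would perform:

```python
import numpy as np
import tensorflow as tf

N_SAMPLES, N_FEATURES = 100_000, 1024   # stand-in dataset size

def sample_generator():
    # In practice each sample would be read from disk (or a database) here, so
    # the full training set never has to fit in host or GPU memory at once.
    rng = np.random.default_rng(0)
    for _ in range(N_SAMPLES):
        yield rng.standard_normal(N_FEATURES).astype(np.float32), np.int32(rng.integers(0, 2))

dataset = (tf.data.Dataset.from_generator(
               sample_generator,
               output_signature=(tf.TensorSpec((N_FEATURES,), tf.float32),
                                 tf.TensorSpec((), tf.int32)))
           .shuffle(10_000)
           .batch(256)
           .prefetch(tf.data.AUTOTUNE))   # overlap CPU loading with GPU compute

model = tf.keras.Sequential([
    tf.keras.layers.Dense(512, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(dataset, epochs=1)
```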
Computer Vision 4 (Friday, September 8, 16:50 - 18:10)
Location: Ballroom, 1st Floor
Chair: Florin Oniga Co-Chair: Raluca Brehar |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
16:50 - 17:10 |
Real-Time Micro-Expression Detection From High Speed Cameras View Paper |
Diana Borza Razvan Itu Radu Danescu |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
Abstract: This work presents an original real-time, robust micro-expression detection algorithm. The algorithm analyses the movement changes that occur around the most prominent facial regions using two absolute frame differences. Next, a machine learning algorithm is used to predict whether a micro-expression occurred at a given frame t. Two classifiers were evaluated: a decision tree and a random forest classifier. The robustness of the proposed solution is increased by further processing the preliminary predictions of the classifier: the appropriate predicted micro-expression intervals are merged together and the intervals that are too short are filtered out. The proposed solution achieved an 86.95% true positive rate on the CASME2 dataset. The mean execution time of the proposed solution on 640x480 images is 9 milliseconds.
|
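A minimal sketch of the two-absolute-frame-difference measurement around one facial region, assuming OpenCV and grayscale frames; the ROI coordinates and the downstream classifier are placeholders, not the paper's exact feature set:

```python
import cv2
import numpy as np

def motion_features(prev_gray, cur_gray, next_gray, roi):
    """Mean absolute differences before and after frame t, inside one facial ROI."""
    x, y, w, h = roi
    p, c, n = (img[y:y + h, x:x + w] for img in (prev_gray, cur_gray, next_gray))
    d_before = cv2.absdiff(c, p)
    d_after = cv2.absdiff(n, c)
    return np.array([d_before.mean(), d_after.mean()], dtype=np.float32)

# Per-frame feature vectors (one ROI per prominent facial region) would then be
# fed to a decision tree / random forest, and the predicted micro-expression
# intervals merged and length-filtered as described in the abstract.
```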
17:10 - 17:30 |
Animal Detection from Traffic Scenarios Based on Monocular Color Vision View Paper |
György Jaskó Ion Giosan Sergiu Nedevschi |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
Abstract: This paper presents a system capable of detecting various large-sized wild animals in traffic scenes. Visual data is obtained from a monocular color camera. The goal is to analyze the traffic scene image, locate the regions of interest and correctly classify them in order to find animals that are on the road and might cause an accident. A saliency map is generated from the traffic scene image, based on intensity, color and orientation features. The salient regions of this map are considered regions of interest. A database is compiled from a large number of images containing different four-legged wild animals. Relevant features are extracted from these and are used to train Support Vector Machine classifiers. These classifiers provide an accuracy above 90% and are used to predict whether or not the selected regions of interest contain animals. If one of the regions is classified as containing an animal, a warning can be signaled.
|
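A minimal sketch of the region-of-interest classification stage, assuming HOG descriptors from scikit-image and an SVM from scikit-learn; the saliency-based ROI extraction and the actual feature set used in the paper are outside this sketch, and the training crops are assumed to be available:

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import SVC

def describe(patch_rgb, size=(64, 64)):
    """HOG descriptor of one candidate region, resized to a fixed size."""
    gray = resize(patch_rgb, size, anti_aliasing=True).mean(axis=-1)
    return hog(gray, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def train_animal_classifier(train_patches, train_labels):
    """train_patches: crops of animals vs. background (assumed precollected)."""
    X = np.array([describe(p) for p in train_patches])
    return SVC(kernel="rbf", probability=True).fit(X, train_labels)

def contains_animal(clf, roi_patch, threshold=0.5):
    """Decide whether one salient region of interest contains an animal."""
    return clf.predict_proba([describe(roi_patch)])[0, 1] > threshold
```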
17:30 - 17:50 |
An Extremely Fast Pattern Based Line Detector View Paper |
Ibrahim Cem Baykal Ismail Can Yilmaz |
Adana Science and Technology University, Turkey Çukurova University, Turkey |
Abstract: This article describes a completely new, fully automatic line detector algorithm that takes advantage of look-up tables to recognize and fit straight line patterns. The algorithm first recognizes any possible 4x4 pixel line patterns among the binary edge pixels and then uses several small look-up tables to decide whether the connected patterns form a line or not. It is designed for real-time processing of high-resolution images such as those used in computer vision applications. The algorithm’s system-on-chip implementation is even cheaper and faster, making it especially valuable for battery-operated mobile robot applications. Based on the benchmarks, at the time of writing, this algorithm is the fastest line detector in the literature.
|
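The core trick of recognizing 4x4 binary edge patterns through a look-up table can be sketched as follows; the two patterns marked as "line-like" are arbitrary examples for illustration, not the tables used by the authors:

```python
import numpy as np

def patch_index(patch4x4):
    """Pack a 4x4 binary edge patch (row-major) into a 16-bit LUT index."""
    idx = 0
    for bit in patch4x4.ravel():
        idx = (idx << 1) | int(bit)
    return idx

# LUT over all 2^16 possible 4x4 patterns; True = "looks like a line segment".
# Only two hand-picked patterns are marked here, purely for illustration.
LINE_LUT = np.zeros(1 << 16, dtype=bool)
horizontal = np.zeros((4, 4), dtype=np.uint8); horizontal[1, :] = 1
diagonal = np.eye(4, dtype=np.uint8)
for pattern in (horizontal, diagonal):
    LINE_LUT[patch_index(pattern)] = True

def is_line_patch(edge_image, row, col):
    """O(1) pattern test for the 4x4 window whose top-left corner is (row, col)."""
    return bool(LINE_LUT[patch_index(edge_image[row:row + 4, col:col + 4])])
```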
17:50 - 18:10 |
Automated Prototype for Asteroids Detection View Paper |
Denisa Copândean Ovidiu Văduvescu Dorian Gorgan |
Technical University of Cluj-Napoca, Romania Instituto de Astrofísica de Canarias, Spain Technical University of Cluj-Napoca, Romania |
Abstract: Near Earth Asteroids (NEAs) are discovered daily, mainly by a few major surveys; nevertheless, many of them remain unobserved for years, even decades. Even so, there is room for new discoveries, including those submitted by smaller projects and amateur astronomers. Besides the well-known surveys that have their own automated systems for asteroid detection, there are only a few software solutions designed to help amateurs and mini-surveys in NEA discovery. Some of these obtain their results based on the blink method, in which a set of reduced images are shown one after another and the astronomer has to visually detect real moving objects in the series. This technique becomes harder as the size of the CCD cameras increases. Aiming to replace manual detection, we propose an automated pipeline prototype for asteroid detection, written in Python under Linux, which calls third-party astrophysics libraries.
|
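A minimal sketch of the source-extraction step such a Python pipeline might start from, assuming the astropy and sep packages (which may differ from the third-party libraries actually used by the authors); moving-object candidates would then be found by cross-matching the per-frame source lists:

```python
import numpy as np
import sep
from astropy.io import fits

def extract_sources(fits_path, threshold_sigma=1.5):
    """Detect point sources in one reduced CCD frame."""
    data = fits.getdata(fits_path).astype(np.float32)
    bkg = sep.Background(data)                              # spatially varying background
    sources = sep.extract(data - bkg.back(), threshold_sigma, err=bkg.globalrms)
    return np.column_stack([sources["x"], sources["y"], sources["flux"]])

# For a series of frames taken minutes apart, sources that appear displaced by a
# roughly constant step from frame to frame are asteroid candidates, replacing
# the manual "blink" inspection described in the abstract.
```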
Computer Vision 5 (Saturday, September 9, 09:00 - 10:20)
Location: Beijing Room, 5th Floor
Chair: Cristian Vicaș Co-Chair: Anca Ciurte |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
09:00 - 09:20 |
Deep Convolutional Neural Nets for Objective Steatosis Detection from Liver Samples View Paper |
Cristian Vicas Ioana Rusu Nadim Al Hajjar Monica Lupșor-Platon |
Technical University of Cluj-Napoca, Romania Regional Institute of Gastroenterology and Hepatology, Romania Regional Institute of Gastroenterology and Hepatology, Romania Regional Institute of Gastroenterology and Hepatology, Romania |
Abstract: In this technical paper we describe an algorithm to detect steatosis. The gold standard in liver diagnosis is the biopsy. For clinical investigations the score given by a human expert is good enough, but developing noninvasive tools requires objective and reproducible measurements of the biopsy parameters. Two approaches are proposed here, one based on classical computer vision and another based on deep convolutional neural networks. Tests on 100 patients clearly show that the neural network approach is superior both in performance and in the amount of work that has to be invested. This paper falls in the area of semantic segmentation, although with recent advances in computer vision the line between segmentation and classification has become blurred.
|
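A minimal sketch of the kind of convolutional patch classifier the deep-learning branch could be built on, written with Keras; the architecture and patch size are illustrative assumptions, not the authors' network:

```python
import tensorflow as tf

def build_steatosis_patch_classifier(patch_size=64):
    """Tiny CNN that labels a tissue patch as steatotic (fat vacuoles) or not."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(patch_size, patch_size, 3)),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

model = build_steatosis_patch_classifier()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(patches, labels, ...) would be called with expert-annotated biopsy patches.
```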
09:20 - 09:40 |
Classification of EEG signals in an Object Recognition task View Paper |
Iacob D. Rus Paul Marc Mihaela Dinsoreanu Rodica Potolea Raul Cristian Muresan |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Center for Cognitive and Neuronal Studies, Romania |
Abstract: The main objective of this paper is the time-frequency analysis of the EEG signal captured during a cognitive task (i.e. object recognition) performed by human subjects. We investigate whether the power spectral density in the gamma frequency range can be used to classify the outcome of the object recognition task (i.e. seen, unseen, uncertain). The EEG signals were acquired and analyzed from 128 electrodes covering the entire scalp. Power spectral density features in the gamma band are extracted and used for classification with support vector machine (SVM), K-Nearest Neighbor (KNN) and Artificial Neural Network (ANN) classifiers. We tested the hypothesis that gamma EEG would be measurable and could contribute to the classification during the object recognition tasks.
|
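A minimal sketch of the gamma-band power-spectral-density feature extraction, assuming SciPy's Welch estimator; the sampling rate, band limits and window length are illustrative assumptions rather than the paper's settings:

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC

FS = 512              # assumed sampling rate (Hz)
GAMMA = (30.0, 80.0)  # assumed gamma band (Hz)

def gamma_psd_features(trial):
    """trial: (n_channels, n_samples) -> mean gamma-band PSD per channel."""
    freqs, psd = welch(trial, fs=FS, nperseg=256, axis=-1)
    band = (freqs >= GAMMA[0]) & (freqs <= GAMMA[1])
    return psd[:, band].mean(axis=-1)

def train_outcome_classifier(trials, labels):
    """trials: (n_trials, 128, n_samples); labels: seen / unseen / uncertain."""
    X = np.array([gamma_psd_features(t) for t in trials])
    return SVC(kernel="rbf").fit(X, labels)
```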
09:40 - 10:00 |
Towards Semantic Visual Features for Malignancy Description within Medical Images View Paper |
Abir Baâzaoui Walid Barhoumi Ezzeddine Zagrouba |
SIIVA_LIMTIC, Tunisia SIIVA_LIMTIC, Tunisia SIIVA_LIMTIC, Tunisia |
Abstract: The semantic gap, which is the difference between low-level image features and their high-level semantics, has received great interest in the last two decades. This paper deals with this problem and proposes a hybrid approach to learn image semantic concepts for modeling visual features in a discriminative learning stage. It combines the advantages of human-in-the-loop and discriminative semantic models. We investigate domain knowledge and expertise, obtained through an expert-in-the-loop, to determine medical-knowledge information. The semantic models aim to learn the correlations between low-level features and textual words in order to describe malignancy signs in terms of semantic visual descriptors. These descriptors are automatically generated from low-level image features by exploiting the concept-based clinical medical knowledge. Reported results on the Mammography Image Analysis Society (MIAS) database demonstrate the effectiveness of this work and its superior performance relative to the compared approaches.
|
10:00 - 10:20 |
A semiautomatic saliency model and its application to video compression View Paper |
Vitaliy Lyudvichenko Mikhail Erofeev Yury Gitman Dmitriy Vatolin |
Lomonosov Moscow State University, Russia Lomonosov Moscow State University, Russia Lomonosov Moscow State University, Russia Lomonosov Moscow State University, Russia |
Abstract: This work aims to apply visual-attention modeling to attention-based video compression. During our comparison we found that eye-tracking data collected even from a single observer outperforms existing automatic models by a significant margin. Therefore, we offer a semiautomatic approach: using computer-vision algorithms and good initial estimation of eye-tracking data from just one observer to produce high-quality saliency maps that are similar to multi-observer eye tracking and that are appropriate for practical applications. We propose a simple algorithm that is based on temporal coherence of the visual-attention distribution and requires eye tracking of just one observer. The results are as good as an average gaze map for two observers. While preparing the saliency-model comparison, we paid special attention to the quality-measurement procedure. We observe that many modern visual-attention models can be improved by applying simple transforms such as brightness adjustment and blending with the center-prior model. The novel quality-evaluation procedure that we propose is invariant to such transforms. To show the practical use of our semiautomatic approach, we developed a saliency-aware modification of the x264 video encoder and performed subjective and objective evaluations. The modified encoder can serve with any attention model and is publicly available.
|
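The "simple transforms" mentioned in the abstract (brightness adjustment and blending with a center-prior model) can be sketched as follows; the parameter values are illustrative, not those tuned in the paper:

```python
import numpy as np

def center_prior(h, w, sigma=0.3):
    """Isotropic Gaussian centered in the frame, a common saliency prior."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    d2 = ((ys - cy) / h) ** 2 + ((xs - cx) / w) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

def postprocess_saliency(sal, alpha=0.7, gamma=0.8):
    """Brightness (gamma) adjustment followed by blending with the center prior."""
    sal = sal / (sal.max() + 1e-8)
    sal = sal ** gamma
    prior = center_prior(*sal.shape)
    return alpha * sal + (1 - alpha) * prior
```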
Distributed Computing and Networking
Distributed Computing and Networking 1 (Friday, September 8, 11:10 - 12:30)
Location: Beijing Room, 5th Floor
Chair: Vasile Dădârlat Co-Chair: Bogdan Iancu |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
11:10 - 11:30 |
Evaluation of CTP and LEACH Protocols for Structural Health Monitoring Systems View Paper |
Amira Zrelli Tahar Ezzedine |
ENIT, Tunisia ENIT, Tunisia |
Abstract: To improve the lifetime of wireless sensor networks, we must choose an adequate routing protocol for the system. Many routing protocols can be used in structural health monitoring (SHM) systems. SHM systems are used to detect damage in civil structures; they are an emergent technology intended to protect human life. Routing protocols have a significant impact on these networks, so the routing protocol must be studied before setting up an SHM system. In this paper, we focus on hierarchical routing protocols, especially CTP (Collection Tree Protocol) and LEACH (Low-Energy Adaptive Clustering Hierarchy), which are the most used protocols in WSN-based SHM systems. In this work we study the performance of the CTP and LEACH protocols: we implement them under the Contiki operating system and use the Cooja simulator to compare their performance.
|
11:30 - 11:50 |
Analysis, design and implementation of secure LoRaWAN sensor networks View Paper |
Bogdan Oniga Vasile Dadarlat Eli De Poorter Adrian Munteanu |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Ghent University, Belgium Vrije Universiteit Brussel, Belgium |
Abstract: LoRaWAN is an LPWAN (Low Power Wide Area Network) technology used in a large variety of Internet of Things (IoT) applications. The paper addresses the security concerns of data protection and data privacy in sensor networks that make use of the LoRaWAN (Long-Range Wide Area Network) protocol specification. In this context, this paper performs an in-depth analysis of the security aspects of LoRaWAN sensor networks. Additionally, a novel secure LoRaWAN network architecture is proposed. The architecture provides protected data transmission and prevents unauthorized access and data loss. Based on experimental results and security tests, recommendations are made concerning the best practices to be followed in order to provide secure data transmission and data privacy in applications built using the LoRaWAN protocol specification.
|
11:50 - 12:10 |
Energy-Aware Cooperative Localization Approach for Wireless Sensor Networks View Paper |
Badia Bouhdid Wafa Akkari Abdelfettah Belghith |
University of Manouba, Tunisia University of Manouba, Tunisia King Saud University, Saudi Arabia |
Abstract: Although cooperative localization approaches perform well in Wireless Sensor Networks (WSNs), they face the challenge of increased energy consumption resulting from the significant communication overhead required to accomplish the localization task. In this paper, we develop an Energy Aware Cooperative Localization approach (EACL) based on a recursive localization system, in which coverage increases iteratively as nodes with newly estimated positions join the reference set. Our approach implements a novel selection strategy for reference nodes that is beneficial in two ways: (1) it reduces the adverse effects of error propagation and accumulation, and (2) it helps conserve the residual energy of the sensor nodes. Furthermore, we introduce a refinement mechanism to further improve localization accuracy without incurring any additional costs. Simulations show that our approach consistently reduces the position error while conserving energy and consequently prolonging the WSN lifetime.
|
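The basic position-estimation step that recursive cooperative schemes repeat, with each newly localized node then joining the reference set, is ordinary multilateration. A minimal linearized least-squares sketch in NumPy (this is the generic building block, not EACL's reference-node selection or refinement logic):

```python
import numpy as np

def estimate_position(anchors, distances):
    """
    Linearized least-squares multilateration: 'anchors' is an (n, 2) array of
    reference-node positions and 'distances' holds the measured range to each.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[0]
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - x0 ** 2 - y0 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: four reference nodes and noiseless ranges to the point (3, 4).
anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
true = np.array([3.0, 4.0])
ranges = [np.hypot(*(true - a)) for a in anchors]
print(estimate_position(anchors, ranges))   # approximately [3. 4.]
```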
12:10 - 12:30 |
Efficient Localization Approach for large scale Wireless Sensor Networks View Paper |
Badia Bouhdid Wafa Akkari Abdelfettah Belghith |
University of Manouba, Tunisia University of Manouba, Tunisia King Saud University, Saudi Arabia |
Abstract: To obtain a trade-off between location accuracy and implementation cost, recursive localization schemes are being pursued as a cost-effective alternative to more expensive localization approaches: localization information increases progressively as new nodes compute their positions and become reference nodes themselves. A strategy is then required to control and maintain the distribution of these new reference nodes. The lack of such a strategy leads, especially in large-scale networks, to wasted energy and significant communication overhead, and even impacts the localization accuracy. In this paper, we propose an efficient recursive localization approach that reduces the energy consumption, the execution time, and the communication overhead, yet increases the localization accuracy through an adequate distribution of reference nodes within the network.
|
Special Session: HiPerGRID (Friday, September 8, 13:40 - 15:00)
Location: Beijing Room, 5th Floor
Chair: Florin Pop Co-Chair: Dorian Gorgan |
University Politehnica of Bucharest, Romania Technical University of Cluj-Napoca, Romania |
13:40 - 14:00 |
Community Engagement in Water Resources Planning Using Serious Gaming View Paper |
Marian Muste Andrea Carson Haowen Xu Mariana Mocanu |
University of Iowa, USA Institute for Water Resources, US Army Corps of Engineers, USA University of Iowa, USA University Politehnica of Bucharest, Romania |
Abstract: The necessity to optimize the use of water resources and to raise the awareness of different categories of stakeholders to competing user demands requires the development of systems analyses involving big-data situations. The paper presents a virtual problem-solving environment aimed at engaging individual citizens and communities in decision making using a game-like approach. A web-based environment was developed and successfully used for delivering the game into the community. The web platform covers the technical aspects of multi-hazard mitigation planning (including visualization of the selected scenarios) as well as the mechanics of game delivery (instructions for platform usage, compilation of scores, etc.).
|
14:00 - 14:20 |
SWAT model calibration over Cloud infrastructures using the BigEarth platform View Paper |
Victor Bacu Constantin Nandra Teodor Stefanut Dorian Gorgan |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
Abstract: SWAT is a well-known hydrological model used to simulate the influence of land use and land practices on the quality and quantity of water in the simulated watershed. Calibration of complex SWAT models is a time-consuming operation that requires substantial computational resources. This paper presents a software solution, based on Cloud infrastructures, that optimizes the execution time of this operation by running multiple SWAT model simulations in parallel. This functionality is exposed through a web-based application, gSWATCloud, which allows users to control multiple SWAT models and multiple running sessions from a simple laptop or PC, without requiring technical knowledge of the computing infrastructure. The paper highlights the modules that make up the proposed solution. The experiments performed on different SWAT models and different computational resources prove the benefits brought by this distributed approach to calibrating SWAT models.
|
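The core of such a calibration is running many independent SWAT simulations with different parameter sets and keeping the best-scoring one. A minimal local sketch using multiprocessing; run_swat is a hypothetical stand-in for writing the parameter files, executing the SWAT binary and scoring its output, and the BigEarth platform would dispatch these runs to Cloud workers instead of local processes:

```python
from itertools import product
from multiprocessing import Pool

def run_swat(params):
    """Hypothetical single SWAT run: returns a goodness-of-fit score (e.g. NSE)."""
    cn2, esco, sol_awc = params
    # Real code would write the parameter files, execute the SWAT binary and
    # compare simulated vs. observed discharge; here we return a dummy score.
    return -((cn2 - 60) ** 2 + (esco - 0.5) ** 2 + (sol_awc - 0.15) ** 2)

param_grid = list(product((35, 55, 75),          # CN2 candidates
                          (0.2, 0.5, 0.8),       # ESCO candidates
                          (0.05, 0.15, 0.25)))   # SOL_AWC candidates

if __name__ == "__main__":
    with Pool(processes=8) as pool:              # one SWAT run per worker
        scores = pool.map(run_swat, param_grid)
    best = max(zip(scores, param_grid))[1]
    print("best parameter set:", best)
```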
14:20 - 14:40 |
Software Workbench for Interactive, Time Critical and Highly self-adaptive Cloud applications View Paper |
George Suciu Victor Suciu Cristina Butca Ciprian Dobre Florin Pop |
University Politehnica of Bucharest, Romania University Politehnica of Bucharest, Romania BEIA Consult International, Romania University Politehnica of Bucharest, Romania University Politehnica of Bucharest, Romania |
Abstract: An elastic early warning system enables people and authorities to save lives and property in case of disasters. In case of floods, a warning issued early enough before the event allows reservoir operators to gradually reduce water levels, people to reinforce their homes, hospitals to prepare to receive more patients, and authorities to organize and provide help. This paper presents an elastic disaster early warning application that represents a “cloudified” early warning solution for natural disasters. The application collects data from real-time sensors, processes the information, and provides warning services for the public. The system should be capable of collecting and processing the sensor data in nearly real time, thus allowing a very rapid response to urgent events. The main contribution of this paper consists in the presentation of the communication center with cloud computing clients that will be integrated with the early warning system.
|
14:40 - 15:00 |
Elastic Stack in Action for Smart Cities: Making Sense of Big Data View Paper |
Andrei Talaș Florin Pop Gabriel Neagu |
University Politehnica of Bucharest, Romania University Politehnica of Bucharest, Romania National Institute for Research and Development in Informatics, Romania |
Abstract: Good decision support is a necessity nowadays, driven by the need to improve and to extract value. For any organization it is vital to obtain value from whatever it can, and having huge amounts of data, Big Data, pushes it to do so. But merely having Big Data is not enough; the most important thing is to use it intelligently in order to gain valuable decision support. Making sense of Big Data means being able to use all the data we have and extract important hints and directions from it. The challenge is to obtain this support as fast as possible, in real time where feasible, and the solution needs to be flexible enough to keep up with the evolution of information technology. Because of this evolution, and because of the ease of managing data with the Elastic Stack, we chose the Elastic Stack to manage Big Data. The process of making sense of Big Data is based on three main steps: collect, process and use. To carry out these steps, this paper proposes an Elastic Stack solution, also known as ELK (Elasticsearch, Logstash and Kibana), to easily and rapidly address the Big Data problem. The purpose of making sense of Big Data is to extract valuable information that can serve as decision support. To achieve that purpose, all components are used to process the data and analyze the results in order to support decision making.
|
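A minimal sketch of the "collect, process, use" loop against Elasticsearch, assuming the official elasticsearch Python client (8.x-style keyword arguments) and a local node; the index name and fields are illustrative, and in a full ELK deployment the collection step would typically be handled by Logstash or Beats:

```python
from datetime import datetime, timezone
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Collect: index one sensor reading.
es.index(index="city-sensors", document={
    "sensor_id": "air-42",
    "pm25": 37.5,
    "timestamp": datetime.now(timezone.utc).isoformat(),
})

# Use: query readings that exceed a threshold over the last hour, the kind of
# result a Kibana dashboard would visualize for decision support.
hits = es.search(index="city-sensors", query={
    "bool": {
        "must": [
            {"range": {"pm25": {"gte": 25}}},
            {"range": {"timestamp": {"gte": "now-1h"}}},
        ]
    }
})
print(hits["hits"]["total"])
```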
Distributed Computing and Networking 2 (Friday, September 8, 15:15 - 16:35)
Location: Beijing Room, 5th Floor
Chair: Lucian Vințan Co-Chair: Emil Cebuc |
"Lucian Blaga" University of Sibiu, Romania Technical University of Cluj-Napoca, Romania |
15:15 - 15:35 |
A Task Scheduling Algorithm for HPC Applications using Colored Stochastic Petri Net Models View Paper |
Ion Dan Mironescu Lucian Vințan |
"Lucian Blaga" University of Sibiu, Romania "Lucian Blaga" University of Sibiu, Romania |
Abstract: The increase in demand for High Performance Computing (HPC) scientific applications motivates the efforts to reduce the cost of running these applications. The problem to solve is the dynamic multi-criteria optimal scheduling of an application on an HPC platform with a large number of heterogeneous nodes. The solution proposed by the authors is an HPC hardware-software architecture that includes the infrastructure for two-level (node and inter-node) adaptive load balancing. The article presents the development of a Coloured Petri Net (CPN) model for such an architecture. The model was used to develop a dynamic distributed algorithm for the scheduling problem. The CPN allowed a holistic hardware-software formal verification and analysis, and some simple properties were formally proved. Simulations were performed to assess performance; the results were in the performance range of other load balancing algorithms, with significant benefits in reducing the optimization's complexity.
|
15:35 - 15:55 |
Cyber-Physical System for Assisted Living and Home Monitoring View Paper |
Maria Iuliana Bocicor David Cuesta Frau Ionut-Catalin Draghici Nicolae Goga Arthur-Jozsef Molnar Raúl Valor Pérez Andrei Vasilateanu |
SC Info World SRL, Romania Innovatec Sensing & Communication, Spain University Politehnica of Bucharest, Romania University Politehnica of Bucharest, Romania SC Info World SRL, Romania Innovatec Sensing & Communication, Spain University Politehnica of Bucharest, Romania |
Abstract: Assisted living and home monitoring systems are gradually becoming a necessity, considering the current trends in population ageing and older adults' desire to continue living independently in their homes and their communities for as long as possible. This paper presents our current achievements regarding the implementation of a cyber-physical system for assisted living and home monitoring, developed as part of a European Union-funded research project. Integrating a wireless network of smart, sensor-equipped hardware nodes for indoor localization and ambient monitoring, as well as both server and client side software components, the system offers basic activity recognition, decision support, supervision and real-time alerting capabilities. While briefly presenting the high-level architecture of our proposed solution, in this paper we focus our attention on the software components of the platform, which are portrayed in more detail. Using data acquired by the smart nodes along with data processing and analysis techniques, the system is able to pinpoint the location of monitored persons within the home, to provide reports about ambient conditions and basic activities of the supervised person and, most importantly, in case of potential danger, send real-time alerts to caregivers, allowing them to act immediately to ensure the safety of monitored persons.
|
15:55 - 16:15 |
Fault Tolerance Capability of Cloud Data Center View Paper |
Humphrey Emesowum Athanasios Paraskelidis Mo Adda |
University of Portsmouth, United Kingdom University of Portsmouth, United Kingdom University of Portsmouth, United Kingdom |
Abstract: In this era of big data and the Internet of Things, the need for performance improvement in the cloud data center is unavoidable. This has led to several designs of data center network topologies aiming at a data center capable of tolerating faults during multiple failures. In this paper, we propose improved variants of fat-tree interconnections to mitigate the challenges of fault tolerance. The availability of alternative paths for congestion control and fault tolerance gives the fat tree an edge over other data center architectures, making it a widely used architecture for data centers. Our focus is on client-to-server communications in a cloud data center network as explained in Fig. 7, hence simulations of an HTTP application were carried out on different variants of fat-tree designs. The simulation results with Riverbed showed that our proposed hybrid designs outperformed the single fat-tree designs as the number of link failures increases.
|
16:15 - 16:35 |
Coralcon: An Open Source Low-Cost Modem for Underwater IoT Applications View Paper |
Adil A. Sheikh Emad Felemban Adnan Ashraf |
Umm Al-Qura University, Saudi Arabia Umm Al-Qura University, Saudi Arabia Makkah Techno Valley Company, Saudi Arabia |
Abstract: Most underwater deployments rely on acoustics for communication, and the equipment for underwater communication is usually very expensive. Simulations of underwater systems are insufficient for many researchers. An open-source platform built with commercially available, low-cost hardware along with customizable software modules may prove very useful for researchers. This paper focuses on the details of Coralcon, a low-cost, open-source modem designed to be integrated with the underwater Internet of Things (IoT). We implemented Coralcon with a customizable, minimal software defined radio (SDR). Coralcon can be interfaced with other equipment to send and receive data reliably at rates up to 1000 bits per second with one antenna.
|
Distributed Computing and Networking 3 (Friday, September 8, 16:50 - 18:30)
Location: Beijing Room, 5th Floor
Chair: Gheorghe Sebestyen Co-Chair: Ionuț Anghel |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
16:50 - 17:10 |
Bayesian Analysis of Resource Allocation Policies in Data Centers in Terms of Virtual Machine Migrations View Paper |
Cora Crăciun Ioan Salomie |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
Abstract: One technique for addressing energy consumption and performance issues in virtualized data centers consists in migrating the virtual machines between the physical hosts in order to achieve either resource consolidation or load balancing. These migrations, however, may degrade the performance of the virtualized applications and of the servers and network involved in the migration process. A possible solution to reduce the number of virtual machine migrations is to use efficient resource allocation policies, which avoid resource over- or under-usage. In this context, we compare two recently proposed Gaussian-type policies with greedy consolidation methods in terms of virtual machine migrations. The outcomes of repeated simulation experiments are analyzed using Bayesian statistics. We assess different hierarchical Bayesian models to describe the number of virtual machine migrations, which enables us to weigh the average behavior of the resource allocation policies.
|
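As a rough illustration of the kind of Bayesian treatment of migration counts described above, here is a minimal PyMC sketch with toy data and a simple hierarchical prior over per-policy migration rates; the model structure, priors and data are illustrative assumptions, not the models compared in the paper:

```python
import numpy as np
import pymc as pm

# Toy data: migration counts from repeated simulations of two allocation policies.
rng = np.random.default_rng(1)
counts = np.stack([rng.poisson(18, size=30), rng.poisson(12, size=30)])

with pm.Model():
    # Shared hyperprior over policy-level migration rates (hierarchical structure).
    rate_mu = pm.Exponential("rate_mu", 1 / 15)
    rates = pm.Gamma("rates", alpha=2.0, beta=2.0 / rate_mu, shape=2)
    pm.Poisson("obs", mu=rates[:, None], observed=counts)
    trace = pm.sample(1000, tune=1000, chains=2, progressbar=False)

# The posterior of rates[0] - rates[1] then quantifies how much more one policy
# migrates than the other, averaged over the simulation noise.
```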
17:10 - 17:30 |
An Implementation of Loop Fusion for Improving Performance and Energy Consumption of Shared-Memory Parallel Codes View Paper |
Iulia Știrb |
Politehnica University of Timișoara, Romania |
Abstract: The state-of-the-art Low Level Virtual Machine (LLVM) compiler infrastructure has a dedicated set of optimizations for loops. Each optimization is organized as a separate pass in LLVM, and passes are created using a mix of object-creational patterns. However, the recent focus of modern compilers is on improving runtime performance using a large set of conservative optimizations, most often omitting the impact on energy consumption. This paper introduces the implementation of a new loop fusion algorithm designed for LLVM, which aims to improve both the runtime performance and the energy consumption of parallel codes involving loop parallelism. The proposed algorithm merges two independent loops that have the same number of iterations and no code between them. Two loops are dependent when the first loop has to finish before the second can start; even independent loops, however, may not always be allowed to execute in parallel. This paper also shows that loop fusion is useful in optimizing loop parallelism, since the fusion of two loops cuts in half the number of threads that would otherwise be required to execute each iteration (when there is a one-to-one relation between threads and iterations). The decreased number of threads reduces the parallelization overhead, which in turn improves the energy consumption. The improvements are discussed in the context of Non-Uniform Memory Access (NUMA) systems.
|
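Loop fusion itself is easy to picture outside the compiler: two independent loops with the same trip count and no code between them become one loop, so a hypothetical thread-per-iteration mapping needs only half as many threads. A schematic Python illustration of the before/after shapes (the LLVM pass of course operates on IR, not source code):

```python
import numpy as np

n = 1_000_000
a, b = np.empty(n), np.empty(n)

# Before fusion: two independent loops with identical trip counts.
for i in range(n):
    a[i] = i * 2.0
for i in range(n):
    b[i] = i + 1.0

# After fusion: one loop, one iteration space. With a one-to-one mapping of
# threads to iterations, the fused version needs half as many threads, which
# lowers the parallelization overhead (and, per the paper, energy consumption).
for i in range(n):
    a[i] = i * 2.0
    b[i] = i + 1.0
```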
17:30 - 17:50 |
Proactive Day-Ahead Data Center Operation Scheduling for Energy Efficiency. Solving a MIOCP using a Multi-Gene Genetic Algorithm View Paper |
Marcel Antal Claudia Pop Tudor Cioara Ionut Anghel Ionut Tamas Ioan Salomie |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
Abstract: This paper addresses the problem of Data Center (DC) energy efficiency by proposing a proactive optimization technique to schedule the day-ahead DC operation so as to minimize the operational cost. The proactive optimization technique is formalized as a Mixed Integer Optimal Control Problem, known to be NP-hard. Because the time needed to solve this problem with gradient-based solvers depends on the input data, an evolutionary-algorithm-based solver that computes an approximate solution in a constant number of steps is proposed. The proactive DC optimization technique is implemented both with the Lindo Lingo mathematical solver and with a genetic algorithm. Finally, the proposed solution is compared against the professional Lindo Lingo solver: it is able to compute an approximate solution in cases where the Lingo solver takes too long to determine the solution, and it shows an overall cost improvement of 5% for the data center's day-ahead operation, while the Lingo-based solver achieves only 3.3% cost savings on the evaluated scenarios.
|
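A minimal sketch of the genetic-algorithm solver idea: evolve candidate day-ahead schedules for a fixed number of generations, so the solve time is roughly constant regardless of the input. The encoding and cost function below are toy stand-ins, not the paper's MIOCP formulation or its multi-gene operators:

```python
import numpy as np

rng = np.random.default_rng(0)
HORIZON, POP, GENS = 24, 60, 200        # hourly slots, population size, generations

def cost(schedule, demand):
    """Toy operational cost: penalize unmet demand and over-provisioning."""
    return np.sum(np.abs(schedule - demand)) + 0.1 * np.sum(schedule)

def evolve(demand):
    pop = rng.uniform(0, demand.max(), size=(POP, HORIZON))
    for _ in range(GENS):                              # constant number of steps
        fitness = np.array([cost(ind, demand) for ind in pop])
        parents = pop[np.argsort(fitness)[:POP // 2]]  # selection: keep best half
        cut = rng.integers(1, HORIZON, size=POP // 2)  # one-point crossover
        children = np.array([np.concatenate([p[:c], q[c:]])
                             for p, q, c in zip(parents, np.roll(parents, 1, 0), cut)])
        children += rng.normal(0, 0.5, children.shape)  # mutation
        pop = np.vstack([parents, children])
    return pop[np.argmin([cost(ind, demand) for ind in pop])]

best_schedule = evolve(demand=rng.uniform(10, 50, HORIZON))
```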
17:50 - 18:10 |
Self-Adaptive Task Scheduler for Dynamic Allocation in Energy Efficient Data Centers View Paper |
Marcel Antal Adelina Burnete Claudia Pop Tudor Cioara Ionut Anghel Ioan Salomie |
Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania Technical University of Cluj-Napoca, Romania |
Abstract: This paper tackles the issue of Data Center (DC) energy efficiency by proposing a Self-Adaptive Task Scheduler that proactively allocates incoming tasks to physical servers, avoiding the need for consolidation and implicitly avoiding task migration from one server to another. The Self-Adaptive Task Scheduler is based on a MAPE architecture. The monitoring phase aggregates data about resource utilization, the analysis phase uses neural networks to predict future incoming tasks, while the planning stage consists of a genetic algorithm capable of solving a simplified mathematical programming representation of the well-known Dynamic Server Allocation Problem (DSAP). Finally, the execution stage consists of a real-time loop that matches incoming tasks to the predicted ones and allocates them according to the plan computed by the planning stage. A simulator was implemented and the proposed solution was compared against the well-known First-Fit-Decreasing (FFD) algorithm enhanced with a periodically triggered consolidation algorithm, on traces from a real Google datacenter, showing an energy reduction of up to 11% in the simulated scenarios.
|
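The First-Fit-Decreasing baseline the scheduler is compared against is simple to state. A minimal sketch, with tasks and servers reduced to a single CPU dimension as an illustrative simplification:

```python
def first_fit_decreasing(task_loads, server_capacity):
    """Assign tasks (CPU demands) to as few servers as possible using FFD."""
    servers = []                                   # remaining capacity per server
    placement = {}
    for task_id, load in sorted(enumerate(task_loads),
                                key=lambda t: t[1], reverse=True):
        for idx, free in enumerate(servers):       # first server that still fits
            if free >= load:
                servers[idx] -= load
                placement[task_id] = idx
                break
        else:                                      # no fit: power on a new server
            servers.append(server_capacity - load)
            placement[task_id] = len(servers) - 1
    return placement, len(servers)

placement, used = first_fit_decreasing([0.6, 0.3, 0.8, 0.2, 0.5], server_capacity=1.0)
print(placement, used)
```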
18:10 - 18:30 |
PAPR Reduction in OFDM with Active Constellation Extension and Hadamard Transform View Paper |
Lele Dang Hui Li Songyun Guo |
Northwestern Polytechnical University, China Northwestern Polytechnical University, China Northwestern Polytechnical University, China |
Abstract: Orthogonal frequency division multiplexing (OFDM) uses multiple equally spaced orthogonal subcarriers to carry transmit sequences, which gives it the advantages of flexible bandwidth configuration and good anti-multipath performance. For high data rates over wireless fading channels, OFDM is one of the most promising wireless technologies. However, a high peak-to-average power ratio (PAPR) can significantly reduce the bandwidth utilization and terminal efficiency of OFDM systems. In this paper, a Hadamard-transform PAPR suppression method based on Active Constellation Extension (ACE) is proposed. First, a matrix transformation exploiting the relationship between the PAPR of the OFDM signal and the autocorrelation function of the input data sequence lowers the upper bound of the PAPR; the PAPR is then further suppressed by the constellation extension method. This method suppresses the PAPR without introducing data distortion or reducing the data transfer rate. Computer simulations show that the proposed method reduces PAPR more effectively than the constellation extension method alone, which can improve the performance of OFDM in fifth-generation and future wireless communication.
|
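The effect of the Hadamard precoding step on PAPR can be sketched numerically; QPSK symbols, oversampling and the ACE iteration are omitted, and the parameters are illustrative rather than those used in the paper. Over many random OFDM symbols, the precoded distribution of PAPR values is what improves, so a single draw may go either way:

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)
N = 256                                   # number of subcarriers (power of two)

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Random QPSK symbols on the subcarriers.
X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)

plain = np.fft.ifft(X)                    # conventional OFDM time-domain symbol
H = hadamard(N) / np.sqrt(N)              # orthonormal Hadamard transform
precoded = np.fft.ifft(H @ X)             # Hadamard-precoded OFDM symbol

print(f"PAPR without precoding:       {papr_db(plain):.2f} dB")
print(f"PAPR with Hadamard precoding: {papr_db(precoded):.2f} dB")
# ACE would then push outer constellation points further outward to shave the
# remaining peaks without increasing the error rate.
```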