Proceedings of the International Multiconference on Computer Science and Information Technology
Volume 3
October 20–22, 2008. Wisła, Poland
ISSN 1896-7094
ISBN 978-83-60810-14-9
IEEE Catalog Number CFP0864E-CDR
Preface
Workshop on Agent Based Computing V
Agent architecture for building Robocode players with SWI-Prolog
agent architecture, agent environment, robocode, prolog. Vasile Alaiba, Armand Rotaru, pages 3–7. Effective study of autonomous agents is a challenging activity, for researchers and programmers alike. In this paper we introduce a new agent environment built on top of the tank-simulation game Robocode. We consider this game a good choice for testing agent programming strategies and an excellent learning opportunity. We integrated SWI-Prolog into Robocode and built an architecture for writing agents in Prolog. Game-generated events are translated by our framework into predicate invocations, allowing the agent to perceive the environment. An API for issuing commands to the tank from Prolog acts as the agent's effectors. We tested our architecture by implementing a simple autonomous agent that uses a static strategy.
Tackling Complexity of Distributed Systems: towards an integration of Service-Oriented Computing and Agent-Oriented Programming
Agent oriented programming, Service Oriented Computing. Giacomo Cabri, Letizia Leonardi, Raffaele Quitadamo, pages 9–15. The development of distributed systems poses different issues that developers must carefully take into consideration. Web Services and (Mobile) Agents are two promising paradigms increasingly exploited in distributed systems design: both try, albeit with very different conceptual abstractions, to govern unpredictability and complexity in wide-open distributed scenarios. In this paper, we compare the two approaches with regard to different aspects. Our aim is to provide developers with critical knowledge about the advantages of the two paradigms, stressing also the need for an intelligent integration of the two approaches.
Connecting Methodologies and Infrastructures in the Development of Agent Systems
Methodologies, infrastructures, meta-model. Giacomo Cabri, Mariachiara Puviani, Raffaele Quitadamo, pages 17–23. In building agent systems, developers can be supported by both appropriate methodologies and infrastructures, which guide them through the different phases of development and provide useful abstractions. Nevertheless, we observe a situation in which methodologies and infrastructures are not connected to each other: the products of the analysis and design phases cannot always be exploited in the implementation phase in a direct way, even if CASE tools are sometimes available to help translate methodologies' diagrams into infrastructures' code. This leads to a “gap” between methodologies and infrastructures that is likely to produce fragmented solutions and to make application maintenance difficult. In this paper we face this issue, proposing three directions to solve the problem. We do not want to propose a brand-new, tightly connected methodology and infrastructure; rather, we aim at reusing as much as possible what already exists, not only in abstract terms, but also in concrete “fragments” of methodologies. An appropriate meta-language that describes how a methodology works would make it easier to map methodologies onto infrastructures, or even to “compose” a new methodology. A further approach is based on an “intermediate” layer between methodologies and infrastructures, which provides a mapping between the involved entities.
Adaptation of Extended Polymorphic Self-Slimming Agent Model into e-Sourcing Platform
e-supply chains, e-commerce, e-marketplace, e-sourcing, agent technology. Konrad Fuks, Arkadiusz Kawa, Waldemar Wieczerzycki, pages 25–29. There are two main contributions of the work presented in this paper. First, the extended PSA agent model is described. It is based on two new properties of the so-called bootstrap agent: multiplication and parallelism. The second contribution is the application of the extended PSA model to an e-sourcing platform. This duo (e-sourcing and the extended PSA model) shows that enterprises can significantly improve resource acquisition and the search for potential suppliers.
The Triple Model: Combining Cutting-Edge Web Technologies with a Cognitive Model in an ECA
agent architecture, cognitive modeling. Maurice Grinberg, Stefan Kostadinov, pages 31–37. This paper introduces a new model intended to combine the power of a connectionist engine based on fast matrix calculation, RDF-based memory and inference mechanisms, and affective computing in a hybrid cognitive model. The model is called Triple and has the following three parts: a reasoning module taking advantage of RDF-based long-term memory by performing fast inferences; a fast connectionist mapping engine, which can establish relevance and similarities, including analogies; and an emotional module which modulates the functioning of the model. The reasoning engine synchronizes the connectionist and emotional modules, which run in parallel, and controls the communication with the user, retrieval from memory, transfer of knowledge, and action execution. The most important cognitive aspects of the model are context sensitivity, specific experiential episodic knowledge and learning. At the same time, the model provides mechanisms of selective attention and action based on anticipation by analogy. The inference and connectionist modules can be optimized for high performance and thus ensure real-time usage of the model in agent platforms supporting embodied conversational agents.
Planning and Re-planning in Multi-actors Scenarios by means of Social Commitments
social commitments, distributed planning, HTN, BDI. Antonín Komenda, Michal Pěchouček, Jiří Bíba, Jiří Vokřínek, pages 39–45. We present an approach to plan representation in multi-actor scenarios that is suitable for flexible replanning and plan revision purposes. The key idea of the presented approach is the integration of (i) the results of an arbitrary HTN (hierarchical task network) oriented planner with (ii) the concept of commitments, a theoretically studied formalism representing mutual relations among the intentions of collaborating agents. The paper presents a formal model of the recursive form of commitments and discusses how it can be deployed to a selected hierarchical planning scenario. The presented research has been supported by I-GLOBE (http://i-globe.info/), the US ARMY, CERDEC project (no.: N62558-06-P-0353). The project has been performed in cooperation with the University of Edinburgh, Artificial Intelligence Applications Institute.
Utility-based Model for Classifying Adversarial Behaviour in Multi-Agent Systems
multi-agent systems, agent modelling, agent interactions, adversarial modelling. Viliam Lisý, Michal Jakob, Jan Tožička, Michal Pěchouček, pages 47–53. Interactions and social relationships among agents are an important aspect of multi-agent systems. In this paper, we explore how such relationships and their relation to an agent's objectives influence the agent's decision-making. Building on the framework of stochastic games, we propose a classification scheme, based on a formally defined concept of interaction stance, for categorizing an agent's behaviour as self-interested, altruistic, competitive, cooperative, or adversarial with respect to other agents in the system. We show how the scheme can be employed in defining behavioural norms, capturing social aspects of agent behaviour and/or representing social configurations of multi-agent systems.
A Case Study in Agent-Based Systems-Implementation: Distributed Information Management at the Public Prosecution
distributed information management, agents, security. Martijn Warnier, Reinier Timmer, Michel Oey, Frances Brazier, Anja Oskamp, pages 55–61. Securely managing shared information in distributed environments across multiple organizations is a challenge. Distributed information management systems designed for this purpose must be able to support individual organizations' information policies whilst ensuring global consistency and completeness of information. This paper describes a multi-agent based prototype implementation of a distributed information management system for distributed digital criminal dossiers. The prototype implementation runs on the multi-agent platform AgentScape.
3rd International Symposium Advances in Artificial Intelligence and Applications
Applying Differential Evolution to the Reporting Cells Problem
Differential Evolution, Location Management, Reporting Cells, Mobile Networks. Sónia Almeida-Luz, Miguel Angel Vega-Rodríguez, Juan António Gómez-Pulido, Juan Manuel Sánchez-Pérez, pages 65–71. The Location Management problem corresponds to the management of the network configuration with the objective of minimizing the costs involved. It can be defined using several different schemes that principally consider the location update and paging costs. Location Areas and Reporting Cells are two common strategies used to solve the location management problem. In this paper we present a new approach that applies the Differential Evolution algorithm to the reporting cells planning problem, with the objective of minimizing the location management costs involved. With this work we aim to determine the best values for the differential evolution parameters and the respective scheme, using 12 distinct test networks, as well as to compare our results with those obtained by other authors. The final results obtained with this approach are very encouraging.
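The Differential Evolution algorithm mentioned in the abstract above can be sketched in a few lines. This is a generic DE/rand/1/bin illustration on a toy cost function, not the authors' reporting-cells cost model; the population size, F, CR and bounds are arbitrary illustrative choices.

```python
import random

def differential_evolution(cost, dim, bounds, pop_size=20, F=0.5, CR=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin sketch; hyperparameters are illustrative."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    scores = [cost(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # force at least one mutated gene
            trial = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                     if (rng.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            trial = [min(max(v, lo), hi) for v in trial]  # clamp to bounds
            s = cost(trial)
            if s <= scores[i]:  # greedy one-to-one selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

# Toy stand-in for a location-management cost: the sphere function
best, score = differential_evolution(lambda x: sum(v * v for v in x),
                                     dim=5, bounds=(-5.0, 5.0))
```

In the paper's setting the cost function would instead score a candidate reporting-cells configuration by its location update and paging costs.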
Towards Word Sense Disambiguation of Polish
word sense disambiguation, machine learning. Dominik Baś, Bartosz Broda, Maciej Piasecki, pages 73–78. We compare three different methods of Word Sense Disambiguation applied to the disambiguation of a selected set of 13 Polish words. The selected words pose different problems for sense disambiguation. As it is hard to find work for Polish in this area, our goal was to analyse the applicability and limitations of known methods in relation to Polish and to Polish language resources and tools. The obtained results are very positive: using limited resources, we achieved sense disambiguation accuracy greatly exceeding the baseline of the most frequent sense. For the needs of the experiments, a small corpus of representative examples was manually collected and annotated with senses drawn from plWordNet. Different representations of the context of word occurrences were also experimentally tested. Examples of limitations and advantages of the applied methods are discussed.
USERING. Educational self-tuning–recommendations in the 8th level of ISO/OSI model.
IT security, computational intelligence, usering. Miroslaw Bedzak, pages 79–82. It is autumn 2012… The VMware Infrastructure editions 3 and 4 virtualised the crucial components of the IT environment, i.e. computing (CPU, RAM), networking and storage. However, an important element was overlooked. Which one? The user. There was no mechanism built into VI3/VI4 that would effectively support the administrator in gaining the skills needed to implement the solutions advised by the manufacturer, the so-called recommendations (“best practices”, etc.). Nor was a high level of user skills (primarily the administrator's) treated as a valuable resource of the LAN infrastructure that could (and should) be used, with the “surplus” virtualised for the common good of the community (local and/or global) concentrated around enterprise-class infrastructure virtualisation technology. The latest edition, VI5 beta, brings an important change in this respect in the form of a new module, VMware Usering, which in the current (beta) version is directed first of all towards security hardening: the linguistic inference is accomplished by a fuzzy control method based on a knowledge base built on the recommendations of the VI5 beta manufacturer. VMware Usering is a set of tools stimulating users to take up actions consistent with the manufacturer's recommendations (VMware User Hardening) and virtualising (optionally, for administrators with a HIGH FUZZY rating) the competence resource of advanced users (VMware User Competence Sharing). Raising the resistance level of IT infrastructure to growing internet security threats is also in the manufacturer's own interest, so one can expect in the near future a popularisation of VMware Usering-class solutions as an important tool supporting the average internet surfer in his/her solitary struggle against crackers at the 8th OSI model layer.
Sense-Based Clustering of Polish Nouns in the Extraction of Semantic Relatedness
Bartosz Broda, Maciej Piasecki, Stanisław Szpakowicz, pages 83–89. The construction of a wordnet from scratch requires intelligent software support. An accurate measure of semantic relatedness can be used to extract groups of semantically close words from a corpus. Such groups help a lexicographer make decisions about synset membership and synset placement in the network. We have adapted to Polish the well-known algorithm of Clustering by Committee, and tested it on the largest Polish corpus available. The evaluation, by way of a plWordNet-based synonymy test, used Polish WordNet, a resource still under development. The results are consistent with a few benchmarks, but not yet encouraging enough to make a wordnet writer's support tool immediately useful.
An immune approach to classifying the high-dimensional datasets
negative selection, anomaly detection. Andrzej Chmielewski, Sławomir Wierzchoń, pages 91–96. This paper presents an immune-based approach to the problem of binary classification and novelty detection in high-dimensional datasets. It is inspired by the negative selection mechanism, which discriminates between self and nonself elements using only partial information. Our approach incorporates two types of detectors: binary and real-valued. Relatively short binary receptors are used for primary detection, while the real-valued detectors are used to resolve any remaining doubts. Such a hybrid solution is much more economical in comparison with “pure” approaches. The binary detectors are much faster than real-valued ones, which allows us to minimize computationally expensive operations on real values. Additionally, regardless of the type of encoding, the censoring of a sample is conducted with a relatively small part of its attributes.
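The negative selection mechanism described in the abstract above can be illustrated with a minimal sketch: random binary detectors are generated and censored against the self set, so that by construction no detector ever fires on a self sample. The r-contiguous matching rule, the bit width and the self set below are illustrative assumptions, not the authors' actual configuration (which adds a real-valued detector layer on top).

```python
import random

def r_contiguous_match(detector, sample, r):
    """True if detector and sample agree on at least r contiguous bits."""
    run = 0
    for d, s in zip(detector, sample):
        run = run + 1 if d == s else 0
        if run >= r:
            return True
    return False

def generate_detectors(self_set, n_bits, r, n_detectors, seed=0):
    """Negative selection: keep only candidates that match no self sample."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = [rng.randrange(2) for _ in range(n_bits)]
        if not any(r_contiguous_match(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

def is_nonself(detectors, sample, r):
    """A sample is flagged as nonself if any censored detector matches it."""
    return any(r_contiguous_match(d, sample, r) for d in detectors)

# Illustrative self set: two "normal" 12-bit patterns
self_set = [[0] * 12, [1] * 12]
detectors = generate_detectors(self_set, n_bits=12, r=6, n_detectors=20)
```

Note the asymmetry typical of negative selection: self samples are never flagged (guaranteed by the censoring step), while nonself detection depends on detector coverage.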
Application of Clustering and Association Methods in Data Cleaning
data cleaning, data quality, data mining, attribute correction. Lukasz Ciszak, pages 97–103. Data cleaning is the process of maintaining data quality in information systems. Current data cleaning solutions require reference data to identify incorrect or duplicate entries. This article proposes the use of data mining in the area of data cleaning as an effective way of discovering reference data and validation rules from the data itself. Two algorithms designed by the author for data attribute correction are presented. Both algorithms utilize data mining methods. Experimental results show that both algorithms can effectively clean text attributes without external reference data.
A rule-based approach to robust granular planning
granular planning, route planning, rule-based systems. Sebastian Ernst, Antoni Ligęza, pages 105–111. Real-time execution of planned routes often requires re-planning, especially in dynamic, highly unpredictable environments. Ad-hoc re-planning upon failure of an optimal (or suboptimal) plan leads to deterioration of solution quality; moreover, it becomes costly and often unmanageable under real-time constraints. This paper describes a scheme for partitioning the urban route planning problem domain into a hierarchy of subproblems, along with an algorithm for rule-based top-down route derivation. The hierarchical, rule-based approach is aimed at granular planning for generating robust, multi-variant plans.
Support Vector Machines with Composite Kernels for Nonlinear Systems Identification
Support Vector Machines, composite kernels, nonlinear system identification. Amina EL Gonnouni, Abdelouahid Lyhyaoui, Soufiane El Jelali, Manel Martínez Ramón, pages 113–118. In this paper, nonlinear system identification based on support vector machines (SVM) is addressed. A family of SVM-ARMA models is presented in order to integrate the input and the output in the reproducing kernel Hilbert space (RKHS). The performance of the different SVM-ARMA formulations for system identification is illustrated on two systems and compared with the Least Squares method.
The Expert System Approach in Development of Loosely Coupled Software with Use of Domain Specific Language
Expert System, Web Services, Domain Specific Language, Fuzzy Surroundings. Piotr Grobelny, pages 119–123. This paper addresses the problem of supporting the software development process with artificial intelligence. Expert systems could advise the domain engineer in programming without detailed experience in programming languages. With the help of a deductive database and domain knowledge, the engineer will use and integrate previously developed software components into new complex functionalities. Service Oriented Architecture (SOA) and loosely coupled software make it possible to fulfill these requirements. The objective of this document is to provide a knowledge representation of atomic Web Services, which are registered as facts in the deductive database, as well as the inference techniques. The use of a Domain Specific Language (DSL) for modeling the domain engineer's requests to the expert system is also considered within this document.
Coalition formation in multi-agent systems: an evolutionary solution
Coalition formation, evolutionary algorithm, multi-agent systems. Wojciech Gruszczyk, Halina Kwaśnicka, pages 125–130. The paper introduces a solution to the Coalition Formation Problem (CFP) in Multi-Agent Systems (MAS) based on an evolutionary algorithm. The main aim of our study is to develop an evolutionary algorithm for the creation of coalitions of agents for solving assumed tasks. We describe the coding schema and the genetic operators, such as mutation, crossover and selection, that proved to be efficient in solving the CFP. The last part of the document provides a brief comment on our research results.
Applying Emphasized Soft Targets for Gaussian Mixture Model Based Classification
Classification, estimation, smoothing target, Gaussian Mixture Models. Soufiane El Jelali, Abdelouahid Lyhyaoui, Aníbal Ramón Figueiras-Vidal, pages 131–136. When training machine classifiers, it is possible to replace hard classification targets with their emphasized soft versions so as to reduce the negative effects of using cost functions as approximations to misclassification rates. This emphasis has the same effect as sample editing methods, which have proved to be effective in improving classifier performance. In this paper, we explore the effectiveness of using emphasized soft targets with generative models, such as Gaussian Mixture Models, that offer some advantages with respect to decision (prediction) oriented architectures, such as easy interpretation and possibilities for dealing with missing values. Simulation results support the usefulness of the proposed approach for achieving better performance and show a low sensitivity to the selection of design parameters.
Improving Naive Bayes Models of Insurance Risk by Unsupervised Classification
classification, cluster analysis, naive bayes, insurance risk. Anna Jurek, Danuta Zakrzewska, pages 137–144. In the paper, the application of a Naïve Bayes model to evaluating the risk connected with the life insurance of customers is considered. Clients are classified into groups of different insurance risk levels. We propose to improve the efficiency of classification by using cluster analysis in the preprocessing phase. Experiments showed that, although the percentage of correctly classified instances is satisfactory for Naïve Bayes classification alone, using cluster analysis and building separate models for different groups of clients significantly improves the accuracy of classification. Finally, we discuss increasing the efficiency further by using cluster validation techniques or a tolerance threshold that enables obtaining clusters of very good quality.
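The cluster-then-classify pipeline described in the abstract above can be sketched compactly: cluster the clients first, then fit a separate Naïve Bayes model per cluster and route each new client to the model of its nearest cluster. The one-dimensional feature, the k-means variant and the toy data below are illustrative assumptions, not the paper's actual insurance dataset or clustering setup.

```python
import math

def kmeans_1d(xs, k=2, iters=20):
    """Tiny 1-D k-means; centers initialised at spread-out data points."""
    srt = sorted(xs)
    centers = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            groups[min(range(k), key=lambda c: abs(x - centers[c]))].append(x)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

def fit_gaussian_nb(data):
    """data: list of (x, label); returns per-class (mean, var, prior)."""
    model = {}
    for label in set(l for _, l in data):
        xs = [x for x, l in data if l == label]
        mean = sum(xs) / len(xs)
        var = sum((x - mean) ** 2 for x in xs) / len(xs) + 1e-6
        model[label] = (mean, var, len(xs) / len(data))
    return model

def predict(model, x):
    def log_post(label):
        mean, var, prior = model[label]
        return (math.log(prior) - 0.5 * math.log(2 * math.pi * var)
                - (x - mean) ** 2 / (2 * var))
    return max(model, key=log_post)

def fit_clustered_nb(data, k=2):
    """Cluster in preprocessing, then one NB model per cluster."""
    centers = kmeans_1d([x for x, _ in data], k)
    nearest = lambda x: min(range(k), key=lambda i: abs(x - centers[i]))
    models = [fit_gaussian_nb([(x, l) for x, l in data if nearest(x) == i])
              for i in range(k)]  # assumes every cluster is non-empty
    return centers, models

def predict_clustered(centers, models, x):
    i = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
    return predict(models[i], x)

# Toy data: the label boundary differs between the two client clusters
data = [(0.0, 0), (0.1, 0), (0.2, 0), (0.9, 1), (1.0, 1), (1.1, 1),
        (5.0, 1), (5.1, 1), (5.2, 1), (5.9, 0), (6.0, 0), (6.1, 0)]
centers, models = fit_clustered_nb(data)
```

A single global model on this data would conflate the two segments; the per-cluster models keep each segment's decision boundary local, which is the effect the paper reports.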
On a class of defuzzification functionals
convex fuzzy numbers, ordered fuzzy numbers, defuzzification functional. Witold Kosiński, Wiesław Piasecki, pages 145–150. Classical convex fuzzy numbers have many disadvantages. The main one is that every operation on this type of fuzzy numbers increases the fuzziness level. Another drawback is that the arithmetic operations defined for them are not complementary, for instance addition and subtraction. Therefore the first author (W. K.) and his coworkers have proposed an extended model called ordered fuzzy numbers (OFN). The new model overcomes the above-mentioned drawbacks and at the same time contains the algebra of crisp (non-fuzzy) numbers. Ordered fuzzy numbers make it possible to utilize fuzzy arithmetic and to construct an Abelian group of fuzzy numbers, and then an algebra. Moreover, it turns out that the four main operations introduced are very suitable for algorithmisation. The new approach demands new defuzzification operators. In the linear case they are described by the well-known functional representation theorem valid in Banach function spaces. The case of nonlinear functionals is more complex; however, it is possible to prove a general, uniform approximation formula for nonlinear and continuous functionals on the Banach space of OFN. Counterparts of the defuzzification functionals known from the Mamdani approach are also presented, some numerical experimental results are given and conclusions for further research are drawn.
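The complementarity property claimed in the abstract above (subtraction undoes addition, which fails for classical convex fuzzy numbers) is easy to demonstrate once an OFN is represented as an ordered pair of branches. The discretized two-branch representation and the averaging defuzzification functional below are illustrative assumptions, chosen only to show the branchwise arithmetic; the paper works with branch functions in a Banach space.

```python
def ofn(up, down):
    """An ordered fuzzy number as discretized 'up' and 'down' branches."""
    return (list(up), list(down))

def ofn_add(a, b):
    # Arithmetic is branchwise and pointwise, so no fuzziness is added
    return ([x + y for x, y in zip(a[0], b[0])],
            [x + y for x, y in zip(a[1], b[1])])

def ofn_neg(a):
    return ([-x for x in a[0]], [-x for x in a[1]])

def ofn_sub(a, b):
    return ofn_add(a, ofn_neg(b))

def defuzzify(a):
    """An example linear defuzzification functional: mean of branch samples."""
    vals = a[0] + a[1]
    return sum(vals) / len(vals)

a = ofn([1.0, 1.5, 2.0], [3.0, 2.5, 2.0])
b = ofn([0.0, 0.5, 1.0], [2.0, 1.5, 1.0])
c = ofn_sub(ofn_add(a, b), b)  # (a + b) - b recovers a exactly
```

In classical interval-based fuzzy arithmetic, (a + b) - b would come out wider than a; here the branchwise operations form a group, so the identity holds exactly.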
Properties of the World Health Organization's Quality of Life Index
Quality of life, inconsistency analysis, consistency-driven pairwise comparisons. Tamar Kakiashvili, Waldemar W. Koczkodaj, Phyllis Montgomery, Kalpdrum Passi, Ryszard Tadeusiewicz, pages 151–154. This methodological study demonstrates how to strengthen the commonly used World Health Organization's Quality of Life Index (WHOQOL) by using the consistency-driven pairwise comparisons (CDPC) method. From a conceptual view, there is little reason to believe that all 26 items have exactly equal importance or contribution in assessing quality of life. Computing new weights for all individual items would therefore be a step forward, since it does not seem reasonable to assume that all individual questions contribute equally to the measure of quality of life. The findings indicate that incorporating the differing importance of individual questions into the model is an essential enhancement of the instrument.
Accuracy Boosting Induction of Fuzzy Rules with Artificial Immune Systems
fuzzy logic, artificial immune system, data mining. Adam Kalina, Edward Mężyk, Olgierd Unold, pages 155–159. The paper introduces an accuracy-boosting extension to a novel method for inducing fuzzy rules from raw data using Artificial Immune System methods. The accuracy boosting relies on fuzzy partition learning. The modified algorithm was experimentally proved to be more accurate on all learning sets containing non-crisp attributes.
On Classification Tools for Genetic Algorithms
genetic algorithm, entropy, fractal dimension, box-counting dimension. Stefan Kotowski, Witold Kosiński, Zbigniew Michalewicz, Piotr Synak, Łukasz Brocki, pages 161–164. Some tools to measure the convergence properties of genetic algorithms are introduced. A classification procedure for genetic algorithms is proposed, based on a conjecture: the entropy and the fractal dimension of the trajectories they produce are quantities that characterize the classes of the algorithms. The role of these quantities as invariants of the algorithm classes is discussed, together with the compression ratio of the points visited by the genetic algorithm.
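The box-counting dimension named in the abstract above has a standard estimator: count the occupied grid boxes N(eps) at several scales and fit the slope of log N(eps) against log(1/eps). The sketch below applies it to a 2-D point set standing in for an algorithm trajectory; the scales and the test set are illustrative, not taken from the paper.

```python
import math

def box_counting_dimension(points, epsilons):
    """Estimate the box-counting dimension of a 2-D point set by a
    least-squares fit of log N(eps) versus log(1/eps)."""
    logs = []
    for eps in epsilons:
        # Count distinct grid cells of side eps occupied by the points
        boxes = {(math.floor(x / eps), math.floor(y / eps)) for x, y in points}
        logs.append((math.log(1 / eps), math.log(len(boxes))))
    n = len(logs)
    mx = sum(a for a, _ in logs) / n
    my = sum(b for _, b in logs) / n
    # Slope of the least-squares line is the dimension estimate
    slope = (sum((a - mx) * (b - my) for a, b in logs)
             / sum((a - mx) ** 2 for a, _ in logs))
    return slope

# A densely sampled line segment should have dimension close to 1
pts = [(i / 1000.0, i / 1000.0) for i in range(1000)]
d = box_counting_dimension(pts, [0.5, 0.25, 0.125, 0.0625, 0.03125])
```

For a genetic algorithm, `points` would be the sequence of (projected) population states visited during a run; a trajectory that fills the search space yields a higher estimate than one that collapses quickly.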
Evolution of Strategy Selection in Games by Means of Natural Selection
Evolutionary Dynamics, Natural Selection, Game Theory, Multi-agent Systems, Agent Design, JADE. Daniel Laszlo Kovacs, Tadeus Dobrowiecki, pages 165–172. In this paper we present a novel agent-based simulation model for the natural selection of replicating agents whose survival depends on their programs for selecting strategies to interact with each other. Game-theoretic models describe this kind of interaction. The simulator can be used for the analysis, design and verification of autonomous systems (by intuitively abstracting them into our model and running the simulation). A good system design can be selected even for difficult, underspecified design problems. Although the inspiration for the model comes from evolutionary game theory, the evolving agents may represent not only biological but also technical systems (e.g. software agents), and thus in some cases the underlying evolutionary mechanisms and observations differ from the typically published results.
Image Similarity Detection in Large Visual Data Bases
image similarity detection, visual databases, morphological spectra, finding cliques in graphs. Juliusz Kulikowski, pages 173–180. A method for detecting similarity clusters in large visual databases is described in this work. Similarity clusters are defined on the basis of a general concept of similarity measure. The method is also based on the properties of morphological spectra as a tool for image representation. In the proposed method, the similarity of selected spectral components in selected basic windows is used to evaluate the similarity of images. Similarity clusters are detected in an iterative process in which unpromising subsets of images are step-by-step removed from consideration. Similarity graphs and hyper-graphs also play an auxiliary role in the method. The method is illustrated by an example of a collection of medical images in which similarity clusters have been detected.
Automatic Acquisition of Wordnet Relations by the Morpho-Syntactic Patterns Extracted from the Corpora in Polish
Roman Kurc, Maciej Piasecki, pages 181–188. In the paper we present an adaptation of the Espresso algorithm for the extraction of lexical semantic relations to the specific requirements of Polish. The introduced changes are mostly of a technical character, such as the adaptation to existing Polish language tools, but we also investigate a pattern structure that takes into account specific features of Polish as an inflectional language. A new method of computing the reliability measure is proposed. The modified version of the algorithm, called Estratto, was compared with a more direct reimplementation of Espresso on several corpora of Polish. We tested the influence of different algorithm parameters and different corpora on the obtained results.
Using UML State Diagrams for Visual Modeling of Business Rules
UML, Rule Systems, Rules Representation, Rules Modeling, Hekate. Konrad Kułakowski, Grzegorz J. Nalepa, pages 189–194. Recently, in response to growing market demand, several different techniques of business rules representation have been created. Some of them try to present business rules in a visual manner. However, due to the complexity of the problem, the graphic representations proposed seem to be far from perfect. In this paper we describe how UML state diagrams might be used for business rules formulation and visual modeling. The strength of this approach lies in reusing classical notions provided by UML 2.0, e.g. an action, a guard, etc., in a way which is close to their original meaning.
Discovering Technical Analysis Patterns
neural network, time series, data sequence discovery, financial analysis. Urszula Markowska-Kaczmar, Maciej Dziedzic, pages 195–200. In this paper our method of discovering data sequences in time series is presented. Two major approaches to this topic are considered: the first, when we need to judge whether a given series is similar to any of the known patterns, and the second, when it is necessary to find how many times a defined pattern occurs within a long series. In both cases the main problem is to recognize pattern occurrence(s), but the distinction is essential because of the time frame within which the identification process is carried out. The proposed method is based on a multilayered feed-forward neural network. The effectiveness of the method is tested in the domain of financial analysis, but it can easily be adapted to almost any kind of sequence data.
A Hybrid Differential Evolution Algorithm to Solve a Real-World Frequency Assignment Problem
Differential Evolution Algorithm, Frequency Assignment Problem, Evolutionary Computing. Marisa Maximiano, Miguel A. Vega-Rodríguez, Juan A. Gómez-Pulido, Juan M. Sánchez-Pérez, pages 201–205. Frequency Assignment is a very important task in the planning of GSM networks, and it continues to be a critical task for current (and future) mobile communication operators. In this work we present a hybrid Differential Evolution (DE) algorithm to solve a real-world instance of the Frequency Assignment Problem. We give a detailed explanation of the hybridization method applied to DE in order to make it more efficient. The results shown use accurate interference information. That information has also been adopted by other researchers and represents a real GSM network, which makes the approach practically applicable. Furthermore, we have analyzed and compared our approach with other algorithms, obtaining good results.
Hierarchical Rule Design with HaDEs, the HeKatE Toolchain
rules, design prototyping, prolog. Grzegorz Nalepa, Igor Wojnicki, pages 207–214. A hierarchical approach to decision rule design in the HeKatE project is introduced in the paper. Within this approach, a rule prototyping method called ARD+ is presented. It offers high-level hierarchical prototyping of decision rules. The method allows all properties of the system being designed to be identified and, using the algorithm presented in this paper, rule prototypes to be built automatically. A practical implementation of the prototyping algorithm is discussed and a design example is presented. Using the rule prototypes, the actual rules can be designed in the logical design stage. The design process itself is supported by HaDEs, a set of design tools developed within the project.
On Automation of Brain CT Image Analysis
Brain CT Image Analysis, Image Classification, Automatic Image Annotation. Mariusz Paradowski, Halina Kwasnicka, Martin Tabakov, Jacek Filarski, Marek Sasiadek, pages 215–220. In recent years a number of medical diagnosis support systems have been presented. Such systems are important from a medical point of view, because they aid physicians in their daily work. Some of those systems, such as Computed Tomography (CT) support systems, rely on image data analysis, making them an interesting research topic from a pattern recognition point of view. The paper presents a practical realization of a medical support system, including segmentation, feature extraction and decision processes. Various techniques are used within the system: fuzzy logic, shape descriptors, classifiers and an automatic image annotation framework. Series of 2D CT images are the input of the system. As a result, the patient's diagnosis is presented.
Neuronal Groups and Interrelations
216 neural networks, pattern recognition, structural pattern recognition, pattern classification Michal Pryczek, pages 221 – 227. Show abstract Recent development in various domains of Artificial Intelligence, including Information Retrieval and Text/Image Understanding, has created demand for new, sophisticated, contextual methods of data analysis. This article formulates the Neuronal Group and Extended Neuron Somatic concepts, which can be widely used in creating such methods. Neural interrelations are described using graphs, whose construction is carried out in parallel with neural network learning. A prototype technique based on Growing Neural Gas is also presented to give a more detailed view. -
Semantic Relevance Measure between Resources based on a Graph Structure
215 Semantic Relevance, Knowledge Management, Information Retrieval, Ontology Sang Keun Rhee, Jihye Lee, Myon-Woong Park, pages 229 – 236. Show abstract When developing an information system, establishing an effective information retrieval method is a central problem, along with the management of the information itself. With ontologies, the semantic structure of knowledge can be represented and the resources can be managed along with their context information. Observing that within this knowledge space resources are not isolated but connected to each other by various types of relations, we believe that utilising those relations can aid information provisioning. In this paper, we explore the concept of semantic relevance between resources and the semantic structure of a knowledge space, and present an algorithm for measuring relevance between resources based on a graph structure, together with two implementation examples. -
Exclusion Rule-based Systems—case study
25 Exclusion Rule-based Systems, decision support systems, railway traffic management system Marcin Szpyrka, pages 237 – 242. Show abstract The exclusion rule-based system, proposed in the paper, is an alternative method of designing a rule-based system for an expert or a control system. The starting point of the approach is the set of all possible decisions the considered system can generate. Then, on the basis of the exclusion rules, the set of decisions is limited to the ones permissible in the current state. A railway traffic management system case study is used in the paper to demonstrate the advantages of the new approach. To compare exclusion rule-based systems with decision tables, both solutions are considered for the case study. -
Ontological Matchmaking in a Duty Trip Support Application in a Virtual Organization
220 admin Michal Szymczak, Grzegorz Frackowiak, Maria Ganzha, Marcin Paprzycki, Sang Keun Rhee, Jihye Lee, pages 243 – 250. Show abstract In our work, an agent-based system supporting workers in fulfilling their roles in a virtual organization has ontologically demarcated data as its centerpiece. In such a system, ontological matchmaking is one of the key functionalities in the provisioning of personalized information. Here, we discuss how matchmaking will be facilitated in a Duty Trip Support application. Due to the nature of the application, particular attention is paid to geospatial data processing. -
Computer-aided Detection of Early Strokes and its Evaluation on the Basis of CT Images
199 ischemic stroke, CT image, graphics Grzegorz Terlikowski, Elżbieta Hudyma, pages 251 – 254. Show abstract This paper presents a straightforward way of finding strokes in computed tomography images. By calculating a cohesive rate (CR) of suspicious pixels on a series of CT images, it is possible to calculate a general probability of a stroke. In a difficult case there is always an opportunity to generate a graph of all stroke probabilities superimposed on the original image. It is a very helpful tool for specialists and neurologists working in emergency situations. Supported by grant N518 022 31/1338 (Ministry of Science).
Computer Aspects of Numerical Algorithms
-
Numerical Algorithm of Solving the Forced sin-Gordon Equation
115 numerical algorithm, sine-Gordon, differential equations, Riemann function Alexandre Bezen, pages 257 – 261. Show abstract A numerical method for the problem of small perturbations of a stationary traveling-wave solution (soliton) of the sine-Gordon equation, well known in physics, is presented. The solution is reduced to solving a set of linear hyperbolic partial differential equations. The Riemann function method is used to find a solution of a linear PDE. The value of the Riemann function at any particular point is found as a solution of an ordinary differential equation. An algorithm for calculating a double integral over a triangular integration area is given. -
Merging Jacobi and Gauss-Seidel Methods for Solving Markov Chains on Computer Clusters
204 Markov chains, parallel algorithms, computer clusters, queuing models, iterative methods, linear equations Jaroslaw Bylina, Beata Bylina, pages 263 – 268. Show abstract The authors consider the use of parallel iterative methods for solving large sparse linear equation systems resulting from Markov chains on a computer cluster. A combination of the Jacobi and Gauss-Seidel iterative methods is examined in a parallel version. Some results of experiments for sparse systems with over 3×10^7 equations and about 2×10^8 nonzeros, obtained from a Markovian model of a congestion control mechanism, are reported. -
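One common way to merge the two methods named in this abstract is to partition the unknowns into blocks (one per processor), sweep Gauss-Seidel-style inside each block while using the previous iterate for other blocks (Jacobi-style), so blocks can be updated in parallel. The sketch below illustrates that idea on a dense toy system; it is an assumption-laden illustration, not the authors' cluster implementation:

```python
import numpy as np

def block_jacobi_gauss_seidel(A, b, blocks, iters=200):
    """Hybrid sweep: Gauss-Seidel inside each block (fresh values), Jacobi
    across blocks (previous iterate), so blocks are independently updatable."""
    x = np.zeros_like(b)
    for _ in range(iters):
        x_old = x.copy()
        for blk in blocks:
            for i in blk:
                s = 0.0
                for j in range(len(b)):
                    if j == i:
                        continue
                    # inside the block use the current sweep's values,
                    # outside the block use the previous iterate
                    s += A[i, j] * (x[j] if j in blk else x_old[j])
                x[i] = (b[i] - s) / A[i, i]
    return x

# toy diagonally dominant system split over two hypothetical "processors"
A = np.array([[4.0, 1, 0, 0], [1, 4, 1, 0], [0, 1, 4, 1], [0, 0, 1, 4]])
b = np.array([1.0, 2, 3, 4])
x = block_jacobi_gauss_seidel(A, b, blocks=[{0, 1}, {2, 3}])
```

In the Markov-chain setting the matrix would be huge and sparse, and each block would live on a different cluster node with boundary values exchanged between sweeps.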
Influence of Molecular Models of Water on Computer Simulations of Water Nanoflows
214 Molecular Dynamics simulation, molecular models of materials, microchannels, computer simulation Janusz Bytnar, Anna Kucaba-Piętal, Zbigniew Walenta, pages 269 – 275. Show abstract We present some problems related to the influence of molecular models on Molecular Dynamics simulations of water nanoflows. Out of the large number of existing models of water, we present results of MD simulations of water nanoflows for four molecular models: TIP4P, PPC, TIP4P-2005 and TIP5P. -
A Sparse Shared-Memory Multifrontal Solver in SCAD Software
47 sparse direct solver, multi-core computers, finite element method, block factorization Sergiy Fialko, pages 277 – 283. Show abstract A block-based sparse direct finite element solver for commonly used multi-core and shared-memory multiprocessor computers has been developed. It is intended to solve linear equation sets with sparse symmetrical matrices which appear in problems of structural and solid mechanics. A step-by-step assembling procedure together with simultaneous elimination of fully assembled equations distinguishes this method from the well-known multifrontal solver, so this approach can be interpreted as a generalization of the conventional frontal solver for an arbitrary reordering. This solver is implemented in the commercial finite element software SCAD (www.scadsoft.com), and its usage by numerous users has confirmed its efficiency and reliability. -
Testing Tesla Architecture for Scientific Computing: the Performance of Matrix-Vector Product.
212 GPU, tesla architecture, BLAS, matrix-vector product, CUDA Paweł Macioł, Krzysztof Banaś, pages 285 – 291. Show abstract The paper presents results of several experiments evaluating the performance of NVIDIA processors, implementing a new Tesla architecture, in matrix-vector multiplication. Three matrix forms, dense, banded and sparse, are considered together with three hardware platforms: NVIDIA Tesla C870 computing board, NVIDIA GeForce 8800 GTX graphics card and one of the newest Intel Xeon processors, E5462, with 1.6 GHz front side bus speed. The conclusions from experiments indicate what speed-ups can be expected when, instead of standard CPUs, accelerators in the form of presented GPUs are used for considered computational kernels. -
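Of the three matrix forms benchmarked in the abstract above, the sparse case is usually stored in CSR format, and the kernel parallelises naturally because each row's dot product is independent — which is what makes it attractive on a many-core GPU. A minimal CPU-side reference of the kernel being measured (CSR layout assumed; not the paper's CUDA code):

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for A in CSR format: `data` holds nonzeros row by row,
    `indices` their column numbers, `indptr[r]:indptr[r+1]` delimits row r.
    Each row is independent, so a GPU can assign rows to threads."""
    n = len(indptr) - 1
    y = np.zeros(n)
    for row in range(n):
        start, end = indptr[row], indptr[row + 1]
        # gather the row's nonzeros and the matching entries of x
        y[row] = np.dot(data[start:end], x[indices[start:end]])
    return y

# 3x3 example matrix: [[2, 0, 1], [0, 3, 0], [4, 0, 5]]
data = np.array([2.0, 1, 3, 4, 5])
indices = np.array([0, 2, 1, 0, 2])
indptr = np.array([0, 2, 3, 5])
y = csr_matvec(data, indices, indptr, np.array([1.0, 1, 1]))
```

The irregular, indirect access to `x` via `indices` is exactly the memory pattern that makes sparse matrix-vector product harder to accelerate than the dense and banded cases.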
Solving a Kind of BVP for Second-Order ODEs Using Novel Data Formats for Dense Matrices
213 bvp, cache memory, data structures, multicore Przemyslaw Stpiczynski, pages 293 – 296. Show abstract The aim of this paper is to show that a kind of boundary value problem for second-order ordinary differential equations which reduces to the problem of solving tridiagonal systems of linear equations can be efficiently solved on modern multicore computer architectures. A new method for solving such tridiagonal systems of linear equations, based on recently developed algorithms for solving linear recurrence systems with constant coefficients, can be easily vectorized and parallelized. Further improvements can be achieved when novel data formats for dense matrices are used. -
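The sequential baseline for the tridiagonal systems mentioned in this abstract is the Thomas algorithm; the paper's contribution is a vectorisable alternative built on linear recurrences and novel dense-matrix data formats, which this sketch does not attempt to reproduce:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, main diagonal b,
    super-diagonal c and right-hand side d (classic Thomas algorithm;
    a[0] and c[-1] are unused placeholders)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):       # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# the kind of system a discretised two-point BVP -u'' = f, u(0)=u(1)=0 produces
a = np.array([0.0, -1, -1, -1])
b = np.array([2.0, 2, 2, 2])
c = np.array([-1.0, -1, -1, 0])
d = np.array([1.0, 1, 1, 1])
x = thomas(a, b, c, d)
```

The two sweeps above are inherently sequential recurrences, which is precisely why reformulating them as linear recurrence systems, as the paper does, is needed before multicore parallelism can pay off.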
On the hidden discrete logarithm for some polynomial stream ciphers
169 numerical encryption algorithms, private key, finite automata, implementation Vasyl Ustimenko, pages 297 – 301. Show abstract The paper is devoted to a special key management algorithm for the stream ciphers defined in [12] via finite automata corresponding to a family of directed algebraic graphs of high girth and two affine transformations. Security of the key is based on the complexity of the discrete logarithm problem. It has additional heuristic security because of the “hidden base” and “hidden value” of the discrete logarithm function. We consider the heuristic evaluation of the complexity of different attacks by an adversary on our private key cipher. A detailed description of the whole algorithm is given. The implemented software package has been used for the evaluation of the mixing properties and speed of the private key encryption. -
Empirically Tuning LAPACK's Blocking Factor for Increased Performance
60 empirical tuning, LAPACK, ATLAS, BLAS, linear algebra Clint Whaley, pages 303 – 310. Show abstract LAPACK (Linear Algebra PACKage) is a statically cache-blocked library, where the blocking factor (NB) is determined by the service routine ILAENV. Users are encouraged to tune NB to maximize performance on their platform/BLAS (the BLAS are LAPACK's computational engine), but in practice very few users do so (both because it is hard and because its importance is not widely understood). In this paper we (1) discuss our empirical tuning framework for discovering good NB settings, (2) quantify the performance boost that tuning NB can achieve on several LAPACK routines across multiple architectures and BLAS implementations, (3) compare the best performance of LAPACK's statically blocked routines against state-of-the-art recursively blocked routines and vendor-optimized LAPACK implementations, to see how much performance loss is mandated by LAPACK's present static blocking strategy, and finally (4) use the results to determine how best to block nonsquare matrices once good square blocking factors are discovered.
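The empirical-tuning idea in this abstract — time the routine at several candidate NB values and keep the fastest — can be shown in miniature. The blocked multiply and the search loop below are illustrative assumptions only (the actual framework probes real LAPACK routines over many shapes, and timings on a toy problem are noisy):

```python
import time
import numpy as np

def blocked_matmul(A, B, nb):
    """C = A @ B computed in nb x nb blocks; NB controls how much of each
    operand stays hot in cache, the trade-off LAPACK's blocking factor makes."""
    n = A.shape[0]
    C = np.zeros_like(A)
    for i in range(0, n, nb):
        for k in range(0, n, nb):
            for j in range(0, n, nb):
                C[i:i+nb, j:j+nb] += A[i:i+nb, k:k+nb] @ B[k:k+nb, j:j+nb]
    return C

def tune_nb(op, n, candidates, reps=2):
    """Empirical search: time `op` for each candidate blocking factor on an
    n x n problem and keep the fastest."""
    rng = np.random.default_rng(0)
    A, B = rng.random((n, n)), rng.random((n, n))
    timings = {}
    for nb in candidates:
        start = time.perf_counter()
        for _ in range(reps):
            op(A, B, nb)
        timings[nb] = time.perf_counter() - start
    return min(timings, key=timings.get)

best_nb = tune_nb(blocked_matmul, 128, [16, 32, 64])
```

A production tuner would also persist the winning NB per routine and architecture, which is effectively what replacing ILAENV's static table amounts to.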
Computational Linguistics—Applications
-
The incidence of corpus quality and type on topic based text segmentation evaluation: Should we change our evaluation methods in NLP?
31 Alexandre Labadié, Violaine Prince, pages 313 – 319. Show abstract In this paper, we try to fathom the real impact of corpus quality on method performance and its evaluation. The considered task is topic-based text segmentation, and two highly different unsupervised algorithms are compared: C99, a word-based system augmented with LSA, and Transeg, a sentence-based system. Two main characteristics of corpora have been investigated: data quality (clean vs. raw corpora) and corpus manipulation (natural vs. artificial data sets). The corpus size has also been subject to variation, and the experiments reported in this paper have shown that corpus characteristics highly impact recall and precision values for both algorithms. -
N-gram language models for Polish language. Basic concepts and applications in automatic speech recognition systems.
219 language models, language modeling, POS-tagging, ASR, Polish language, n-gram models Bartosz Rapp, pages 321 – 324. Show abstract The use of language models in automatic speech recognition systems usually gives a significant improvement in the quality and certainty of recognition outcomes. On the other hand, wrongly chosen or trained language models can result in serious degradation not only of recognition quality but also of the overall performance of the system. Proper selection of the language material, the system parameters and the representation of the model itself is an important task during the language model construction process. This paper describes basic aspects of building, evaluating and applying language models for the Polish language in automatic speech recognition systems intended to be used by lawyers' chambers, the judiciary and law enforcement. Language modeling is part of a project that is still at an early stage of development; work is ongoing, so only some basic concepts and ideas are presented in this paper. -
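The simplest instance of the n-gram models this abstract discusses is a bigram model with add-alpha smoothing, which scores how likely a word is given its predecessor. A toy sketch (the two Polish sentences are invented stand-ins for the legal-domain training material the paper mentions):

```python
from collections import Counter

def train_bigram_lm(sentences, alpha=1.0):
    """Bigram model with add-alpha smoothing.
    Returns a function prob(word, prev) = P(word | prev)."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for sent in sentences:
        tokens = ["<s>"] + sent.lower().split() + ["</s>"]
        vocab.update(tokens)
        unigrams.update(tokens[:-1])          # history counts
        bigrams.update(zip(tokens, tokens[1:]))
    V = len(vocab)

    def prob(word, prev):
        # smoothing keeps unseen bigrams from getting zero probability
        return (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * V)

    return prob

prob = train_bigram_lm(["sąd wydał wyrok", "sąd wydał postanowienie"])
p_seen = prob("wyrok", "wydał")    # attested continuation
p_unseen = prob("sąd", "wydał")    # never seen after "wydał"
```

In an ASR system such probabilities rescore the recogniser's candidate hypotheses, preferring word sequences the training corpus makes plausible.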
SemCAPTCHA―user-friendly alternative for OCR-based CAPTCHA systems
183 CAPTCHA, positive semantic priming, linguistic competence, OCR Paweł Łupkowski, Mariusz Urbański, pages 325 – 329. Show abstract In this paper we present a new CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) system. This proposal, SemCAPTCHA, is motivated by the increasing number of broken OCR-based CAPTCHA systems and is based not only on text recognition but also on text understanding. We describe SemCAPTCHA from both the user's and the system's perspective and compare it to some currently popular CAPTCHAs. We also briefly describe an experiment carried out to test our CAPTCHA on human users. -
Arabic/English Word Translation Disambiguation Approach based on Naive Bayesian Classifier
145 parallel corpus, word translation disambiguation, Arabic language, cross language information retrieval Farag Ahmed, Andreas Nürnberger, pages 331 – 338. Show abstract We present a word sense disambiguation approach with application in machine translation from Arabic to English. The approach consists of two main steps: first, a natural language processing method that deals with the rich morphology of the Arabic language and, second, the translation itself, including word sense disambiguation. The main innovative features of this approach are the adaptation of the Naïve Bayesian approach with new features to account for the properties of the Arabic language, and the exploitation of a large parallel corpus to find the correct sense based on its cohesion with words in the training corpus. We expect that the resulting system will overcome the problem of the absence of vowel signs, which is the main reason for translation ambiguity between Arabic and other languages. -
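The Naïve Bayesian core of such a disambiguator picks the translation sense whose training contexts best explain the words surrounding the ambiguous source word. A hedged sketch with invented toy data (the Arabic example "عين", ambiguous between "eye" and "spring", and its context words are illustrative assumptions, not the paper's corpus):

```python
import math
from collections import Counter, defaultdict

def train_nb(examples):
    """Naive Bayes word-translation disambiguation: examples are
    (context_words, sense) pairs harvested from a parallel corpus."""
    sense_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for context, sense in examples:
        sense_counts[sense] += 1
        for w in context:
            word_counts[sense][w] += 1
            vocab.add(w)

    def classify(context):
        def score(sense):
            total = sum(word_counts[sense].values())
            s = math.log(sense_counts[sense] / sum(sense_counts.values()))
            for w in context:
                # add-one smoothing over the shared vocabulary
                s += math.log((word_counts[sense][w] + 1) / (total + len(vocab)))
            return s
        return max(sense_counts, key=score)

    return classify

# hypothetical contexts for Arabic "عين" -> "eye" vs. "spring (of water)"
train = [(["water", "flow"], "spring"), (["water", "desert"], "spring"),
         (["see", "look"], "eye"), (["tear", "see"], "eye")]
classify = train_nb(train)
sense = classify(["water"])
```

The paper's "new features" for Arabic morphology would enter as additional context attributes; this sketch uses bag-of-words context only.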
Smarty―Extendable Framework for Bilingual and Multilingual Comprehension Assistants
107 comprehension assistant, WordNet, BalkaNet, intelligent dictionary, Bulgarian, English, lexical resources, word-sense disambiguation Todor Arnaudov, Ruslan Mitkov, pages 339 – 344. Show abstract This paper discusses a framework for the development of bilingual and multilingual comprehension assistants and presents a prototype implementation of an English-Bulgarian comprehension assistant. The framework is based on the application of advanced graphical user interface techniques, WordNet and compatible lexical databases, as well as a series of NLP preprocessing tasks, including POS-tagging, lemmatisation, multiword expression recognition and word sense disambiguation. The aim of this framework is to speed up the process of dictionary look-up, to offer enhanced look-up functionalities and to perform a context-sensitive narrowing-down of the set of translation alternatives proposed to the user. -
SuperMatrix: a General Tool for Lexical Semantic Knowledge Acquisition
163 Bartosz Broda, Maciej Piasecki, pages 345 – 352. Show abstract The paper presents the SuperMatrix system, designed as a general tool supporting automatic acquisition of lexical semantic relations from corpora. The construction of the system is discussed, and examples of different applications showing the potential of SuperMatrix are given. The core of the system is the construction of co-incidence matrices from corpora written in any natural language, as the system works on UTF-8 encoding and has a modular construction. SuperMatrix follows the general scheme of distributional methods. Many different matrix transformations and similarity computation methods were implemented in the system; as a result, the majority of existing Measures of Semantic Relatedness were re-implemented in it. The system also supports evaluation of the extracted measures by tests originating from the idea of the WordNet-Based Synonymy Test. In the case of Polish, SuperMatrix includes an implementation of a language of lexico-syntactic constraints, delivering the means for a kind of shallow syntactic processing. SuperMatrix also processes multiword expressions, both as lexical units being described and as elements of the description. Processing can be distributed, as a number of matrix operations were implemented in a distributed fashion. The system handles huge matrices. -
Definition Extraction: Improving Balanced Random Forests
154 definition extraction, machine learning, balanced random forests Łukasz Degórski, Łukasz Kobyliński, Adam Przepiórkowski, pages 353 – 357. Show abstract The article discusses methods of improving the ways of applying Balanced Random Forests (BRFs), a machine learning classification algorithm, used to extract definitions from written texts. These methods include different approaches to selecting attributes, optimising the classifier prediction threshold for the task of definition extraction and initial filtering by a very simple grammar. -
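The Balanced Random Forests named in this abstract differ from ordinary random forests in the sampling step: each tree is trained on a bootstrap that draws equally from the rare positive class (definitions) and the abundant negative class. A sketch of that sampling step alone, with invented toy data; the trees themselves would come from any standard decision-tree learner:

```python
import random

def balanced_bootstrap(X, y, rng):
    """One balanced bootstrap: sample the same number of items (with
    replacement) from every class, so a tree trained on the result sees a
    class-balanced view of an imbalanced dataset."""
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    n_min = min(len(items) for items in by_class.values())
    sample_X, sample_y = [], []
    for cls, items in by_class.items():
        for _ in range(n_min):
            sample_X.append(rng.choice(items))
            sample_y.append(cls)
    return sample_X, sample_y

rng = random.Random(0)
X = list(range(100))
y = [1] * 5 + [0] * 95   # definitions are rare: heavy class imbalance
bx, by = balanced_bootstrap(X, y, rng)
```

Repeating this for each of the forest's trees and majority-voting their predictions gives the BRF classifier whose attribute selection and decision threshold the article tunes.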
An application supporting language analysis within the framework of the phonetic grammar
120 language analysis, phonetics, phonetic grammar, data-mining, distances Krzysztof Dyczkowski, Norbert Kordek, Paweł Nowakowski, Krzysztof Stroński, pages 359 – 362. Show abstract The aim of the paper is to present an application supporting language analysis within the framework of the phonetic grammar. The notion of the phonetic grammar is concisely introduced, and the basic potential of the application and the algorithms employed in it are briefly discussed. The application is intended to enable a uniform description and a comparative analysis of many languages. At the first stage the languages taken into consideration are Polish, Chinese and Hindi. -
Set of active suffix chains and its role in development of the MT system for Azerbaijani
61 Machine translation, formal parsing, Azerbaijani, Turkic languages group Rauf Fatullayev, Ali Abbasov, Abulfat Fatullayev, pages 363 – 368. Show abstract The process of defining the active suffix chains of Azerbaijani (the Azerbaijani language) is explained in this paper. Morphologically, Azerbaijani word-forms consist of a stem and simple or compound suffixes (suffix chains). Because Azerbaijani is an agglutinative language, its simple suffixes can form a great number of suffix chains and, consequently, it is possible to generate a practically “endless” number of word-forms from the same stem. When developing a machine translation system from Azerbaijani into non-Turkic languages, for the translation of any word-form it is necessary to translate its suffix chain as a whole. It is not possible to get the correct translation of a word-form by translating each simple suffix of its suffix chain separately and then putting these translations together. For these reasons, it is necessary to define the subset of suffix chains frequently used in texts instead of the set of all possible suffix chains. -
A New Word Sense Similarity Measure in WordNet
75 Semantic Similarity, Information Content, Lexical Database, WordNet Ali Sebti, Ahmad Abodollahzadeh Barfroush, pages 369 – 373. Show abstract Recognizing similarities between words is a basic element of computational linguistics and artificial intelligence applications. This paper presents a new approach for measuring semantic similarity between words via concepts. Our proposed measure is a hybrid system based on a new information content metric and an edge-counting-based tuning function. In the proposed system, the hierarchical structure is used to derive information content instead of a text corpus, and the result is improved by the edge-counting-based tuning function. The output of the system is evaluated against human similarity ratings and shows significant improvement compared with traditional similarity measures. -
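The general idea behind deriving information content (IC) from the hierarchy itself, as this abstract proposes, can be illustrated with an intrinsic IC of the form IC(c) = 1 − log(|hyponyms(c)| + 1)/log(N) and a Resnik-style similarity (IC of the most informative common ancestor). This is a sketch of the general family of measures on an invented WordNet fragment; the paper's exact metric and tuning function are not reproduced:

```python
import math

def intrinsic_ic(children, concept, total):
    """IC from taxonomy structure alone: concepts with many hyponyms
    are less informative (the root gets IC 0, leaves get IC 1)."""
    def hyponyms(c):
        subs = set()
        for ch in children.get(c, []):
            subs.add(ch)
            subs |= hyponyms(ch)
        return subs
    return 1.0 - math.log(len(hyponyms(concept)) + 1) / math.log(total)

def similarity(children, parent_of, a, b, total):
    """Resnik-style similarity: IC of the lowest common ancestor."""
    def ancestors(c):
        path = [c]
        while c in parent_of:
            c = parent_of[c]
            path.append(c)
        return path
    common = [c for c in ancestors(a) if c in set(ancestors(b))]
    return max(intrinsic_ic(children, c, total) for c in common)

# toy WordNet fragment (6 concepts)
children = {"entity": ["animal", "artifact"],
            "animal": ["dog", "cat"], "artifact": ["car"]}
parent_of = {"animal": "entity", "artifact": "entity",
             "dog": "animal", "cat": "animal", "car": "artifact"}
N = 6
sim_dog_cat = similarity(children, parent_of, "dog", "cat", N)
sim_dog_car = similarity(children, parent_of, "dog", "car", N)
```

"dog" and "cat" share the informative ancestor "animal", while "dog" and "car" meet only at the root, so the former pair scores higher — the behaviour the edge-counting tuning function then refines.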
A Linguistic Light Approach to Multilingualism in Lexical Layers for Ontologies
155 Alexander Troussov, John Judge, Mikhail Sogrin, Amine Akrout, Brian Davis, pages 375 – 379. Show abstract Semantic web ontologies are being increasingly used in modern text analytics applications and Ontology-Based Information Extraction (OBIE) as a means to provide a semantic backbone, either for modelling the internal conceptual data structures of the Text Analytics (TA) engine or for modelling the knowledge base that drives the analysis of unstructured information in raw text and the subsequent knowledge acquisition and population. Creating and targeting Language Resources (LRs) from a TA engine to an ontology can be time consuming and costly. In \cite{lrecpaper} the authors describe a user-friendly method for ontology engineers to augment an ontology with a lexical layer which provides a flexible framework to identify term mentions of ontology concepts in raw text. In this paper we explore multilinguality in these lexical layers using the same framework. We discuss a number of potential issues for the “linguistic light” Lexical Extensions for Ontologies (LEON) approach when looking at languages that are more morphologically rich and have more complex linguistic constraints than English. We show how the LEON approach can cope with these phenomena once the morphological normaliser used in the lexical analysis process is able to generalise sufficiently well for the language concerned. -
Modeling the Frequency of Phrasal Verbs with Search Engines
97 phrasal verbs, search engines, quantitative language modeling Dawid Weiss, Grażyna Chamielec, pages 381 – 388. Show abstract There are well over a thousand phrasal verbs in English. For non-native speakers they are notoriously difficult to remember and use in the right context. We tried to construct a ranking of phrasal verbs according to their estimated occurrence frequency, based on quantitative information available from the public indexable Web. Technically, we used major Web search engines to acquire phrase-occurrence statistics, measured consistency between the rankings implied by their results and confirmed that a rough set of `classes' of phrasal verbs can be distinguished. While this technique relies on inaccurate and possibly biased estimation functions, we show that the overall distribution of ranks seems to be consistent among all the queried search engines operated by different vendors.
8th International Multidisciplinary Conference on e-Commerce and e-Government
-
On stimulus for citizens' use of e-government services
46 e-government, motivation for use, EU27 Cene Bavec, pages 391 – 395. Show abstract The paper presents desk research on the interdependences between individual use of e-government services and a group of selected socio-economic indicators, results from public opinion polls on S&T, and work requirements in the EU27 countries. We identified six distinct groups of indicators that are significantly correlated with the use of e-government services: national innovativeness and competitiveness, regular use of the Internet, demanding and autonomous work, interest in innovations and S&T, data protection and security, and personal trust. Among other things, the research opens questions about the possible role of social capital in the public acceptance of e-government services. A deeper insight into the interdependences between the studied indicators also reveals that the old EU15 and the new EU member states in some cases demonstrate different behavior patterns. -
e-collaboration platform for the development of rural areas and enterprises
191 e-collaboration, rural SMEs, notification service, networked collaboration, living lab Grzegorz Kołaczek, Adam Turowiec, Dominik Kasprzak, Witold Hołubowicz, pages 397 – 401. Show abstract This paper presents the basic assumptions of the Collaboration@Rural project (C@R) supported by the EU's 6th Framework Programme for Research and Technological Development. Apart from discussing the primary objectives of the project―focused on supporting the development of rural areas by providing a network-based collaboration environment―it also shows the basic assumptions of a 3-layer reference model serving as the foundation for the C@R architecture. As an example of a service made available within such a networked collaboration environment, the implementation of a notification service component is presented. -
Corporate blog–innovative communication tool or another internet hype?―Empirical research study
127 communication, Internet, information asymmetry, corporate blogs Grzegorz Mazurek, pages 403 – 406. Show abstract In the following paper the role, potential and perception of corporate blogging among key marketing decision makers from the companies listed on the Warsaw Stock Exchange are presented. The topic of blogs has been widely promoted in recent theoretical and practical publications; however, very little information can be found on the scope of blog usage and the real impact of this modern communication tool on business. Issues such as limitations in corporate blog implementation, the perception of the blogs' information value and the expected potential of blog usage have been identified and explained. The author describes various models of corporate blogs and tries to find out whether blogs are truly used as a modern communication tool for business or are just another hype which will soon be replaced in the media by other ideas and concepts. -
An examination of factors affecting bidders' choice in electronic auctions
148 electronic auction, consumer behavior, eBay Costantinos Rougeris, George S. Androulakis, pages 407 – 413. Show abstract The recent increase in the number of services provided on the World Wide Web drives more and more consumers to e-commerce. In recent years, due to the vast increase in the number of e-stores (B2C), electronic marketplaces and electronic auction sites have become more popular. Many researchers have examined how buyers interact with auction facts and sellers during the procedure. Questions such as “how do millions of users decide about their e-bidding” and “what are the factors affecting them and what is their order of importance” are amongst the most significant ones in current research. In this paper these factors are initially identified using the auction literature and the eBay interface, and then expanded with the addition of a new factor (communication with the seller). Their weights derive from the statistical analysis of the answers given in a questionnaire that was filled in electronically by eBay users during the period February–March 2007. -
Integration of governmental services in semantically described processes in the Access-eGov system
144 egovernment, semantic service integration, process model, ontology Marek Skokan, Tomáš Sabol, Marián Mach, Karol Furdík, pages 415 – 419. Show abstract This paper describes a “user-centred” approach to the integration of services provided by the government in a “traditional” (i.e. face-to-face) or electronic way, applied in the EU R&D project Access-eGov. Individual (“atomic”) services are integrated on the semantic level (using semantic descriptions of existing services – either traditional or e-services) into a scenario whose realization leads to a solution of a problem faced by the end users (citizens or businesses) in a given life event (e.g. how to get a building permit, establish a company, etc.). Benefits for the end users are twofold: firstly, they are provided with higher value-added services – a scenario of services (consisting of a series of services, including their dependencies), not just a single service; secondly, the services in the scenario are personalised (i.e. adapted to the user's personal data and/or situation). In case some services in the scenario are available electronically, they can also be executed online, thus increasing the ultimate benefit to the end user. The first prototype of the Access-eGov platform was tested and evaluated in three pilot applications in three EU countries. -
Access-eGov―Personal Assistant of Public Services
159 public services, ontology, web services Magdalena Sroga, pages 421 – 427. Show abstract The aim of this article is to present the targeted research project Access-eGov, funded by the Sixth Framework Programme, priority 2 – Information Society Technologies. The article describes the Access-eGov architectural and logical design and the technologies used to achieve the project's aims. The project introduces a new, user-centric approach which uses predefined life events to solve citizens' problems that require the use of public services. The project's approach enables users to perform complex executive scenarios instead of separate public services. This approach concentrates on users' needs and helps to build semantic interoperability among e-government services. -
Values in internet shopping—methods and preliminary research
225 values, internet, model Jacek Wachowicz, Piotr Drygas, pages 429 – 432. Show abstract Internet shopping is becoming more and more popular. One of the most important questions for internet entrepreneurs seems to be how to encourage users to spend money. This is strictly connected with users' motivations. Therefore there is a strong need to learn what these motivations are and which factors, represented in product features, influence them. One of the most promising techniques in this field seems to be Hierarchical Value Maps. This paper describes the method and the outcomes of preliminary research. -
V-Commerce: The Potential of Virtual Worlds in Services
44 virtual world, Second Life, social networks Urszula Świerczyńska-Kaczor, pages 433 – 436. Show abstract The author assesses the suitability of virtual worlds for building an effective selling environment. In the analysis, key features of virtual worlds are discussed and compared with the way another type of virtual community functions: social networking websites. -
Intermediate information layer. The use of the SKOS ontology to create information about e-resources provided by the public administration
36 WKUP, natural language processing, semantic web, ontology, search engine Wojciech Górka, Michał Socha, Adam Piasecki, Jakub Gańko, pages 437 – 442. Show abstract Currently, information search is based on processing a large number of documents, indexing their contents, and then evaluating how well they match the question asked by the user. The development of the web makes it possible to offer on-line services for shopping, booking tickets or dealing with public-administration issues. The objective of the WKUP system (Virtual Consultant of Public Services) is to assist the user in the process of searching for and selecting services. The system offers natural language communication in the first stage of interaction. This functionality has been achieved by means of the SKOS ontology. The article presents a general outline of the WKUP system architecture and the functioning of the search engine, which interprets the user's natural-language questions semantically. The article describes the use of the SKOS ontology in the answer-search algorithm applied. -
Security Perception in E-commerce: a conflict between Customers' and Organizations' Perspectives
35 Security perception, tangible security features, intangible security features, qualitative research Mohanad Halaweh, Christine Fidler, pages 443 – 449. Show abstract Security is one of the principal and continuing concerns that restrict customers and organizations engaging in e-commerce. The aim of this paper is to explore the perception of security in B2C and C2C e-commerce websites from both customer and organisational perspectives. It explores factors that influence customers' perceptions of security. It also highlights conflicts between customer concerns with respect to security and those of an organization; existing research has not highlighted this issue greatly. This research provides a better understanding of customer needs and priorities on the subject of security, and enriches the currently available security perception literature, by providing new insights from empirical research conducted in Jordan. A qualitative research approach was adopted, since the research seeks an understanding of human (i.e., customer and organisational employee) perceptions.
4th Workshop on Large Scale Computations on Grids
-
Unicore 6 as a Platform for Desktop Grid
87 desktop grid, Unicore, middleware Jakub Jurkiewicz, Piotr Bała, Krzysztof Nowiński, pages 453 – 457. The paper shows the possibility of arranging a desktop grid based on UNICORE 6, a well-established grid middleware. The grid consists of a number of PC computers which can communicate with the server in a secure way and perform scheduled computational tasks. The main advantages of this system are ease of deployment, flexibility, and ease of integration with a large-scale grid. We also present the results of a simple performance test. -
Load balancing in the SOAJA Web Service Platform
221 Service Oriented Applications, Adaptative Applications, Load balancing, Grid Computing, Distributed Computing Richard Olejnik, Marek Tudruj, Bernard Toursel, Iyad Alshabani, Eryk Laskowski, pages 459 – 465. The aim of the Service Oriented Adaptative Java Applications (SOAJA) project is to develop a service-oriented infrastructure which enables efficient running of Java applications in complex, networked Grid computing environments. The SOAJA environment provides components and services for static and dynamic load balancing based on Java object observation. SOAJA can be used to design large-scale computing tasks that execute using the idle time of processor nodes. Java distributed applications consist of parallel objects which SOAJA allocates to Grid nodes at runtime. In this paper, we present mechanisms and algorithms for the automatic placement and adaptation of application objects in response to the evolution of resource availability. These mechanisms enable control of the granularity of data processing and of the distribution of the application on the Grid platform. -
Parallel Performance Prediction for Numerical Codes in a Multi-Cluster Environment
38 Parallel Distributed Algorithms; Grid Computing; Cluster Computing; Performance Evaluation and Prediction. Giuseppe Romanazzi, Peter K. Jimack, pages 467 – 474. We propose a model for describing and predicting the performance of parallel numerical software on distributed memory architectures within a multi-cluster environment. The goal of the model is to allow reliable predictions to be made as to the execution time of a given code on a large number of processors of a given parallel system, and on a combination of systems, by only benchmarking the code on small numbers of processors. This has potential applications for the scheduling of jobs in a Grid computing environment where informed decisions about which resources to use in order to maximize the performance and/or minimize the cost of a job will be valuable. The methodology is built and tested for a particular class of numerical code, based upon the multilevel solution of discretized partial differential equations, and despite its simplicity it is demonstrated to be extremely accurate and robust with respect to both the processor and communications architectures considered. Furthermore, results are also presented which demonstrate that excellent predictions may also be obtained for numerical algorithms that are more general than the pure multigrid solver used to motivate the methodology. These are based upon the use of a practical parallel engineering code that is briefly described. The potential significance of this work is illustrated via two scenarios which consider a Grid user who wishes to use the available resources either (i) to obtain a particular result as quickly as possible, or (ii) to obtain results to different levels of accuracy. -
On the Robustness of the Soft State for Task Scheduling in Large-scale Distributed Computing Environment
48 volunteer computing, desktop grid, bag-of-tasks application, task replication, Workqueue Harumasa Tada, Makoto Imase, Masayuki Murata, pages 475 – 480. In this paper, we consider task scheduling in distributed computing. In distributed computing, it is possible that tasks fail, and it is difficult to get accurate information about hosts and tasks. WQR (workqueue with replication), proposed by Cirne et al., is a good algorithm because it achieves a short job-completion time without requiring any information about hosts and tasks. However, in order to use WQR for distributed computing, we need to resolve some issues in task-failure detection and task cancellation. For this purpose, we examine two approaches: the conventional task-timeout method and the soft-state method. Simulation results showed that the soft-state method is more robust than the task-timeout method.
First International Symposium on Multimedia—Applications and Processing
-
Similar Neighborhood Criterion for Edge Detection in Noisy and Noise-Free Images
180 Edge Detection, Impulse Noise Ali Awad, Hong Man, pages 483 – 486. A novel approach for edge detection in noise-free and noisy images is presented in this paper. The proposed method is based on the number of similar pixels that each pixel in the image has amongst its neighbors in the filtering window, within a pre-defined intensity range. Simulation results show that the new detector performs well on noise-free images and is superior on images corrupted by salt-and-pepper impulse noise. Moreover, it is time efficient. -
Accurate Localization in Short Distance based on Computer Vision for Mobile Sensors
184 Mobile Sensor Network, Localization, Computer Vision, P3P Kwansik Cho, Ha Yoon Song, Jun Park, pages 487 – 493. In order to maximize the utilization of a mobile sensor network, formation of the sensor set and localization of each sensor node must be implemented; localization is one of the most important functionalities of mobile sensor nodes. In this paper, we present a technique which improves the relative localization of mobile sensor nodes with computer-vision technology. Although this technique is effective only at short distances and uses only low-priced sensors, we achieved precise localization at a resolution of 10 centimeters. The well-known perspective-3-point problem has been exploited for precise short-distance localization. By experiment, we present the interrelation between the camera's angle of view and the LED pattern interval. Each vehicle measures the distance to the counterpart vehicle, and vehicles share information on the distances to obstacles and to the other vehicles through cooperation. The orientation of a vehicle can be identified by a digital compass. Finally, by sharing location information we can achieve localization of mobile sensor nodes with high accuracy. -
Character Recognition Using Radon Transformation and Principal Component Analysis in Postal Applications
181 Handwritten character recognition, Radon transformation, Principal Component Analysis Mirosław Miciak, pages 495 – 500. This paper describes a method of handwritten character recognition and the experiments carried out with it. The characters used in our experiments are the numeric characters used in the postal codes of mail pieces. The article covers basic image processing of the character and calculation of the characteristic features on the basis of which it will be recognized. The main objective of this article is to use the Radon Transform and Principal Component Analysis to obtain a set of features which are invariant under translation, rotation, and scaling. Sources of errors as well as possible improvements of the classification results are also discussed. -
Quasi-One-Way Function and Its Applications to Image Watermarking
110 watermark, SVD, one-way function, inversion attack Kazuo Ohzeki, Engyoku Gi, pages 501 – 508. This paper describes a one-way function for use in an image watermark to improve authentication ability. The one-way function is popular in cryptography and can be used to prove encryption security. The existence of the one-way function has not been proved yet, while public-key encryption is widely used in practical commerce. The authors propose a quasi-one-way function, which is weakly defined for practical use. The differences between the strict one-way function and the proposed quasi-one-way function are discussed. Applications of the quasi-one-way function to image watermarking are shown. Two watermarking systems, one using the SVD method and one using the Jordan canonical form, are tested. The robustness of the proposed watermarking method is high. -
Question generation for learning evaluation
73 question generator, concept maps Cosmin Stoica Spahiu, Liana Stanescu, Anca Ion, pages 509 – 513. In the last decade, electronic learning has become a very useful tool in the education of students from different activity domains. Studies have indicated that students substantially appreciate the e-learning method thanks to its facilities: easy access to information, better storage of didactic material, curricula harmonization between universities, and personalized instruction. The paper presents a software tool that can be used in the e-learning process to automatically generate questions from course materials, based on a series of tags defined by the professor. The Test Creator tool permits the generation of questions from the electronic materials that students have. The solution requires teachers to manage a series of tags and templates; these tags are used to generate questions automatically.
International Conference on Principles of Information Technology and Applications
-
Correctness issues of UML Class and State Machine Models in the C# Code Generation and Execution Framework
100 state machines, UML, MDE, code generation, C# Anna Derezinska, Romuald Pilitowski, pages 517 – 524. A model-driven approach to program development can assist in the quick generation of complex and highly reliable applications. The Framework for eXecutable UML (FXU) transforms UML models into C# source code and supports execution of an application reflecting the behavioral model. The framework consists of two parts: a code generator and a run-time library. The generated and executed code corresponds to the structural model specified in class diagrams and the behavioral model described by the state machines of these classes. All state machine concepts included in the UML 2.0 specification (and later) are taken into account, including all kinds of events, states, pseudostates, submachines, etc. The paper discusses the correctness issues of class and state machine models that have to be decided in the framework in order to run a model-related, high-quality C# application. The solution was tested on a set of UML models. -
Designing new XML Based Multidimensional Messaging Interface for the new XWarehouse Architecture
59 Data Warehouse, Multidimensional, Cube, XML, XWarehouse, OLAP Query, XDXML Ahmed Bahaa Farid, Ahmed Sharaf Aldin Ahmed, Yehia Mostafa Helmy, pages 525 – 534. OLAP (Online Analytical Processing) applications have very special requirements for the underlying multidimensional data that differ significantly from other areas of application (e.g., the existence of highly structured dimensions). In addition, providing access and search among multiple, heterogeneous, distributed, and autonomous data warehouses, especially web warehouses, has become one of the leading issues in data warehouse research and industry. This paper proposes a new message interface for a new platform-independent data warehouse architecture that can deliver location, platform, and schema transparency for clients that access autonomous data warehouses. The new message interface uses XML to provide an interoperable way to query and administer federated data warehouses, as well as to compose multidimensional query result sets. -
PAFSV: A Process Algebraic Framework for SystemVerilog
24 SystemVerilog, process algebras, formal semantics, PAFSV, formal specification and analysis Ka Lok Man, pages 535 – 542. We develop a process algebraic framework, called process algebraic framework for IEEE 1800™ SystemVerilog (PAFSV), for formal specification and analysis of IEEE 1800™ SystemVerilog designs. The formal semantics of PAFSV is defined by means of deduction rules that associate a time transition system with a PAFSV process. A set of properties of PAFSV is presented for a notion of bisimilarity. PAFSV may be regarded as the formal language of a significant subset of IEEE 1800™ SystemVerilog. To show that PAFSV is useful for the formal specification and analysis of IEEE 1800™ SystemVerilog designs, we illustrate the use of PAFSV with some examples: a MUX, a synchronous reset D flip-flop and an arbiter. -
Dealing with redundancies and dependencies in normalization of XML data
112 XML databases, XML functional dependencies, normalization, database quality Tadeusz Pankowski, Tomasz Piłka, pages 543 – 550. In this paper we discuss the problem of redundancies and data dependencies in XML data when an XML schema is to be normalized. Normalization is one of the main tasks in relational database design, where 3NF or BCNF is to be reached. However, neither of them is ideal: 3NF preserves dependencies but may not always eliminate redundancies; BCNF, on the contrary, always eliminates redundancies but may not preserve constraints. We discuss the possibility of achieving a form of XML schema that is both redundancy-free and dependency-preserving. We show how the XML normal form can be obtained for a class of XML schemas and a class of XML functional dependencies. -
Separation of crosscutting concerns at the design level: an extension to the UML metamodel
104 Aspect-Oriented Modeling, Aspect-Oriented Design Adam Przybyłek, pages 551 – 557. Aspect-oriented programming (AOP) was proposed as a way of improving the separation of concerns at the implementation level by introducing a new kind of modularization unit, the aspect. Aspects allow programmers to implement crosscutting concerns in a modular and well-localized way. As a result, the well-known phenomena of code tangling and scattering are avoided. After a decade of research, AOP has gained acceptance within both academia and industry. The current challenge is to incorporate aspect-oriented (AO) concepts into the software design phase. Since AOP is built on top of OOP, it seems natural to adapt UML to AO design. In this context the author introduces an extension to the UML metamodel to support aspect-oriented modelling. -
Distributed Internet Systems Modeling Using TCPNs
139 Performance Engineering, Distributed Internet Systems, Queueing Systems, TCPN Tomasz Rak, Slawomir Samolej, pages 559 – 566. This paper presents a Timed Coloured Petri Nets based programming tool that supports modeling and performance analysis of distributed World Wide Web environments. A distributed Internet system model, initially described in compliance with Queueing Theory (QT) rules, is mapped onto the Timed Coloured Petri Net (TCPN) structure by means of queueing system templates. Then, it is executed and analyzed using Design/CPN toolset. The proposed distributed Internet systems modeling and design methodology has been applied for evaluation of several system architectures under different external loads.
International Workshop on Real Time Software
-
Real Time Behavior of Data in Distributed Embedded Systems
133 real time data, validation, state transition system Tanguy Le Berre, Philippe Mauran, Gérard Padiou, Philippe Quéinnec, pages 569 – 575. Nowadays, most embedded systems are becoming distributed systems structured as a set of communicating components. Therefore, they display a less deterministic global behavior than centralized systems, and their design and analysis must address both computation and communication scheduling in more complex configurations. We propose a modeling framework centered on data. More precisely, the interactions between the data located in components are expressed in terms of a so-called observation relation. This abstraction is a relation between the values taken by two variables, the source and the image, where the image gets past values of the source. We extend this abstraction with time constraints in order to specify and analyze the availability of timely sound values. The formal description of the observation-based computation model is stated using the formalism of transition systems. Real time is introduced as a dedicated variable. As a first result, this approach allows the designer to focus on specifying the time constraints attached to data and to postpone task and communication scheduling matters. At this level of abstraction, the designer has to specify time properties about the timeline of data, such as their freshness, stability, latency… As a second result, a verification of the global consistency of the specified system can be performed automatically. A forward or backward approach can be chosen: the verification process can start either from the timed properties (e.g., the period) of data inputs or from the timed requirements (e.g., the latency) of data outputs. As a third result, communication protocols and task scheduling strategies can be derived as a refinement towards an actual implementation. -
Compact RIO Embedded System in Power Quality Analysis Area.
140 Compact RIO, FPGA, VxWorks, LabVIEW, Power Quality Petr Bilik, Ludvik Koval, Jiri Hajduk, pages 577 – 580. The Electrical Measurement Department of VSB-Technical University has been involved for more than 14 years in the research and development of a Power Quality Analyzer built on virtual instrumentation technology. A PC-based power quality analyzer with a National Instruments data acquisition board was designed and developed in this time frame. National Instruments LabVIEW is used as the development environment for all parts of the power quality analyzer software running under the MS Windows OS. The proven PC-based firmware was ported to a new hardware platform for virtual instrumentation, National Instruments CompactRIO, at the end of 2007. The platform change from PC to CompactRIO is not just a code recompilation; it brings many needs for specific software redesigns. The paper describes how the monolithic executable for PC-based instruments was divided into three software layers to be ported to the CompactRIO platform. The code for the different parts of the CompactRIO instrument is developed in a unified development environment, no matter whether the code is intended for the FPGA, the real-time processor, or a PC running the Windows OS. -
Real-time Task Reconfiguration Support Applied to an UAV-based Surveillance System
130 Dynamic reconfiguration, aspect-oriented paradigm, UAV, task schedule, heterogeneous platform profiling Alécio Pedro Delazari Binotto, Edison Pignaton de Freitas, Carlos Eduardo Pereira, André Stork, Tony Larsson, pages 581 – 588. Modern surveillance systems, such as those based on Unmanned Aerial Vehicles, require powerful high-performance platforms to deal with many different algorithms that make use of massive calculations. At the same time, low-cost, high-performance specialized hardware (e.g., GPUs, PPUs) is emerging and CPUs have turned to multiple cores, together characterizing an interesting and powerful heterogeneous execution platform. Reconfigurable computing is therefore a promising paradigm for such scenarios, as it provides the flexibility to explore the computational resources of a heterogeneous cluster attached to a high-performance computer system. As the first step towards a run-time reconfigurable workload-balancing framework targeting that kind of platform, application time requirements and their crosscutting behavior play an important role in task-allocation decisions. This paper presents a strategy to reallocate specific tasks in a surveillance system composed of a fleet of Unmanned Aerial Vehicles, using aspect-oriented paradigms in order to address non-functional application timing constraints in the design phase. Aspect support from a framework called DERAF is used to handle reconfiguration requirements and provide the resource information needed by the reconfigurable load-balancing strategy. Finally, in the case study, special attention is given to Radar Image Processing. -
Student's Contest: Self-Driven Slot Car Racing
45 education, contest, self-driven slot car, track mapping, accelerometer, microcontroller, DC motor Milan Brejl, Jaroslav Niečesaný, pages 589 – 592. A recently announced university student contest is based on a well-known entertainment: slot car racing. In contrast to classical racing, here the challenge is to build a car which can drive on an unknown track without any human interaction and achieve the best possible time. This document describes the technical, algorithmic, and educational aspects of developing a self-driven slot car. -
Real-time Support in Adaptable Middleware for Heterogeneous Sensor Networks
78 Sensor Networks; Adaptable/Reflective Middleware; Heterogeneous Networks; UAV-based Sensors Edison Pignaton de Freitas, Marco Aurelio Wehrmeister, Carlos Eduardo Pereira, Tony Larsson, pages 593 – 600. The use of sensor networks in different kinds of sophisticated applications is emerging due to several advances in sensor and embedded-system technologies. However, the integration and coordination of heterogeneous sensors is still a challenge, especially when the target application environment is susceptible to changes. Such systems must adapt themselves in order to fulfil increasing requirements. Handling real-time requirements is especially challenging in this context, in which different technologies are applied to build the overall system. Moreover, these changing scenarios require services located at different places during the system's runtime, so support for adaptability is needed. Timing and precision requirements play an important role in such scenarios, and QoS management must provide the support necessary to offer the demanded flexibility. In this paper we present the real-time perspective of a middleware that aims at providing the support required by sophisticated heterogeneous sensor network applications. We propose to address the real-time concerns by using the OMG Data Distribution Service for Real-time Systems, but with a more flexible approach that fits the heterogeneous environment in which the proposed middleware is intended to be used. We also present a coordination protocol to support the proposed approach. -
Embedded control systems design based on RT-DEVS and temporal analysis using Uppaal
62 DEVS, real-time constraints, embedded control systems, model continuity, temporal analysis, timed automata, model checking, Java Angelo Furfaro, Libero Nigro, pages 601 – 608. This work is concerned with the modelling, analysis and implementation of embedded control systems using RT-DEVS, i.e., a specialization of classic DEVS (Discrete Event System Specification) for real time. RT-DEVS favours model continuity, i.e., the possibility of using the same model for property analysis (by simulation or model checking) and for real-time execution. Dedicated tools have been proposed in the literature for RT-DEVS model analysis and design. In this work, temporal analysis exploits an efficient translation into UPPAAL timed automata. The paper shows an embedded control system model and its exhaustive verification. For large models, a simulator was realized in Java which stems directly from the RT-DEVS operational semantics. The same concerns are at the basis of a real-time executive. The paper discusses the implementation status and, finally, indicates research directions which deserve further work. -
Study of different load dependencies among shared redundant systems
123 Shared redundancy, Dependability, Networked control systems Ján Galdun, Jean-Marc Thiriet, Ján Liguš, pages 609 – 615. The paper presents the features and implementation of a shared-redundancy approach to increase the reliability of networked control systems. Common approaches based on redundant components in control systems use passive or active redundancy. We deal with quasi-redundant subsystems (shared redundancy), whose basic features are introduced in the paper. This type of redundancy offers several important advantages, such as minimizing the number of components as well as increasing reliability. The example of a four-rotor mini-helicopter is presented in order to show reliability improvement without using any additional redundant components. The main aim of this paper is to show the influence of increasing the load following different scenarios. The results could help to determine the applications where quasi-redundant subsystems are a good solution for remaining at a significant reliability level even if a critical failure appears. -
Runtime resource assurance and adaptation with Qinna framework: a case study
77 Component Based Systems, Multimedia application, Resource management at runtime, evaluation Laure Gonnord, Jean-Philippe Babau, pages 617 – 624. Even if hardware improvements have increased the performance of embedded systems in recent years, resource problems are still acute. The persisting problem is the constantly growing complexity of systems. New service devices such as PDAs or smartphones increase the need for flexible and adaptive open software. Component-based software engineering tries to address these problems, and one key point for development is the Quality of Service (QoS) arising from resource constraints. In this paper, we recall the concepts behind Qinna, a component-based QoS architecture designed to manage QoS issues, and we illustrate the development of an image viewer application within this framework. We focus on the general development methodology for resource-aware applications with the Qinna framework, from the specification of resource constraints to the use of Qinna's generic algorithms for negotiating QoS contracts at runtime. -
Real-time Control Teaching Using LEGO® MINDSTORMS® NXT Robot
196 real-time, education, control, Mindstorms Wojciech Grega, Adam Piłat, pages 625 – 628. A current trend in learning programs, even in high-degree studies, is applying the concept of “learning by projects”. In this context, the LEGO® MINDSTORMS® NXT modular robot appears as a simple, flexible, and attractive educational platform for meeting this challenge in many domains of information technology, especially in real-time control. A project team can operate a real system (e.g., a constructed robot) to realize a number of particular tasks. The ready-to-use embedded platform and dual-microcontroller architecture represent current trends in modern control system design. The open hardware and software architecture gives broad possibilities to handle all devices connected to the main control brick and to write effective control algorithms. The proposed rescue-robot project is a good example of a simple system where a number of real-time control problems can be analyzed. A project on this scale requires detailed planning, cooperation in teams, an extensive literature survey, and comprehensive software design. That is exactly what the “learning by projects” concept needs. -
A Safety Shell for UML-RT Projects
26 real-time systems, embedded systems, UML profiles and patterns, safety shell Roman Gumzej, Wolfgang A. Halang, pages 629 – 632. A safety shell pattern was defined based on a re-configuration management pattern and inspired by the architectural specifications in Specification PEARL. It is meant to be used for real-time applications developed with UML-RT, as described. The implementation of the safety shell features defined in [8], namely its timing and state guards as well as its I/O protection and exception handling mechanisms, is explained. The pattern is parameterised by defining the properties of its components as well as the mapping between the software and hardware architectures. Initial and alternative execution scenarios, as well as the method for switching between them, are defined. The goal pursued with the safety shell is to obtain clearly specified operation scenarios with well-defined transitions between them. To achieve safe and timely operation, the pattern must provide safety shell mechanisms for the application designed, i.e., enable its deterministic and temporally predictable operation now and in the future. -
An RSIC-SE2004 Curriculum Framework
65 real time control, software engineering, curricula Thomas Hilburn, Andrew Kornecki, Wojciech Grega, Jean-Marc Thiriet, Miroslav Sveda, pages 633 – 638. This paper addresses the problem of educating software engineering professionals who can efficiently and effectively develop real-time software-intensive control systems in the global community. A framework for developing curricula that support such education is presented. The curriculum framework is based on the work of two education projects: the ILERT (International Learning Environment for Real-Time Software Intensive Control System) project and the software engineering efforts of the ACM/IEEE-CS Joint Task Force on Computing Curricula. The authors describe a curriculum framework that integrates principles, content, and organization from the two projects, and which satisfies the intent and requirements of both. -
You Can't Get There From Here! Problems and Potential Solutions in Developing New Classes of Complex Computer Systems
224 complex systems, autonomic computing, formal methods Michael Hinchey, James Rash, Walter Truszkowski, Christopher Rouff, Roy Sterritt, pages 639 – 647. The explosion of capabilities and new products within the sphere of Information Technology (IT) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who at this late date have not succeeded in finding a way to overcome the lack of an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts that involve large swarms of small spacecraft that will engage cooperatively to achieve science goals. Such missions entail levels of complexity that beg for new methods for system development far beyond today's methods, which are inadequate for ensuring correct behavior of large numbers of interacting intelligent mission elements. New system development techniques recently devised through NASA-led research will offer some innovative approaches to achieving correctness in complex system development, including autonomous swarm missions that exhibit emergent behavior, as well as general software products created by the computing industry. -
Minos—The design and implementation of an embedded real-time operating system with a perspective of fault tolerance
85 Real Time Operating System, Fault Tolerance Thomas Kaegi-Trachsel, Juerg Gutknecht, pages 649 – 656. This paper describes the design and implementation of a small real-time operating system (OS) called Minos and its application in an onboard active-safety project for General Aviation. The focus of the operating system is predictability, stability, safety, and simplicity. We introduce fault-tolerance aspects in software through the concept of a very fast reboot procedure and an error-correcting flight data memory (FDM). In addition, fault tolerance is supported by custom-designed hardware. -
Simulator Generation Using an Automaton Based Pipeline Model for Timing Analysis
132 Embedded real-time systems, cycle-accurate simulator, pipeline modeling, Architecture Description Language. Rola Kassem, Mikaël Briday, Jean-Luc Béchennec, Guillaume Savaton, Yvon Trinquet, pages 657 – 664. Show abstract Hardware simulation is an important part of the design of embedded and/or real-time systems. It can be used to compute the Worst Case Execution Time (WCET) and to provide a means to run software when the final hardware is not yet available. Building a simulator is a long and difficult task, especially when the architecture of the processor is complex. This task can be alleviated by using a Hardware Architecture Description Language and generating the simulator. In this article we focus on a technique to generate an automaton-based simulator from the description of the pipeline. The description is transformed into an automaton and a set of resources which, in turn, are transformed into a simulator. The goal is to obtain a cycle-accurate simulator to verify timing characteristics of embedded real-time systems. An experiment compares an Instruction Set Simulator with and without the automaton-based cycle-accurate simulator. -
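As a rough illustration of the automaton idea (not the paper's generated simulator, and with hazards and structural resources simplified away), a pipeline can be stepped one transition per clock cycle, moving an instruction forward whenever the next stage is free:

```python
# Illustrative pipeline automaton (not the paper's generated simulator):
# the state is the occupancy of each stage, and one transition per clock
# cycle moves an instruction forward when the next stage is free.

def simulate(program, stages=("IF", "ID", "EX", "WB")):
    """Count the cycles needed to drain a list of instructions through the pipeline."""
    pipe = [None] * len(stages)        # pipe[i]: instruction occupying stage i
    pending, cycles = list(program), 0
    while pending or any(pipe):
        # advance from the last stage backwards so each move sees a free slot
        for i in range(len(stages) - 2, -1, -1):
            if pipe[i] is not None and pipe[i + 1] is None:
                pipe[i + 1], pipe[i] = pipe[i], None
        if pending and pipe[0] is None:
            pipe[0] = pending.pop(0)   # fetch the next instruction
        cycles += 1
        pipe[-1] = None                # the instruction in WB completes
    return cycles

print(simulate(["i1", "i2", "i3"]))  # 3 instructions, 4 stages -> 6 cycles
```

A real cycle-accurate simulator would add resource conflicts and hazard edges to the automaton; the point here is only that one automaton transition corresponds to one clock cycle.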
Software Certification for Safety-Critical Systems: A Status Report
226 Software certification, safety-critical systems, real-time systems Andrew Kornecki, Janusz Zalewski, pages 665 – 672. Show abstract This paper presents an overview and the role of certification in safety-critical computer systems focusing on software and hardware use in the domain of civil aviation. It discusses certification activities according to RTCA DO-178B “Software Considerations in Airborne Systems and Equipment Certification” and RTCA DO-254 “Design Assurance Guidance for Airborne Electronic Hardware.” Specifically, certification issues in real-time operating systems, programming languages, software development tools, complex electronic hardware and tool qualification are discussed. Results of an independent industry survey done by the authors are also presented. -
Modeling Real-Time Database Concurrency Control Protocol Two-Phase-Locking in Uppaal
156 real-time database systems, pessimistic protocol, 2-phase-locking, timed automata, model checking, verification, verification tool, Uppaal Martin Kot, pages 673 – 678. Show abstract Real-time database management systems (RTDBMS) have recently been the subject of intensive research, and model checking algorithms and verification tools are receiving considerable attention as well. In this paper we show some possibilities of using the verification tool Uppaal on variants of the pessimistic concurrency control protocols used in real-time database management systems. We present possible models of such protocols expressed as networks of timed automata, the modeling language of Uppaal. -
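A minimal sketch of the two-phase-locking property that such models verify (illustrative only, not the paper's Uppaal model): each transaction has a growing phase in which it only acquires locks, followed by a shrinking phase in which it only releases them.

```python
# Sketch of the 2PL protocol property (not the paper's timed-automata model):
# a transaction may not acquire any lock after its first release.

def obeys_two_phase_locking(ops):
    """ops: sequence of ('lock'|'unlock', item) pairs for one transaction."""
    shrinking = False
    for op, _item in ops:
        if op == "unlock":
            shrinking = True           # the shrinking phase has begun
        elif shrinking:                # a lock after any unlock breaks 2PL
            return False
    return True

print(obeys_two_phase_locking([("lock", "x"), ("lock", "y"),
                               ("unlock", "x"), ("unlock", "y")]))  # True
print(obeys_two_phase_locking([("lock", "x"), ("unlock", "x"),
                               ("lock", "y")]))                      # False
```

In the Uppaal setting, this invariant is checked over all interleavings of the transaction automata rather than over a single given trace, and with timing constraints added.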
Distributed Scheduling for Real-Time Railway Traffic Control
170 scheduling, train traffic control, distributed scheduling Tiberiu Letia, Mihai Hulea, Radu Miron, pages 679 – 685. Show abstract Increasing railway traffic efficiency and flexibility requires new real-time scheduling and control methods. New charter trains have to be added continuously without disturbing the other (periodic) train movements or degrading safety. A distributed method for scheduling new trains so that their real-time constraints are fulfilled is presented. The trains have timetables to meet, and their deadlines are derived from these, taking the included laxity into account. The trains have pre-established routes specifying the stations and required arrival times. The paths containing the block sections from one station to another are allocated dynamically without leading to deadlocks. -
Wireless Sensor and Actuator Networks: Characterization and Case Study for Confined Spaces Healthcare Applications
79 Sensor Networks, wireless Diego Martinez, Francisco Blanes, Jose Simo, Alfons Crespo, pages 687 – 693. Show abstract Nowadays, developments of Wireless Sensor and Actuator Network (WSAN) applications are determined by the fulfillment of constraints imposed by the application. For this reason, this work presents a characterization of WSAN applications in the health, environmental, agricultural and industrial sectors. A case study on detecting heart arrhythmias in non-critical patients during rehabilitation sessions in confined spaces is presented, and finally an architecture for the network and nodes in these applications is proposed. -
Real-Time Service-oriented Communication Protocols on Resource Constrained Devices
124 SODA, Web services, DPWS, WS4D, deeply embedded, real-time Guido Moritz, Steffen Prüter, Frank Golatowski, Dirk Timmermann, pages 695 – 701. Show abstract Service-oriented architectures (SOA) are becoming increasingly important for connecting devices with each other; their main advantages are a higher abstraction level and the interoperability of devices. In this area, Web services have become an important standard for communication between devices. However, this technology is only available on devices with sufficient resources; embedded devices are often excluded from the deployment of Web services due to a lack of computing power and memory. Furthermore, embedded devices often require real-time capabilities for communication and process control. This paper presents a new table-driven approach to handling real-time capable Web services communication on embedded hardware through the Devices Profile for Web Services. (This work was partly supported by the ITEA LOMS project.) -
Task jitter measurement under RTLinux and RTX operating systems, comparison of RTLinux and RTX operating environments
122 Linux, Windows, Real-time, Hardware Abstraction Layer, RTLinux, RTX, Jitter, Latency, Workload effects, Measurement, Experimental results Pavel Moryc, Jindřich Černohorský, pages 703 – 709. Show abstract This paper compares task jitter measurements performed under the RTLinux and RTX hard real-time operating systems. Both operating environments are Hardware Abstraction Layer extensions to general-purpose operating systems and make it possible to create a real-time system according to the POSIX 1003.13 PSE54 profile (multi-purpose system). The paper focuses on the discussion of experimental results obtained on PC hardware and on their interpretation. -
Multilevel Localization for Mobile Sensor Network Platforms
185 Mobile Sensor Network, Localization, RSSI, Computer Vision, Trajectory, Mobile Sensor Vehicle Jae-Young Park, Kwansik Cho, Ha Yoon Song, Jun Park, pages 711 – 718. Show abstract For a mobile sensor network, precise localization is required in order to maximize the network's utilization; mobile robots likewise need a precise localization mechanism for the same reason. In this paper, we show a combination of various localization mechanisms. Localization can be classified into three broad categories: long-distance localization with low accuracy, medium-distance localization with medium accuracy, and short-distance localization with high accuracy. To represent localization results, traditional map building technologies such as grid maps or topological maps can be used. We implemented mobile sensor vehicles and composed a mobile sensor network from them. Each mobile sensor vehicle acts as a mobile sensor node with facilities such as autonomous driving, obstacle detection and avoidance, map building, communication via wireless network, image processing and extensibility with multiple heterogeneous sensors. For localization, each mobile sensor vehicle supports location awareness through mobility-trajectory-based localization, RSSI-based localization and computer-vision-based localization. With this mobile sensor network, we can demonstrate various localization mechanisms and their effectiveness. In this paper, preliminary results of mobility-trajectory-based localization and RSSI-based localization are presented. -
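One ingredient of RSSI-based localization can be sketched with the standard log-distance path-loss model, which inverts a received signal strength into a range estimate. The parameter values below (reference power, path-loss exponent) are illustrative assumptions, not the paper's calibration:

```python
# Hedged sketch of RSSI ranging via the log-distance path-loss model:
# RSSI(d) = P_ref - 10 * n * log10(d), inverted to estimate distance d.
# P_ref (RSSI at 1 m) and the path-loss exponent n are illustrative.

def rssi_to_distance(rssi_dbm, ref_power_dbm=-40.0, n=2.0):
    """Estimate distance in metres from a measured RSSI value."""
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10 * n))

print(rssi_to_distance(-40.0))  # ~1 m at the reference power
print(rssi_to_distance(-60.0))  # ~10 m with path-loss exponent 2
```

In practice, ranges from several anchors would then be combined (e.g. by trilateration) and fused with trajectory- and vision-based estimates, as the abstract describes.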
A Component-based Approach to Verification of Embedded Control Systems using TLA
143 temporal logic, model-checking, synchronous systems, embedded components Ondrej Rysavy, Jaroslav Rab, pages 719 – 725. Show abstract A method for writing TLA+ specifications that obey the formal model Masaccio is presented in this paper. The specifications consist of components, which are built from atomic components by parallel and serial composition. Using a simple example, it is illustrated how to write specifications of atomic components and of components that are products of parallel or serial composition. The specifications have the standard form of TLA+ specifications and hence are amenable to automatic verification using the TLA+ model checker. -
Algorithm for Real Time Faces Detection in 3D Space
72 stereo face detection Rauf Sadykhov, Denis Lamovsky, pages 727 – 732. Show abstract This paper presents an algorithm for stereo-face detection in video sequences. A stereo-face is a face of a person represented by a set of images obtained from different points of view. Such data can be used for face structure estimation. Our algorithm is based on a computationally efficient method of face detection in a mono image. Information about face positions is then combined using a sparse stereo matching algorithm. Sparse means that stereo correspondence is estimated not for all scene points but only for points of interest, which keeps the computational cost of the algorithm low. We use several criteria to estimate correspondence: the epipolar constraint, size correspondence, a 3D region-of-interest constraint and histogram correspondence. An object distance estimation method that does not use projective transformations of the stereo planes is also considered. -
Reliability of Malfunction Tolerance
218 real time systems, reliability, fault tolerance, computer design Igor Schagaev, pages 733 – 737. Show abstract A generalized algorithm of fault tolerance using time, structural and information redundancy is presented. It is shown that the algorithm of fault tolerance might be implemented using hardware and software. It is also shown that, for the design of an efficient fault-tolerant system, its elements must be malfunction tolerant. The advantage of element malfunction tolerance is proven in reliability terms. Reliability analysis of a fault-tolerant system is performed with deliberate separation of malfunctions and permanent faults. A special function of reliability gain is introduced and used to model system reliability. Maximum reliability of a fault-tolerant system is achievable with less than duplication and depends on the malfunction/permanent-fault ratio and the coverage of faults. Reconfiguration after permanent faults remains the prerogative of the system level of fault tolerance. -
Rapid Control Prototyping with Scilab/Scicos/RTAI for PC-based and ARM-based Platforms
157 Real-time systems, Linux, RTAI, servomechanism, Rapid Control Prototyping, Scilab/Scicos, ARM processor, Embedded Systems Grzegorz Skiba, Tomasz Żabiński, Andrzej Bożek, pages 739 – 744. Show abstract This document describes three didactic systems with a Rapid Control Prototyping (RCP) suite based on Scilab/Scicos and a PC computer. Servomechanisms with wired signal transmission and PID as well as fuzzy controllers are presented; a servomechanism with wireless signal transmission is also described. As the RCP suite based on Scilab/Scicos/RTAI was successfully used in servo controller development on a PC platform, the concept of a tool-chain for RCP on an embedded platform with an ARM processor is discussed. The TS-7300 embedded system with the RTAI real-time operating system is described, and the changes in Scilab/Scicos needed to interface the generated controller code to the TS-7300 are presented. The didactic and possible commercial uses of the tool-chain are indicated; the didactic use focuses on controller tuning, the influence of friction and long delays on servomechanism control, and embedded control systems development. -
Real Time System in Aviation
174 control process, embedded system, graphic interface, industrial bus, real-time operating system, sensors, ultralight airplane Vilem Srovnal, Jiri Kotzian, pages 745 – 750. Show abstract This paper presents the development of hardware and software for a low-cost avionic system for ultralight airplanes. Three levels of hardware and software system design are shown. As far as the software is concerned, we focused on the changeover from a non-real-time operating system (embedded Linux) to a real-time embedded platform (QNX); we discuss the problems that led to the operating system change, and various advantages and disadvantages of both operating systems are presented in this contribution. Concerning the hardware, we concentrated on the development of the avionic control, monitoring and display modules that are components of the dashboard. The paper focuses on safety and reliability in ultralight aviation and on what can be improved or extended by the real-time operating system. -
Towards Self-Managing Real-Time Systems
228 real-time systems, autonomic computing Roy Sterritt, Mike Hinchey, pages 751 – 756. Show abstract Upon first consideration, the Self-Managing Systems (and specifically Autonomic Systems) paradigm may seem to add too much additional functionality and overhead for systems that require strict real-time behavior. From our experience, not only does the Real-Time Systems area benefit from autonomicity, but the autonomic systems initiative requires the expertise of the RTS community in order to achieve its overarching vision. We briefly highlight some of the design considerations we have taken in incorporating self-management into Real-Time Systems. -
IEC Structured Text programming of a small Distributed Control System
70 IEC 61131-3, CPDev, Control Program Developer, SMC, miniDCS Leszek Trybus, Dariusz Rzońca, Jan Sadolewski, Andrzej Stec, Zbigniew Świder, pages 757 – 760. Show abstract A prototype environment called CPDev for programming small distributed control-and-measurement systems in the Structured Text (ST) language of the IEC 61131-3 standard is presented. The environment consists of a compiler, a simulator and a configurer of hardware resources, including communications. Programming a mini-DCS (Distributed Control System) from LUMEL Zielona Góra is the first application of CPDev. -
Web-based Laboratories: How Does That Affect Pedagogy? Thoughts for a Panel Discussion
229 Web-based labs, online labs, real-time systems education Janusz Zalewski, pages 761 – 762. Show abstract The author presents a few questions for discussion at the Panel on web-based laboratories. The key issue is whether and how such laboratories can enhance the pedagogy. Is this only a technological trend or are there real values in creating online labs for real-time systems education? Answers to sample questions from the literature are reviewed. -
Improving Energy-Efficient Real-Time Scheduling by Exploiting Code Instrumentation
131 DVS, power-aware, energy, real-time, scheduling Thorsten Zitterell, Christoph Scholl, pages 763 – 771. Show abstract Dynamic frequency and voltage scaling is a promising technique to save energy in real-time systems. In this work we present a novel light-weight energy-efficient EDF scheduler, designed for processors with discrete frequencies, which performs on-line intra- and inter-task frequency scaling at the same time. An intra-task scheduling scheme based on the cycle counters of a processor allows the application of our approach to shared code of library functions and to task setups where only sparse intra-task information is available. Our `Intra-Task Characteristics Aware EDF' (ItcaEDF) scheduler, which aims to run at a low frequency by eliminating idle time and inter- and intra-task slack times, was evaluated in a compiler, WCET-analysis and simulation framework. Our experiments show that state-of-the-art intra-task as well as inter-task frequency scaling approaches are clearly outperformed by our approach.
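The inter-task part of such frequency scaling can be sketched as choosing the lowest available discrete frequency at which the EDF utilization bound still holds; this is an illustrative simplification, not the ItcaEDF algorithm itself:

```python
# Sketch of inter-task DVS under EDF (not the paper's ItcaEDF scheduler):
# WCETs measured at f_max stretch by f_max/f at frequency f, so the task
# set remains EDF-schedulable iff U * f_max / f <= 1. We pick the lowest
# discrete frequency satisfying that bound.

def lowest_feasible_frequency(tasks, freqs):
    """tasks: (wcet_at_fmax, period) pairs; freqs: available discrete frequencies."""
    f_max = max(freqs)
    u = sum(c / p for c, p in tasks)          # EDF utilization at f_max
    for f in sorted(freqs):
        if u * f_max / f <= 1.0:              # schedulable at this speed?
            return f
    return None                                # infeasible even at f_max

# Utilization 0.45 at 1000 MHz needs at least 450 MHz, so 600 MHz is chosen.
print(lowest_feasible_frequency([(1, 10), (7, 20)], [200, 600, 1000]))
```

Intra-task scaling, the paper's main contribution, additionally reclaims slack while a task runs, using cycle counters; the static bound above only sets the baseline speed.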
3rd International Workshop on Secure Information Systems
-
A Context-Risk-Aware Access Control Model for Ubiquitous Environments
125 Access Control, Context-aware, Risk assessment, Level of Assurance, Confidence Level, Role-based access control Ali Ahmed, Ning Zhang, pages 775 – 782. Show abstract This paper reports our ongoing work to design a Context-Risk-Aware Access Control (CRAAC) model for Ubiquitous Computing (UbiComp) environments. CRAAC is designed to provide more flexibility and generality than current solutions. Risk assessment and authorisation level of assurance play a key role in CRAAC. Through risk assessment, resources are classified into groups according to their sensitivity levels and the potential impact should any unauthorised access occur. The identified risks are mapped onto the required assurance levels, called the Object Level of Assurance (OLoA). Upon receiving an object access request, the requester's run-time contextual information is assessed to establish a Requester's Level of Assurance (RLoA), denoting the level of confidence in identifying that requester. The access request is granted iff RLoA ≥ OLoA. This paper describes the motivation for, and the design of, the CRAAC model, and reports a case study to further illustrate the model. (The authors would like to thank the Faculty of Computers and Information, Cairo University, for its financial support.) -
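The core decision rule (grant iff RLoA ≥ OLoA) can be sketched as follows; the level scale and the way RLoA is derived from run-time context are illustrative assumptions, not the paper's definitions:

```python
# Hedged sketch of the CRAAC decision rule: access is granted iff the
# requester's level of assurance meets or exceeds the object's required
# level. The three-level scale and min-based RLoA derivation are
# illustrative assumptions, not taken from the paper.

LEVELS = {"low": 1, "medium": 2, "high": 3}

def object_loa(sensitivity):
    """Map an object's risk-assessed sensitivity to its required OLoA."""
    return LEVELS[sensitivity]

def requester_loa(context):
    """Derive RLoA from run-time context; here the weakest signal dominates."""
    return min(LEVELS[c] for c in context.values())

def grant(sensitivity, context):
    return requester_loa(context) >= object_loa(sensitivity)

print(grant("medium", {"location": "high", "auth_method": "medium"}))  # True
print(grant("high", {"location": "high", "auth_method": "medium"}))    # False
```

The min-combination reflects one plausible confidence-fusion policy; the paper's model presumably defines its own mapping from contextual attributes to RLoA.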
User Behaviour Based Phishing Websites Detection
135 Phishing Attacks, Phishing Websites Detection, Identity Theft, User Protection Xun Dong, John Clark, Jeremy Jacob, pages 783 – 790. Show abstract Phishing detection systems are principally based on the analysis of data moving from phishers to victims. In this paper we describe a novel approach to detecting phishing websites based on the analysis of users' online behaviours—i.e., the websites users have visited and the data users have submitted to those websites. Such user behaviours cannot be manipulated freely by attackers; detection based on them can not only achieve high accuracy but is also fundamentally resilient against changing deception methods. -
A New Worm Propagation Threat in BitTorrent: Modeling and Analysis
101 P2P worms, Propagation, Modeling, BitTorrent, Topological worms, Numerical Analysis Sinan Hatahet, Abdelmadjid Bouabdallah, Yacine Challal, pages 791 – 798. Show abstract Peer-to-peer (p2p) networking technology has gained popularity as an efficient mechanism for users to obtain free services without the need for centralized servers. Protecting these networks from intruders and attackers is a real challenge, and one of the constant threats on P2P networks is the propagation of active worms. Recent events show that active worms can spread automatically and flood the Internet in a very short period of time; P2P systems can therefore be a potential vehicle for active worms to achieve fast propagation in the Internet. Nowadays, BitTorrent is becoming more and more popular, mainly due to its fair load distribution mechanism. Unfortunately, BitTorrent is particularly vulnerable to topology-aware active worms. In this paper we analyze the impact of a new worm propagation threat on BitTorrent. We identify the BitTorrent vulnerabilities it exploits and the characteristics that accelerate and decelerate its propagation, develop a mathematical model of its propagation, and provide numerical analysis results. These results will help the design of efficient detection and containment systems. -
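The flavor of such propagation models can be illustrated with a generic susceptible-infected (SI) epidemic model integrated numerically; this is a textbook stand-in, not the paper's BitTorrent-specific model:

```python
# Generic SI epidemic sketch of P2P worm propagation (not the paper's
# BitTorrent model): dI/dt = beta * I * (N - I), integrated with simple
# Euler steps. beta aggregates scan rate and topology effects.

def simulate_si(n_peers, beta, i0, steps, dt=0.01):
    """Return the infected population after `steps` Euler steps of size dt."""
    infected = float(i0)
    for _ in range(steps):
        infected += beta * infected * (n_peers - infected) * dt
    return min(infected, float(n_peers))   # clamp numerical overshoot

# Infection grows slowly at first, then logistically saturates at N.
early = simulate_si(n_peers=10000, beta=1e-4, i0=1, steps=100)
late = simulate_si(n_peers=10000, beta=1e-4, i0=1, steps=2000)
print(early, late)
```

Topology-aware worms effectively raise beta by exploiting neighbor lists instead of random scanning, which is what makes the BitTorrent case the paper analyzes so fast.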
Information System Security Compliance to FISMA Standard: A Quantitative Measure
99 Security Standard Compliance Measurement, Security State Modeling, Pathfinder Networks, Quantitative Risk Analysis Elaine Hulitt, Rayford Vaughn, Jr, pages 799 – 806. Show abstract To ensure that safeguards are implemented to protect against a majority of known threats, industry leaders are requiring information processing systems to comply with security standards. The National Institute of Standards and Technology Federal Information Risk Management Framework (RMF) and the associated suite of guidance documents describe the minimum security requirements (controls) for non-national-security federal information systems mandated by the Federal Information Security Management Act (FISMA), enacted into law on December 17, 2002, as Title III of the E-Government Act of 2002. The subjective compliance assessment approach described in the RMF guidance, though thorough and repeatable, lacks the clarity of a standard quantitative metric to describe for an information system the level of compliance with the FISMA-required standard. Given subjective RMF assessment data, this article suggests the use of Pathfinder networks to generate a quantitative metric suitable to measure, manage, and track the status of information system compliance with FISMA. -
Analysis of Different Architectures of Neural Networks
197 Przemysław Kukiełka, Zbigniew Kotulski, pages 807 – 811. Show abstract Usually, Intrusion Detection Systems (IDS) identify attacks using two methods: by signatures, which are specific, well-defined elements of network traffic that can be identified, and by anomalies, which are deviations from the network behavior assumed to be normal. In both cases one must pre-define the form of the signature (in the first case) or the network's normal behavior (in the second). In this paper we propose the application of Neural Networks (NN) as a tool for IDS. Such a method makes it possible to use the learning capability of NNs to discover new attacks, so that (after the training phase) we need not deliver attack definitions to the IDS. In the paper, we study the usability of several NN architectures to find the most suitable one for IDS application purposes. -
A Semantic Framework for Privacy-Aware Access Control
164 Privacy, ontology, middleware, privacy legislation, access control Georgios Lioudakis, Nikolaos Dellas, Eleftherios Koutsoloukas, Georgia Kapitsaki, Dimitra Kaklamani, Iakovos Venieris, pages 813 – 820. Show abstract The issue of privacy is constantly brought to the spotlight, since an ever-increasing number of services collects and processes personal information from users. In fact, recent advances in mobile communications, location and sensing technologies and data processing are boosting the deployment of context-aware personalized services and the creation of smart environments but, at the same time, they pose a serious risk to individuals' privacy rights. As a notion situated in the realms of legal and social studies, privacy is, concerning its protection, mainly left to legislation and to service providers' self-regulation by means of privacy policies. However, all laws and codes of conduct are useless without enforcement. Based on this concept, this paper presents a framework conceived on the basis of privacy legislation. It uses a semantic model for the specification of privacy-aware data access rules and a middleware system which mediates between the service providers and the data sources and caters for the enforcement of the regulatory provisions. -
Access Control Models in Heterogeneous Information Systems: from Conception to Exploitation
41 access control, access control models, heterogeneous information systems Aneta Poniszewska-Maranda, pages 821 – 826. Show abstract The development of information systems must increasingly address the problems of federated data sources and of heterogeneous distributed information systems. Assuring data access security in cooperative information systems with loose connections among local data sources is hard to achieve, mainly for two reasons: the local data sources are heterogeneous (i.e. in data, models, access security models, semantics, etc.) and the local autonomy of the systems does not allow the creation of a global integrated security schema. The paper proposes the use of one common set of access control concepts to support access control management in the security of heterogeneous information systems. The UML (Unified Modelling Language) concepts can be used to define and implement the most popular access control models, such as DAC, MAC or RBAC. Next, the concepts derived from different models can be joined so as to use one common approach comprehensible to each administrator of each cooperative information system in the federation. -
A Semantic Aware Access Control Model with Real Time Constraints on History of Accesses
118 access control, semantic awareness, temporal authorization, access history, real time Ali Noorollahi Ravari, Morteza Amini, Rasool Jalili, pages 827 – 836. Show abstract The TSBAC model extends the semantic-based access control model SBAC with real-time constraints on the history of accesses. Temporal authorization rules of the form ([ts,tf],(s,o,±a),F) are evaluated against a History Base of previously granted (done) and denied access requests: an access is granted if a matching positive rule is valid at request time, denied if a matching negative rule is valid, resolved by a conflict resolution policy when both apply, and handled by the default policy otherwise; the outcome is then recorded in the History Base. Authorizations propagate along the subsumption hierarchies of subjects, objects, and actions, with positive rights propagating from subsumer to subsumee in the action domain and negative rights in the opposite direction. The model detects and resolves conflicts arising both from these semantic relations and from sub-interval relationships between rule validity intervals, supporting policies such as denials take precedence, positives take precedence, most specific takes precedence, and newer overrides older. TSBAC provides fine- and coarse-grained conditional authorization, and since SBAC is built on the Semantic Web languages OWL and SWRL, the model can be implemented with existing tools and interoperates across administrative boundaries. The time and space complexity of history-based access control, which grow with the size of the History Base, are also discussed. -
Network-level properties of modern anonymity systems
114 anonymity, deployment, performance evaluation Marta Rybczyńska, pages 837 – 843. Show abstract This paper presents a network-level view of the behaviour of two popular, deployed anonymity systems: Tor and JAP (AN.ON). The analysis uses the fact that both of them depend on TCP (Transmission Control Protocol) and shows cases where network conditions may affect the systems' operation. The main topics are: on-off traffic, differences in available bandwidth between peers, and the use of the TCP window mechanism. The analysis is divided into two parts, covering the sender's and the receiver's points of view. We present the results of experiments made on live systems, showing that the analysed effects may be encountered in practice. -
Performance Evaluation of a Machine Learning Algorithm for Early Application Identification
106 traffic classification, machine learning Giacomo Verticale, Paolo Giacomazzi, pages 845 – 849. Show abstract The early identification of applications through the observation and fast analysis of the associated packet flows is a critical building block of intrusion detection and policy enforcement systems. The simple techniques currently used in practice, such as looking at the transport port numbers or at the application payload, are increasingly less effective for new applications using random port numbers and/or encryption. Therefore, there is increasing interest in machine learning techniques capable of identifying applications by examining features of the associated traffic process, such as packet lengths and interarrival times. However, these techniques require that the classification algorithm be trained with examples of the traffic generated by the applications to be identified, possibly on the link where the classifier will operate. In this paper we provide two new contributions. First, we apply the C4.5 decision tree algorithm to the problem of early application identification (i.e. looking at the first packets of the flow) and show that it performs better than the algorithms proposed in the literature. Second, we evaluate the performance of the classifier when training is performed on a link different from the one where the classifier operates. This is an important issue, as a pre-trained portable classifier would greatly facilitate the deployment and management of the classification infrastructure.
Workshop on Wireless and Unstructured Networking: Taming the Stray
-
Supporting Wireless Application Development via Virtual Execution
116 wireless ad hoc sensor networks, operating system, PicOS, virtualization, emulator, VUEE, application development Nicholas M. Boers, Pawel Gburzynski, Ioanis Nikolaidis, Wlodek Olesinski, pages 853 – 860. Show abstract We introduce our “holistic” platform for building wireless ad hoc sensor networks and focus on its most representative and essential virtualization component: VUE² (the Virtual Underlay Emulation Engine). Its role is to provide a vehicle for authoritative emulation of complete networked applications before physically deploying the wireless nodes. The goal is to be able to verify those applications exhaustively before programming the hardware, such that no further (field) tests are necessary. We explain how VUE² achieves this goal owing to several facilitating factors, most notably the powerful programming paradigm adopted in our platform. As implied by the holistic nature of the discussed system, our presentation touches upon operating systems, simulation, network protocols, real-time systems, and programming methodology. -
Wireless Ad Hoc Networks: Where Security, Real-time and Lifetime Meet
227 ad hoc networks, sensor networks, multihop routing, lifetime, secure routing Zdravko Karakehayov, pages 861 – 868. Show abstract This paper deals with the multihop nature of wireless ad hoc networks, where security, real-time operation and lifetime meet. We propose a hierarchical communication model to study how medium access control and network energy signatures provide opportunities for power-efficient partitioning of communication links. We show possibilities for integrating multihop services into routing protocols. At the same time, malicious nodes included in a routing path may misbehave, and the paper analyzes vulnerable points in the case of communication attacks. The REWARD algorithm for secure routing is used as the main example of corrective actions. Nodes listen to neighbor transmissions to detect black hole attacks. The energy overhead is broken down into a static part, the additional energy required to watch for attacks, and a dynamic part, which is application specific. We evaluate the static energy overhead associated with symmetrical routing. -
Self Organization of Mobile Base Station for High Altitude Platform Networking
187 High Altitude Platform, Ad-hoc network, Unmanned Aerial Vehicle, Clustering, Mobile Base Station Placement Ha Yoon Song, pages 869 – 876. Show abstract High Altitude Platforms (HAPs) such as Unmanned Aerial Vehicles (UAVs), which can be deployed as stratospheric infrastructure, enable new configurations of wireless networks. Ground nodes must be clustered into multiple sets, and one dedicated UAV is assigned to each set and acts as a Mobile Base Station (MBS). The UAVs must communicate with each other in order to establish network links among intra-set nodes. Here we face a geographical clustering problem for networking nodes and a placement problem for MBSs. The clustering of mobile ground nodes determines the geographical locations of the MBSs as well as their coverage. In this paper we propose a clustering mechanism to build such a configuration, and the effectiveness of this solution is demonstrated by simulation. For a selected region with a relatively big island, we model mobile ground nodes and show the resulting dynamic placement of MBSs produced by our clustering algorithm. The final results are shown graphically, with the mobility of the ground nodes as well as the placement of the MBSs. -
Energy Efficient Percolation-Driven Flood Routing for Large-Scale Sensor Networks
94 flood routing, percolation Gergely Vakulya, Gyula Simon, pages 877 – 883. Show abstract Flooding algorithms are widely used in ad-hoc sensor networks, especially as building blocks of more sophisticated routing protocols. A distributed, percolation-driven, energy-efficient probabilistic flood routing algorithm is proposed, which provides a high message delivery ratio with a small number of sent messages, using neighborhood information only. The performance of the algorithm is analyzed, and the theoretical results are verified through simulation examples.
Workshop on Computational Optimization
-
Ant Colony System Approach for Protein Folding
33 Ant Colony Optimization, metaheuristics, hydrophobic-polar model, protein folding Stefka Fidanova, Ivan Lirkov, pages 887 – 891. Show abstract The protein folding problem is a fundamental problem in computational molecular biology and biochemical physics. The high-resolution 3D structure of a protein is the key to understanding and manipulating its biochemical and cellular functions. All the information necessary to fold a protein to its native structure is contained in its amino-acid sequence. Even under simplified models, the problem is NP-hard, and standard computational approaches are not powerful enough to search for the correct structure in the huge conformation space. Due to the complexity of the protein folding problem, simplified models such as the hydrophobic-polar (HP) model have become one of the major tools for studying protein structure. Various optimization methods have been applied to the folding problem, including Monte Carlo methods, evolutionary algorithms, and ant colony optimization. In this work we develop an ant algorithm for the 3D HP protein folding problem. It is based on very simple design choices, in particular with respect to the solution components reinforced in the pheromone matrix. The achieved results compare favorably with specialized state-of-the-art methods for this problem. Our empirical results indicate that our rather simple ant algorithm outperforms the existing results for standard benchmark instances from the literature. Furthermore, we compare our folding results with proteins whose folding is known. -
Estimating Time Series Future Optima Using a Steepest Descent Methodology as a Backtracker
150 Time series optimization, Lipschitz constant, Steepest Descent Eleni G. Lisgara, George S. Androulakis, pages 893 – 898. Show abstract Recently, a backtracking technique was introduced for the efficient approximation of a time series' future optima. Such an estimation is obtained from a selection of sequenced points produced by the repetitive process of continuous optima finding. Additionally, it is shown that if a time series is treated as an objective function of the factors affecting its future values, the use of any optimization technique eventually locates a local optimum and therefore enables accurate prediction making. In this paper the backtracking technique is combined with a steepest descent methodology for optimization. -
On a Class of Periodic Scheduling Problems: Models, Lower Bounds and Heuristics
80 periodic scheduling, lower bounds, heuristics, complexity, experiments Philippe Michelon, Dominique Quadri, Marcos Negreiros, pages 899 – 906. Show abstract We study in this paper a generalization of the basic strictly periodic scheduling problem in which two positive integer constants, associated with each task, are introduced to replace the usual strict period. This problem is motivated by the fight against dengue, one of the major tropical diseases. We discuss general properties and propose two integer programming models of the problem, which are compared theoretically. We also suggest a lower bound derived from the structure of the problem; it appears to be quickly obtained and of good quality. Three greedy algorithms are proposed to provide feasible solutions, which are compared with the optimum (when it can be obtained using ILOG CPLEX 10.0). It is shown that for special instances the greedy algorithms are optimal. -
Visualizing Multi-Dimensional Pareto-Optimal Fronts with a 3D Virtual Reality System
90 Multiobjective optimization, Pareto-optimal front, virtual reality, 3D visualization, evolutionary computation Elina Madetoja, Henri Ruotsalainen, Veli-Matti Mönkkönen, Jari Hämäläinen, Kalyanmoy Deb, pages 907 – 913. Show abstract In multiobjective optimization, there are several targets that are in conflict, and thus they cannot all reach their optimum simultaneously. Hence, the solutions of the problem form a set of compromised trade-off solutions (a Pareto-optimal front or Pareto-optimal solutions) from which the best solution for the particular problem can be chosen. However, finding that best compromise solution is not an easy task for the human mind. Pareto-optimal fronts are often visualized for this purpose, because in this way a comparison between solutions according to their location on the Pareto-optimal front becomes somewhat easier. Visualizing a Pareto-optimal front is straightforward when there are only two targets (or objective functions), but visualizing a front for more than two objective functions becomes a difficult task. In this paper, we introduce a new and innovative method of using three-dimensional virtual reality (VR) facilities to present multi-dimensional Pareto-optimal fronts. Rotation, zooming and other navigation possibilities of VR facilities make it easy to compare different trade-off solutions, and fewer solutions need to be explored in order to understand the interrelationships among conflicting objective functions. In addition, the method can be used to highlight and characterize interesting features of specific Pareto-optimal solutions, such as whether a particular solution is close to a constraint boundary or whether it lies on a relatively steep trade-off region. With these additional visual aids for analyzing trade-off solutions, a preferred compromise solution may be easier to choose than by other means. -
Communication Network Design Using Particle Swarm Optimization
149 network topology design, particle swarm optimization, genetic algorithm, multiobjective optimization, NetKeys Chryssa Papagianni, Kostas Papadopoulos, Chris Pappas, Nikolaos Tselikas, Dimitra Kaklamani, Iakovos Venieris, pages 915 – 920. Show abstract Particle Swarm Optimization is applied to an instance of the single- and multi-criteria network design problem. The primary goal of this study is to demonstrate the efficiency of a simple hybrid particle swarm optimization algorithm for the design of a network infrastructure, including decisions concerning the locations and sizes of links. A complementary goal is to address Quality of Service issues in the design process. The optimization objectives in this case are the network layout cost and the average packet delay in the network; therefore a multi-objective instance of the hybrid PSO algorithm is applied. The hybrid PSO includes mutation to avoid premature convergence. For the same reason, repulsion/attraction mechanisms are also applied in the single-objective case. Mutation is carried over to the multi-objective instance of the algorithm. The obtained results are compared with corresponding evolutionary approaches. -
Two Stages Optimization Problem: New Variant of Bin Packing Problem for Decision Making
217 Combinatorial optimization, Generalized assignment problem (GAP), Bin packing, Integer Programming Ahmad Shraideh, Hervé Camus, Pascal Yim, pages 921 – 925. Show abstract In this paper, we present a new multi-criteria assignment problem that combines characteristics of the well-known Bin Packing Problem (BPP) and the Generalized Assignment Problem (GAP). Similarities and differences between these problems are discussed, and a new variant of the BPP is presented, called the generalized assignment problem with identified first-use bins (GAPIFB). The GAPIFB is used to supply decision makers with quantitative and qualitative indicators in order to optimize a business process. An algorithm based on the GAP model and on the GAPIFB is proposed. -
Adaptive Differential Evolution and Exponential Crossover
95 Differential evolution; self-adaptation; exponential crossover; comparison in benchmarks Josef Tvrdik, pages 927 – 931. Show abstract Several adaptive variants of differential evolution are described and compared on two sets of benchmark problems. The influence of exponential crossover on the efficiency of the search is studied. Using both types of crossover together makes the algorithms more robust. Such algorithms are convenient for real-world problems, where an adaptive algorithm applicable without time-consuming parameter tuning is needed. -
Solving IK Problems for Open Chains Using Optimization Methods
103 IK, inverse kinematics, robotics, open chains, optimization methods Krzysztof Zmorzyński, pages 933 – 937. Show abstract Algorithms for solving inverse kinematics problems for simple (open) kinematic chains are presented in this paper. Because kinematic chain constructions can be highly complex, these algorithms should be as universal as possible, which is why optimization methods are used. They allow fast and accurate calculations for arbitrary kinematic chain constructions, enabling their successful use in robotics and similar domains.