ENASE 2018 Abstracts


Area 1 - Service Science and Business Information Systems

Full Papers
Paper Nr: 4
Title:

An Action Research Study towards the Use of Cloud Computing Scenarios in Undergraduate Computer Science Courses

Authors:

Heleno Cardoso da Silva Filho and Glauco de Figueiredo Carneiro

Abstract: Cloud computing has been a successful paradigm in its goal of providing remote computing resources in a competitive and scalable way compared to traditional computing scenarios. Companies have a growing interest in migrating to and using cloud services. However, the literature has reported difficulties and challenges faced by companies while migrating their assets to the cloud. One possible reason for this is the difficulty of finding qualified professionals to help companies plan, perform, and monitor the migration of their legacy systems to the cloud. This paper presents an action-research study analyzing the inclusion of cloud computing scenarios in the System Analysis and Design and Operating Systems undergraduate courses at Salvador University (UNIFACS). The results of the action-research study provide initial evidence that cloud computing resources integrated into the contents of the aforementioned courses can help motivate and engage students in course activities. In addition, the knowledge and experience gained by these students can improve their qualifications and facilitate their access to the labor market.

Paper Nr: 44
Title:

Using COSMIC FSM Method to Analyze the Impact of Functional Changes in Business Process Models

Authors:

Wiem Khlif, Asma Sellami, Mariem Haoues and Hanêne Ben-Abdallah

Abstract: Today's dynamic socioeconomic constraints often force enterprises to change their Business Processes (BP), for instance by deleting some activities or merging certain work units. These changes induce functional changes in the business process, whose performance might be affected. Consequently, there is a need for a well-defined change measure and a control process to ensure the success of BP management projects. This paper proposes applying the COSMIC Functional Size Measurement method to evaluate BP functional changes, helping both business process developers and decision makers to accept, defer, or deny a functional change. The proposed evaluation identifies the real impact of a functional change on the project and provides indicators for analyzing the functional change status. In addition, when there are multiple change requests, it uses a heuristic algorithm to prioritize the changes while minimizing the effort required to answer them.

Paper Nr: 64
Title:

Iterative Process for Generating ER Diagram from Unrestricted Requirements

Authors:

Muhammad Javed and Yuqing Lin

Abstract: Requirements analysis for generating a conceptual model such as an Entity Relationship Diagram (ERD) is an essential task in the software development life cycle. In this paper, we present a Natural Language Processing (NLP) based approach to generate an ERD from requirements in an unrestricted format, such as general requirements, user stories, or Use Case Specifications (UCS). To assess the performance and correctness of the proposed technique, we compare our approach with existing automated techniques by processing the same requirements. The preliminary results show a significant improvement.

Paper Nr: 90
Title:

Software Evolution for Digital Transformation

Authors:

Alfred Zimmermann, Rainer Schmidt, Justus Bogner, Dierk Jugel and Michael Möhring

Abstract: In recent times, many new business opportunities have appeared that exploit the potential of the Internet and related digital technologies, like the Internet of Things, services computing, cloud computing, big data with analytics, mobile systems, collaboration networks, and cyber-physical systems. Enterprises are presently transforming their strategy, culture, processes, and information systems to become more digital. The digital transformation deeply disrupts existing enterprises and economies. Digitization fosters the development of IT environments with many rather small and distributed structures, like the Internet of Things. This has a strong impact on the architecting of digital services and products. The change from a closed-world modeling perspective to more flexible open-world, living software and system architectures defines the moving context for adaptable and evolutionary software approaches, which are essential to enable the digital transformation. In this paper, we put a spotlight on service-oriented software evolution to support the digital transformation with micro-granular digital architectures for digital services and products.

Short Papers
Paper Nr: 28
Title:

Model-Aware Software Engineering - A Knowledge-based Approach to Model-Driven Software Engineering

Authors:

Robert Andrei Buchmann, Mihai Cinpoeru, Alisa Harkai and Dimitris Karagiannis

Abstract: Standard modelling languages enabled the Model-Driven Software Engineering paradigm, allowing the development of model compilers for code generation. This, however, induces a subordination of implementation to the modelling language: the modelling benefits are confined to a fixed semantic space. On the other hand, the rise of agile software development practices has impacted model-driven engineering practices - an Agile Modelling paradigm was consequently introduced. This was later expanded towards the Agile Modelling Method Engineering (AMME) framework which generalizes agility at the modelling method level. By observing several AMME-driven implementation experiences, this paper specialises the notion of Model-Driven Software Engineering to that of Model-Aware Software Engineering – an approach that relies on modelling language evolution, in response to the evolution of the implemented system's requirements. The key benefit is that the modelling language-implementation dependency is reversed, as the implementation needs propagate requirements towards an agile modelling language.

Paper Nr: 57
Title:

Anomalies Correlation for Risk-Aware Access Control Enhancement

Authors:

Pierrette Annie Evina, Faten Labbene Ayachi, Faouzi Jaidi and Adel Bouhoula

Abstract: In the context of database management systems (DBMS), the integrity of access control policies (ACP) is a constantly neglected aspect. Throughout its evolution, an ACP may become invalid and not free from irregularities, due to the actions of users and administrators, whether intentional or not. Hence, considering regular ACP updating activities, we pay particular attention to anomalies in the ACP expression. Taking into account the correlation that exists between two or more such anomalies, we present the “correlated threats management system” (CORMSYS). This system detects and analyzes the correlation between anomalies, since we believe that handling correlations between anomalies can reveal sophisticated intrusion scenarios in DBMS. The presented system also produces the necessary input for a new risk management approach that will consider and overcome the effects induced by the correlation between anomalies found in the ACP expression. CORMSYS is composed of four main parts: (i) the Correlation Definition and Analysis subsystem; (ii) the Users Tracking subsystem; (iii) the Intrusion Scenario Identification subsystem; and (iv) the Illegal Behavior Modeling subsystem.

Paper Nr: 78
Title:

A Practical Extension of Frameworks for Auditing with Process Mining

Authors:

Ella Roubtsova and Niels Wiersma

Abstract: Auditing processes to verify legal compliance is a necessary activity in banks, municipalities, and many other sectors. In theory, by using log files and process mining tools, auditors can automate the auditing process instead of gathering data and taking samples. In practice, however, audits are rarely supported by process mining tools. This paper investigates the reasons for that. We identified that the published audit frameworks with process mining are not oriented toward real auditors: they replace the auditors with compliance experts and do not include the necessary steps of refining audit statements to make them useful both for writing process mining filters and for analyzing whether process instances correspond to audit goals and policies. We also identified that the building or correction of business process models for audit is often driven by the log information. On the basis of our findings, we propose an extension of the audit frameworks with process mining and evaluate our extension by conducting case studies of audits in different business domains.

Posters
Paper Nr: 35
Title:

Comprehensive View on Architectural Requirements for Maintenance Information Systems

Authors:

Andreas Reidt, Stefan Schuhbäck and Helmut Krcmar

Abstract: Concepts of Industry 4.0 and digitization are leading to a much higher complexity of production facilities, related processes, and activities such as maintenance. As a result, the demands made on the maintenance process and the maintenance workers involved are increasing. To meet these requirements, many information systems have been developed with different purposes and technologies. Yet the development of these systems occurs in isolation, and the main challenges of integrating these maintenance solutions and their information with each other are not addressed. The resulting solutions and their architectures lack a sustainable, holistic viewpoint from the start. To solve these challenges and to give developers a framework in which to develop information systems for maintenance, a holistic view of the architecture and requirements is needed. Therefore, a framework for the general requirements of all maintenance information systems is developed in this paper. To achieve this, a literature review was conducted in which the requirements for a broad set of maintenance information systems were gathered and compared with each other. Based on this information, general principles for the architectures of maintenance information systems were derived.

Paper Nr: 61
Title:

Deep Learning Process Prediction with Discrete and Continuous Data Features

Authors:

Stefan Schönig, Richard Jasinski, Lars Ackermann and Stefan Jablonski

Abstract: Process prediction is a well-known method to support participants in performing business processes. These methods use event logs of executed cases as a knowledge base to make predictions for running instances. A range of such techniques has been proposed for different tasks, e.g., predicting the next activity or the remaining time of a running instance. Neural networks with Long Short-Term Memory architectures have turned out to be highly customizable and precise in predicting the next activity in a running case. Current research, however, focuses on the prediction of future activities using activity labels and resource information, while further event log information, in particular discrete and continuous event data, is neglected. In this paper, we show how prediction accuracy can be significantly improved by incorporating event data attributes. We regard this extension of conventional algorithms as a substantial contribution to the field of activity prediction. The new approach has been validated with a recent real-life event log.

Paper Nr: 74
Title:

Handling Tenant-Specific Non-Functional Requirements through a Generic SLA

Authors:

Khadija Aouzal, Hatim Hafiddi and Mohamed Dahchour

Abstract: In a multi-tenant architecture of a Software as a Service (SaaS) application, one single instance is shared among different tenants. However, this architectural style supports only the commonalities among tenants and does not cope with the variations and the specific context of each tenant. These variations concern either functional or non-functional properties. In this paper, we deal with non-functional variability in SaaS services in order to support the different quality levels that a service may have. For that purpose, we propose an approach that considers Service Level Agreements (SLAs) as families in the sense of Software Product Line Engineering. We define two metamodels: the NFVariability metamodel and the VariableSLA metamodel. The first models and captures variability in the quality attributes of services. The second models a dynamic and variable SLA. Model-to-model transformations are performed to transform a Feature Model (an instance of the NFVariability metamodel) into a Generic SLA (an instance of the VariableSLA metamodel) in order to dynamically deal with tenant-specific non-functional requirements.

Paper Nr: 75
Title:

Assessing the Impact of Measurement Tools on Software Maintainability Evaluation

Authors:

Lerina Aversano and Maria Tortorella

Abstract: A relevant aspect of development and maintenance tasks is the evaluation of software system quality. Measurement tools facilitate the measurement of software metrics and the application of quality models. However, differences and commonalities exist among the evaluation results obtained by adopting different measurement tools. This prevents an objective and unambiguous evaluation of a software product's quality level. In this direction, this paper presents a preliminary investigation of the impact of measurement tools on the evaluation of software maintainability metrics. Specifically, metric values have been computed using different software analysis tools for three software systems of different sizes. The measurements show that the considered tools provide different values for the same metrics evaluated on the same software system.

Area 2 - Software Engineering

Full Papers
Paper Nr: 8
Title:

Problem-based Elicitation of Security Requirements - The ProCOR Method

Authors:

Roman Wirtz, Maritta Heisel, Rene Meis, Aida Omerovic and Ketil Stølen

Abstract: Security is of great importance for many software systems. The security of a software system can be compromised by threats, which may harm assets with a certain likelihood, thus constituting a risk. All such risks should be identified, and unacceptable risks should be reduced, which gives rise to security requirements. The relevant security requirements should be known right from the beginning of the software development process, and eliciting them should be done in a systematic way. We propose a method to elicit security requirements that address unacceptable risks; they require a reduction of the risk to an acceptable level. Our method combines the CORAS risk management method with Jackson’s problem-based requirements analysis approach. Based on the functional requirements for a software system, security risks are identified and evaluated. Unacceptable risks give rise to high-level security requirements. To reduce the risk, treatments are selected. Based on the selected treatments, concretized security requirements are set up and represented in a similar way as functional requirements. Thus, both functional and security requirements can then drive the software development process.

Paper Nr: 13
Title:

Incremental Bidirectional Transformations: Applying QVT Relations to the Families to Persons Benchmark

Authors:

Bernhard Westfechtel

Abstract: Model transformations constitute a key technology for model-driven software engineering. In round-trip engineering processes, model transformations are performed not only in the forward but also in the backward direction. Since defining forward and backward transformations separately is both awkward and error-prone, bidirectional transformation languages provide a single definition for both directions. This paper evaluates the transformation language QVT Relations (QVT-R), which makes it possible to specify incremental bidirectional transformations (as required in round-trip engineering for propagating changes in both directions) declaratively at a high level of abstraction. We apply QVT-R to a well-known benchmark example, the Families to Persons case. This case study demonstrates a number of limitations of the QVT-R language, which result from the strictly state-based design of the language as well as from the way in which the semantics of QVT-R transformations are defined.

Paper Nr: 17
Title:

Adopting Collaborative Games into Agile Requirements Engineering

Authors:

Adam Przybyłek and Mateusz Zakrzewski

Abstract: In agile software development, where great emphasis is put on effective informal communication involving diverse stakeholders, success depends on human and social factors. Not surprisingly, the Agile Manifesto advocates principles and values such as “individuals and interactions over processes and tools”, “focus on the customer”, “collaborate regularly”, “communicate face-to-face within the team” and “have regular team introspection”. However, agile methodologies have hardly provided any tools or techniques that aid the human side of software development. Additionally, more and more research suggests that customers should no longer be viewed as a passive source of information but need to be engaged in envisioning future business practice, discovering opportunities, and shaping solutions. To deal with these challenges, we propose a framework for extending Scrum with 9 collaborative games. Collaborative games refer to structured techniques inspired by game play and designed to facilitate collaboration, foster customer involvement, and stimulate creative thinking. The feedback received from a Scrum team that leveraged our framework in two commercial projects indicates that the adopted collaborative games: (1) make customers more willing to attend the meetings; (2) foster stakeholders’ commitment; and (3) produce better results than the standard approach.

Paper Nr: 21
Title:

A Framework to Support Behavioral Design Pattern Detection from Software Execution Data

Authors:

Cong Liu, Boudewijn van Dongen, Nour Assy and Wil M. P. van der Aalst

Abstract: The detection of design patterns provides useful insights to help understand not only the code but also the design and architecture of the underlying software system. Most existing design pattern detection approaches and tools rely on source code as input. However, if the source code is not available (e.g., in the case of legacy software systems), these approaches are no longer applicable. During the execution of software, tremendous amounts of data can be recorded, providing rich information for analyzing the runtime behavior of software. This paper presents a general framework to detect behavioral design patterns by analyzing the sequences of method calls and interactions of objects that are collected in software execution data. To demonstrate its applicability, the framework is instantiated for three well-known behavioral design patterns, i.e., the observer, state and strategy patterns. Using the open-source process mining toolkit ProM, we have developed a tool that supports the whole detection process. We applied and validated the framework using software execution data containing around 1,000,000 method calls generated from both synthetic and open-source software systems.

Paper Nr: 25
Title:

Optimized Realization of Software Components with Flexible OpenCL Functionality

Authors:

Gabriel Campeanu, Jan Carlson and Séverine Sentilles

Abstract: Today, the newly available embedded boards with GPUs provide a solution to satisfy the ever-increasing requirements of modern embedded systems. Component-based development is a well-known paradigm used to develop embedded systems. However, this paradigm lacks GPU support to address the specifics of these new boards. This leads to components that typically have reduced reusability, poor maintainability and portability. One way to tackle the existing shortcomings is through flexible components, i.e., platform-agnostic components that, at design time, offer the possibility to be executed either on CPU or GPU. The current realization of flexible components, i.e., as regular components with functionality tailored for the selected hardware, introduces additional overheads such as component communication overhead. In order to tackle the introduced overheads, our solution groups connected flexible components under a flexible group that conceptually behaves as a component. We introduce an algorithm to identify the existing groups in a given component-based system and the generation rules that automatically realizes groups as regular components. To evaluate the feasibility of the new concept, the flexible group is implemented using a state-of-the-practice component model (i.e., Rubus) and examined through the vision system of an underwater robot.

Paper Nr: 33
Title:

A New Approach for Optimal Implementation of Multi-core Reconfigurable Real-time Systems

Authors:

Wafa Lakhdhar, Rania Mzid, Mohamed Khalgui and Georg Frey

Abstract: This paper deals with a multi-core reconfigurable real-time system specified by a set of implementations, each of which is activated under a predefined condition and executes multiple functions, which are in turn executed by threads. The implementation as threads generates complex system code, due to the huge number of threads and the redundancy between the different implementations, which may lead to an increase in energy consumption. Thus, we aim in this paper to optimize the system code by avoiding the redundancy between implementations and reducing the number of threads while meeting all related real-time constraints. The proposed approach adopts mixed integer linear programming (MILP) techniques in the exploration phase in order to provide a feasible task model. An optimal reconfigurable POSIX-based code of the system is manually generated as an output of this technique. An application to a case study and a performance evaluation confirm and validate the expected results.

Paper Nr: 34
Title:

Mapping of Periodic Tasks in Reconfigurable Heterogeneous Multi-core Platforms

Authors:

Aymen Gammoudi, Daniel Chillet, Mohamed Khalgui and Adel Benzina

Abstract: Multi-core Real-time Systems (MRS) powered by a battery have been adopted for a wide range of high-performance applications, such as mobile communication and automotive systems. A system is composed of N dependent and periodic Operating System (OS) tasks to be assigned to p heterogeneous cores linked by a network-on-chip (NoC). This paper deals with the problem of task allocation in MRS in such a way that the cost of communication between cores is minimized, by trying to place dependent tasks as close as possible to each other. The main objective is to develop a new strategy for allocating N tasks to the p cores of a given distributed system using task clustering, considering both the cost of inter-task communication and that of communication between cores. The proposed strategy guarantees that, when a task is mapped into the system and accepted, it is correctly executed prior to the task deadline. A novel periodic task model based on elastic coefficients is proposed to compute useful temporal parameters, allowing all tasks to be assigned to the p cores while minimizing the traffic between cores. Experimental results reveal the effectiveness of the proposed strategy by comparing the derived solutions with the optimal ones, obtained by solving an Integer Linear Program (ILP).

Paper Nr: 38
Title:

Refactoring Object-Oriented Applications for a Deployment in the Cloud - Workflow Generation based on Static Analysis of Source Code

Authors:

Anfel Selmadji, Abdelhak-Djamel Seriai, Hinde Lilia Bouziane, Christophe Dony and Chouki Tibermacine

Abstract: Cloud Computing delivers computing/storage resources to customers as services via the Internet. It is characterized by its elastic nature and its payment model (pay-as-you-go). To optimize the use of these resources, one of the requirements related to this type of environment is to dynamically configure the applications to reduce the costs of their deployment. The dynamic configuration requires the ability to determine which resources are used, as well as when and where they are utilized. This can be done using workflows. In fact, several works rely on workflows to reduce execution costs in the cloud. Unlike workflows, OO applications have an architecture which exposes little or no behavioral (temporal) aspect. Hence, to execute an OO application in the cloud, the entire application needs to be deployed and all the resources it uses need to be allocated for its entire execution time. To reduce execution costs, we propose a re-engineering process aiming to restructure these applications from the OO architectural style to the workflow style. In this paper, we focus on the first step of the process, whose goal is to generate a workflow from OO source code.

Paper Nr: 47
Title:

A Hybrid Approach to Detect Code Smells using Deep Learning

Authors:

Mouna Hadj-Kacem and Nadia Bouassida

Abstract: The detection of code smells is a fundamental prerequisite for guiding the subsequent steps in the refactoring process. The more accurate the detection results, the more the performance of refactoring the software is improved. Given its influential role in software maintenance, this challenging research topic has so far attracted increasing interest. However, the lack of consensus about the definition of code smells in the literature has led to a considerable diversity of existing results. To reduce the confusion associated with this lack of consensus, there is a real need to achieve a deep and consistent representation of code smells. Recently, the advance of deep learning has demonstrated an undeniable contribution to many research fields, including pattern recognition. In this paper, we propose a hybrid detection approach based on deep Auto-encoder and Artificial Neural Network algorithms. Four code smells (God Class, Data Class, Feature Envy and Long Method) are the focus of our experiment on four adopted datasets that are extracted from 74 open source systems. The values of the recall and precision measurements have demonstrated highly accurate results.

Paper Nr: 53
Title:

Exploring Crowdsourced Reverse Engineering

Authors:

Sebastian Heil, Felix Förster and Martin Gaedke

Abstract: While Crowdsourcing has been successfully applied in the field of Software Engineering, it is widely overlooked in Reverse Engineering. In this paper we introduce the idea of Crowdsourced Reverse Engineering and identify the three major challenges: 1) automatic task extraction, 2) source code anonymization, and 3) quality control and results aggregation. To illustrate Crowdsourced Reverse Engineering, we outline our approach for performing the Reverse Engineering activity of concept assignment as a crowdsourced classification task and address suitable methods and considerations with regard to each of the three challenges. Following a brief overview of existing research, in which we position our approach against related work, we report on our experiences from an experiment conducted on the crowdsourcing platform microworkers.com, which yielded 187 results by 34 crowd workers, classifying 10 code fragments with decent quality.

Paper Nr: 58
Title:

Cultural Influences on Requirements Engineering Process in the Context of Saudi Arabia

Authors:

Tawfeeq Alsanoosy, Maria Spichkova and James Harland

Abstract: Software development requires intensive communication between the requirements engineers and software stakeholders, particularly during the Requirements Engineering (RE) phase. Therefore, the individuals’ culture might influence both the RE process and its result. Our aims are to investigate the extent of cultural influences on the RE process, and to analyze how the RE process can be adapted to take cultural aspects into account. The model we present is based on Hofstede’s cultural theory. The model was applied in a pilot case study in the context of the conservative Saudi Arabian culture. The results reveal 6 RE aspects and 10 cultural factors that have a large impact on RE practice.

Paper Nr: 60
Title:

Verification of Feature Coordination using the Fluent Calculus

Authors:

Ralph Hoch and Hermann Kaindl

Abstract: Previously, an approach based on the Situation Calculus was published for specifying feature coordination of a software system, but without a physical model or any additional autonomous agent in the environment. Hence, no verification of the feature coordination was possible in spite of its formal specification. Verification of safety-critical feature coordination is important, however, and requires additional models. This paper shows that a specification of a software coordinator can be formally verified using the Fluent Calculus (a derivative of the Situation Calculus), when combined with additional models. The overall qualitative model is a reimplementation of a recently published one based on synchronized finite-state machines, which was used for model checking. In fact, we show how the model in Fluent Calculus can be systematically derived from the finite-state machines. The results of verification using the Fluent Calculus correspond to those using model checking. We also contrast our approach using the Fluent Calculus with model checking. In summary, we present verification of (safety-critical) feature coordination using the Fluent Calculus.

Paper Nr: 63
Title:

An MDE Approach for Heterogeneous Models Consistency

Authors:

Mahmoud El Hamlaoui, Saloua Bennani, Mahmoud Nassar, Sophie Ebersold and Bernard Coulette

Abstract: To design a complex system, we often proceed via separation of viewpoints. Each viewpoint is described by a model that represents a domain expertise. These partial models are generally heterogeneous (i.e., they conform to different metamodels) and are thus produced by different designers. We proposed a matching process that links partial models through a virtual global model in order to create a complete view of the system. As models evolve, we should consider the impact that changing an element involved in a correspondence has on other models, to keep the global view coherent. So, we have defined a process that automatically identifies changes, classifies them, and treats their potential repercussions on elements of other partial models in order to maintain the consistency of the global model.

Short Papers
Paper Nr: 1
Title:

An Approach to Prioritize Classes in a Multi-objective Software Maintenance Framework

Authors:

Michael Mohan and Des Greer

Abstract: Genetic algorithms have become popular in automating software refactoring and an increasing level of attention is being given to the use of multi-objective approaches. This paper investigated the use of a multi-objective genetic algorithm to automate software refactoring using a purpose built tool, MultiRefactor. The tool used a metric function to measure quality in a software system and tested a second objective to measure the importance of the classes being refactored. This priority objective takes as input a set of classes to favor and, optionally, a set of classes to disfavor as well. The multi-objective setup refactors the input program to improve its quality using the quality objective, while also focusing on the classes specified by the user. An experiment was constructed to measure the multi-objective approach against the alternative mono-objective approach that does not use an objective to measure priority of classes. The two approaches were tested on six different open source Java programs. The multi-objective approach was found to give significantly better priority scores across all inputs in a similar time, while also generating improvements in the quality scores.

Paper Nr: 15
Title:

Tool-assisted Game Scenario Representation Through Flow Charts

Authors:

Maria-Eleni Paschali, Nikolaos Bafatakis, Apostolos Ampatzoglou, Alexander Chatzigeorgiou and Ioannis Stamelos

Abstract: Game development is one of the fastest-growing industries in IT. In order for a game to be successful, it should engage the player through a solid and interesting scenario, which does not only describe the state of the game but also outlines the main characters and their interactions. Considering the increasing complexity of game scenarios, we survey existing scenario representation approaches and, based on the most popular one, provide tool support for assisting the game design process. To evaluate the usefulness of the developed tool, we performed a case study with the aim of assessing its usability. The results of the case study suggest that, after some interaction with end-users, the tool has reached a highly usable state that to some extent guarantees its applicability in practice.

Paper Nr: 42
Title:

Designing BP-IS Aligned Models: An MDA-based Transformation Methodology

Authors:

Wiem Khlif, Nourchene Elleuch, Enaam Alotabi and Hanêne Ben-Abdallah

Abstract: The necessity of aligning an enterprise’s IS model to its business process model (BPM) is irrefutable. How to establish and/or maintain this alignment remains, however, a pressing need for enterprises seeking to build a new IS, better govern their enterprise architecture, and/or update their existing IS in the face of business-driven changes. The main difficulty of establishing/maintaining BP-IS model alignment stems from the divergent knowledge domains of the stakeholders (business process experts and software developers). To bridge the gap between these two stakeholders, this paper proposes an MDA-compliant approach to automate the generation of UML class diagrams from BPMN models. The generated IS design can be used either to establish a new IS, or to analyze or maintain an existing one. The generation is defined in terms of transformations that ensure the alignment of the class diagram to the BPMN model by accounting for both the semantics and the structure of the BPMN model, and by providing for all business objects and activities.

Paper Nr: 46
Title:

Guard Evaluation and Synchronization Issues in Causal Semantics for UML2.X Sequence Diagrams

Authors:

Fatma Dhaou, Ines Mouakher, J. Christian Attiogbé and Khaled Bsaies

Abstract: UML2.X sequence diagrams (SD) permit the modelling of system behaviours. With combined fragments (CF), which can be nested and aggregated in a single SD, more complex behaviours can be created. However, the standard interpretation and the consideration of all aspects related to CF are not obvious to handle, which hinders their proper use. In previous work, we proposed a causal semantics based on partial order theory, suitable for the computation of all possible valid traces of UML2.X sequence diagrams with nested CF modelling behaviours of distributed systems. In this work, we deal with two important and complex issues: guard evaluation and synchronization between lifelines.

Paper Nr: 48
Title:

A Novel Formal Approach to Automatically Suggest Metrics in Software Measurement Plans

Authors:

Sarah A. Dahab, Juan Jose Hernandez Porras and Stephane Maag

Abstract: Software measurement is an integral part of the software engineering process. With the growth of software systems and their complexity, distributed across diverse development phases, the software measurement process has to deal with more management and performance constraints. In fact, the current software measurement process is fixed and manually planned at the beginning of the project, and has to manage a huge amount of data resulting from the complexity of the software. Thereby, measuring software becomes costly and cumbersome. In addition, the implementation of the measures depends on the developer, which reduces the scalability, maintainability and interoperability of the measurement process. It becomes expert-dependent and thus more costly. To tackle these difficulties, we first propose in this paper a formal software measurement implementation model based on the standard measurement specification SMM. We then present a software measurement plan suggestion framework based on a learning-based automated analysis.

Paper Nr: 59
Title:

Towards Classification of Lightweight Formal Methods

Authors:

Anna Zamansky, Maria Spichkova, Guillermo Rodriguez-Navas, Peter Herrmann and Jan Olaf Blech

Abstract: The use of lightweight formal methods (LFM) for the development of industrial applications has become a major trend. Although the term “lightweight formal methods” has been used for over ten years now, there seems to be no common agreement on what “lightweight” actually means, and different communities apply the term in all kinds of ways. In this paper, we explore the recent trends in the use of LFM and argue that cost-effectiveness is the driving force behind deploying LFM. Further, we propose a simple framework that should help to classify different LFM approaches and to estimate which of them are most cost-effective for a given software engineering project. We demonstrate our framework using some examples.

Paper Nr: 65
Title:

VarSearch: Annotating Variations using an e-Genomics Framework

Authors:

José Fabián Reyes Román, David Roldán Martínez, Alberto García Simón, Urko Rueda and Óscar Pastor

Abstract: Nowadays, experts in the genomics field work with bioinformatics tools (software) to generate genomic diagnoses, but the reality is that these solutions do not fully meet their needs. From the perspective of Information Systems (IS), the real problems lie in the lack of an approach (i.e., Software Engineering techniques) that can generate correct structures for data management. Due to the dispersion, heterogeneity and inconsistency of the data, understanding the genomic domain is a huge challenge. To demonstrate the advantages of Conceptual Modeling (CM) in complex domains, such as genomics, we propose “VarSearch”, a web-based tool for genomic diagnosis that incorporates the Conceptual Model of the Human Genome (CMHG) and takes advantage of Next-Generation Sequencing (NGS) to support genomic diagnostics that advance Precision Medicine (PM).

Paper Nr: 69
Title:

A Bug Assignment Approach Combining Expertise and Recency of Both Bug Fixing and Source Commits

Authors:

Afrina Khatun and Kazi Sakib

Abstract: Automatically assigning bug reports to fixers is an important activity for software quality assurance. Existing approaches consider either bug fixing or source commit activities, which may result in suggesting inactive or inexperienced developers. Considering only one source of information cannot compensate for the other, reducing the accuracy of developer suggestions. An approach named BSBA is proposed, which combines the expertise and recency of both bug fixing and source commit activities. Firstly, BSBA collects the source code and commit logs to construct an index mapping source entities to their commit times, which represents developers’ source code activities. Secondly, it processes the fixed bug reports to build another index that connects report keywords to fixing times. Finally, on the arrival of new reports, BSBA queries the two indexes and combines the query results using the tf-idf weighting technique to compute a BSBA score for each developer. The top-scored developers are suggested as appropriate fixers. BSBA was applied to two open source projects, Eclipse JDT and SWT, and compared with three existing techniques. The results show that BSBA obtains the actual fixer at the Top 1 position in 45.67% and 47.50% of cases for Eclipse JDT and SWT respectively, which is higher than the existing approaches. It also shows that BSBA improves the accuracy of existing techniques on average by 3%-60%.

Paper Nr: 70
Title:

K-Taint: An Executable Rewriting Logic Semantics for Taint Analysis in the K Framework

Authors:

Md. Imran Alam, Raju Halder, Harshita Goswami and Jorge Sousa Pinto

Abstract: The K framework is a rewrite logic-based framework for defining programming language semantics suitable for formal reasoning about programs and programming languages. In this paper, we present K-Taint, a rewriting logic-based executable semantics in the K framework for taint analysis of an imperative programming language. Our K semantics can be seen as a sound approximation of program semantics in the corresponding security type domain. More specifically, as a foundation to this objective, we extend the semantically sound flow-sensitive security type system of Hunt and Sands to the case of taint analysis, also providing support for interprocedural analysis. With respect to existing methods, K-Taint supports context- and flow-sensitive analysis, reduces false alarms, and provides a scalable solution. Experimental evaluation on several benchmark codes demonstrates encouraging results, with an improvement in the precision of the analysis.

Paper Nr: 79
Title:

Gamification and Evaluation of the Use of Agile Tests in Software Quality Subjects: The Application of Case Studies

Authors:

Isaac Souza Elgrably and Sandro Ronaldo Bezerra Oliveira

Abstract: With the greater immersion of software development teams in agile methods and practices, it has become necessary for students to have earlier contact with Agile Testing practices. Thus, this study aims to use gamification concepts to support teaching and to engage and motivate students in the Software Quality subject taught in postgraduate and undergraduate courses in computer science. For this, classes were set up to teach agile testing using game elements as motivation for students. This research resulted in an enrichment of these students’ knowledge of testing practices. This work aims to contribute to the teaching of agile testing practices, preparing students better for the software development market. It was also verified that the use of gamification elements for teaching agile testing was effective, because the participating students dedicated themselves more to the tasks and were participative in all the different types of classes.

Paper Nr: 86
Title:

Software Architecture Evaluation: A Systematic Mapping Study

Authors:

Sofia Ouhbi

Abstract: Software architecture provides the big picture of software systems, hence the need to evaluate its quality before further software engineering work. In this paper, a systematic mapping study was performed to summarize the existing software architecture evaluation approaches in the literature and to organize the selected studies according to six classification criteria: research types, empirical types, contribution types, software quality models, quality attributes and software architecture models. Publication channels and trends were also identified. In total, 60 studies were selected from digital libraries.

Paper Nr: 89
Title:

Automated Refactoring of Software using Version History and a Code Element Recentness Measure

Authors:

Michael Mohan and Des Greer

Abstract: This paper proposes a multi-objective genetic algorithm to automate software refactoring and validates the approach using a tool, MultiRefactor, and a set of open source Java programs. The tool uses a metric function to measure quality in a software system and tests a second objective to measure the recentness of the code elements being refactored. Previous versions of the software project are analyzed and a recentness measure is then calculated with respect to previous versions of code. The multi-objective setup refactors the input program to improve its quality using the quality objective, while also focusing on the recentness of the code elements inspected. An experiment has been constructed to measure the multi-objective approach against an alternative mono-objective approach that does not use an objective to measure element recentness. The two approaches are validated using six different open source Java programs. The multi-objective approach is found to give significantly better recentness scores across all inputs in a similar time, while also generating improvements in the quality score.

Posters
Paper Nr: 29
Title:

Quality Requirements Analysis with Machine Learning

Authors:

Tetsuo Tamai and Taichi Anzai

Abstract: The importance of software quality requirements (QR) is being widely recognized, which motivates studies that investigate software requirements specifications (SRS) in practice and collect data on how much QR are written vs. functional requirements (FR) and what kinds of QR are specified. It is useful to develop a tool that automates the process of filtering QR statements out of an SRS and classifying them into quality characteristic attributes such as those defined in the ISO/IEC 25000 quality model. We propose an approach that uses a machine learning technique to mechanize this process. With this mechanism, we can identify how each QR characteristic scatters over the document, i.e. how much in volume and in what way. A tool, QRMiner, was developed to support the process, and case studies were conducted on thirteen SRS documents written for real use. We report our findings from these cases.

Paper Nr: 43
Title:

Design and Implementation of a GeIS for the Genomic Diagnosis using the SILE Methodology. Case Study: Congenital Cataract

Authors:

Manuel Navarrete-Hidalgo, José Fabián Reyes Román and Óscar Pastor López

Abstract: Biomedicine and the preventive diagnosis of diseases open up lines of research as diverse as the solutions proposed. However, the information contained in the human genome represents a great challenge related to the processing and management of biological information, whose success depends directly on the structures generated through the application of conceptual modeling techniques. In this context, this research work presents the development of an "Extraction-Transformation-Load" prototype, where biological information can be obtained from multiple scientific repositories that do not interact directly. For that reason, the proposed Conceptual Model of the Human Genome (CMHG) is used as a holistic representation of the domain with the aim of generating Genomic Information Systems (GeIS), which facilitate efficient management of all the existing knowledge in the genome in order to enhance “Precision Medicine” (PM). This work defines a GeIS for the preventive diagnosis of “congenital cataracts”, a condition related not to age and lifestyle but to the genetic component of each person. In this way, we can provide an early diagnosis and possible means of personalized treatment.

Paper Nr: 55
Title:

Requirements Engineering Tools for Global Software Engineering - A Feature Analysis Study

Authors:

Somnoup Yos and Caslon Chua

Abstract: Demand in Global Software Engineering (GSE) is increasing every year. While GSE reduces development cost and provides access to a resource pool, GSE practitioners also need to deal with many challenges. This impacts the Requirements Engineering (RE) process in terms of teamwork, collaboration, knowledge management, time and cultural differences. Given that RE is considered an important part of software development, several studies have pointed out the need for an RE process that supports the GSE environment. We recognize the importance of RE tools in supporting the RE process and conducted this study to discover how best to use RE tools to address the challenges in GSE. The study adopts the Feature Analysis Screening Mode approach and generates a list of features in four categories that could address the challenges: (1) Shared Knowledge Management, (2) Workflow and Change Management, (3) Traceability, and (4) System and Data Integration. Four RE tools on the market were selected for investigation. We found that the tools support three of the categories well but have limited capability for the first category. Some suggestions are provided for future development to support RE work in the GSE environment.

Paper Nr: 62
Title:

FocusST Solution for Analysis of Cryptographic Properties

Authors:

Maria Spichkova and Radhika Bhat

Abstract: To analyse cryptographic properties of distributed systems in a systematic way, a formal theory is required. In this paper, we present a theory that allows (1) to specify distributed systems formally, (2) to verify their cryptographic properties with respect to composition, and (3) to demonstrate the correctness of syntactic interfaces for specified system components automatically. To demonstrate the feasibility of the approach, we use a typical example from the domain of crypto-based systems: a variant of the Internet security protocol TLS. A security flaw in the initial version of the TLS specification was revealed using the semi-automatic theorem prover Isabelle/HOL.

Paper Nr: 67
Title:

Function References as First Class Citizens in UML Class Modeling

Authors:

Steffen Heinzl and Vitaliy Schreibmann

Abstract: Concepts of functional programming have entered widespread object-oriented languages. Modeling of functional programming concepts, or more precisely, modeling of behavior in UML class diagrams, however, has not adopted the functional concepts introduced in OO languages. Behavior modeling is still achieved only by modeling a new class containing the desired function. If several alternatives for a certain behavior have to be expressed, the modeling becomes even more complex: an interface and, for each alternative, an additional class have to be introduced (Strategy pattern). We propose a new function element that circumvents these problems and reduces the complexity of the model. Thanks to the proposed Function stereotype, functions in the model can be identified at first glance. The new model is evaluated against a more complex design pattern. A possible first implementation is presented.

Paper Nr: 68
Title:

Goal-Satisfaction Verification to Combination of Use Case Components

Authors:

Saeko Matsuura, Shinpei Ogata and Yoshitaka Aoki

Abstract: Functional requirements of a system can be specified by fundamental use cases that realize "the effective and useful scenarios in the system usage" so as to meet "the goal of the system". Ambiguous non-functional requirements against the system goal often cause uncertainty in use cases and scenarios at the early stage of software development. In this paper, from the viewpoint of the non-functional requirements included in the goal, we discuss, using an example, how to check whether the combination of functional requirements satisfies the goal during requirements analysis.

Paper Nr: 71
Title:

A Structured Approach to Support Collaborative Design, Specification and Documentation of Communication Protocols

Authors:

Fabian Ohler, Markus C. Beutel, Sevket Gökay, Christian Samsel and Karl-Heinz Krempels

Abstract: Especially in complex software development projects involving various actors and communication interdependencies, the design of communication protocols is crucially important. In this work, a structured approach to support the design, specification and documentation of communication protocol standards is presented. To do so, we refer to a complex use case dealing with the integration of multiple mobility services on a single platform. This endeavor requires the development of a large number of independently usable protocol standards which adhere to a multitude of quality aspects. A structured approach is required to speed up and simplify development and also to enable synergies between these protocols. Our requirements analysis methodology consists of interviewing domain experts to identify important aspects and shortcomings of the current development process and to elicit potential improvements. These intermediate results are prioritized and incorporated into a requirements specification for a standardized communication protocol development process. Furthermore, we assess existing software solutions in terms of their applicability.

Paper Nr: 73
Title:

Translation of Heterogeneous Requirements Meta-Models Through a Pivot Meta-Model

Authors:

Imed Eddine Saidi, Mahmoud El Hamlaoui, Taoufiq Dkaki, Nacer Eddine Zarour and Pierre-Jean Charrel

Abstract: Companies use different Requirements Engineering (RE) approaches to elicit, specify, analyse and validate their requirements in different contexts. Globalization and the rapid development of information technologies sometimes require companies to work together in order to achieve common objectives as quickly as possible. We propose a Unified Requirements Engineering Meta-model (UREM) that allows cooperation in the requirements engineering process between heterogeneous RE models. In this paper, we explore UREM as a pivot meta-model to ensure interoperability between heterogeneous RE models.

Paper Nr: 76
Title:

Toward a Better Understanding of How to Develop Software Under Stress – Drafting the Lines for Future Research

Authors:

Joseph Alexander Brown, Vladimir Ivanov, Alan Rogers, Giancarlo Succi, Alexander Tormasov and Jooyong Yi

Abstract: Software is often produced under significant time constraints. Our idea is to understand the effects of various software development practices on the performance of developers working in stressful environments, and to identify the best operating conditions for software developed under stress. To this end, we collect data through questionnaires, non-invasive software measurement tools that gather measurable data about software engineers and the software they develop without interfering with their activities, and biophysical sensors; we then try to recreate such conditions in different processes or key development practices.

Paper Nr: 81
Title:

A Methodology for Teaching Statistical Process Control in Computer Courses

Authors:

Julio Cezar Costa Furtado and Sandro Ronaldo Bezerra Oliveira

Abstract: A process considered to be under statistical control must be stable and repeatable. The importance of Statistical Process Control (SPC) for the software industry has grown in recent years, mainly due to the use of quality models. In this context, this work aims to propose a teaching methodology for SPC in which the learning process is student-centered. The methodology is composed of reading experience reports, PBL, discussion of practical cases, use of games, practical projects, and reflections on the contents learned.

Paper Nr: 83
Title:

An Approach of Text to Model Transformation of Software Models

Authors:

Olena V. Chebanyuk

Abstract: Text to model transformation is an important step in processing UML diagrams designed in software modeling environments. Modeling environments such as IBM Rational products, Eclipse Papyrus, Microsoft Visual Studio and others store UML diagrams in an XMI-compatible format. To perform UML diagram processing operations, it is necessary to restore their structure. This article outlines an approach for obtaining an analytical representation of a UML diagram that preserves information about its structure. To solve this task, solutions to the following research problems are proposed: (i) how to extract information about UML diagram elements from their XMI representation in different modeling environments (Microsoft Visual Studio and Eclipse Papyrus); (ii) how to decompose software models into chains of linked elements; (iii) how to restore a software model structure from these chains.

Paper Nr: 84
Title:

Towards Model based Testing for Software Defined Networks

Authors:

Asma Berriri, Jorge López, Natalia Kushik, Nina Yevtushenko and Djamal Zeghlache

Abstract: Software Defined Networks (SDNs) and corresponding platforms are expected to be widely used in future generation networks, especially deployed and activated on demand as agile networking control service components. The correct functioning of SDN platforms must be assured, i.e., such platforms should be thoroughly tested before deployment. Even after thorough verification of SDN controllers and switches, their composition still requires additional testing in order to guarantee the absence of critical faults. We propose a model based testing technique for checking SDN platforms that relies on appropriate graph enumeration. In particular, we define a fault model in which the fault domain contains the wrongly and correctly implemented paths allowed with respect to the underlying resource connectivity graph. We also establish the conditions for deriving a complete test suite with respect to this fault model under black box and white box testing assumptions.