A Case Study in Information Security Awareness Improvement: A Short Description
Corradini, Isabella; Nardelli, Enrico (Italy)
ABSTRACT:
In this paper we provide a short description of a training experience aimed at improving information security awareness, conducted in a multinational company operating in the electronic payments sector. The overall training effort was based on an organizational analysis and a survey on cyber risk perception involving 1164 employees. The survey pointed out the need to strengthen education and training in internal cooperation, socio-technical awareness of cyber risks, and risk profiling and management.
A Comparative Study on Early Skin Cancer Detection Using Computer Aided Diagnosis Techniques
Fernandes, Steven Lawrence; Prabhu G., Ananth; Visvesvaraya, Yogananda A.; Maheshappa, Shruthi (India)
ABSTRACT:
Skin cancers are cancers that arise from the development of abnormal cells that have the ability to invade or spread to other parts of the body. There are three main types: basal-cell cancer, squamous-cell cancer and melanoma. Among the three, melanoma spreads through metastasis and has therefore proved to be very fatal. Melanomas typically occur in the skin, and skin cancer can be identified based on melanoma images. A system for the early detection of this type of skin cancer is awaited and highly in demand. Melanomas are asymmetrical and have irregular borders, notched edges, and color variations, so analyzing the shape, color, and texture of the skin lesion is important for the early detection of melanoma. Two Computer Aided Diagnosis (CAD) techniques used for early skin cancer detection are the color constancy approach and skin lesion analysis. The key contribution of this paper is a comparative study between color constancy and skin lesion analysis for early skin cancer detection on the EDRA and PH2 databases.
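The color constancy approach mentioned here is commonly implemented with the Shades-of-Gray algorithm; the following is a minimal sketch of that idea (an illustration, not the authors' exact pipeline), assuming an RGB image supplied as a NumPy array with values in [0, 1].

```python
import numpy as np

def shades_of_gray(img, p=6):
    """Shades-of-Gray color constancy: estimate the illuminant per channel
    with a Minkowski p-norm mean (p=6 is a common choice) and divide it out,
    so lesion color features become less dependent on acquisition lighting."""
    # Per-channel illuminant estimate.
    illum = np.power(np.mean(np.power(img, p), axis=(0, 1)), 1.0 / p)
    # Normalize so a perfectly gray image is left unchanged.
    illum = illum / np.sqrt(np.sum(illum ** 2))
    corrected = img / (illum * np.sqrt(3))
    return np.clip(corrected, 0.0, 1.0)
```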
A Hybrid Evaluation Method for Pallet Quality Evaluation
Wang, Yijun; Li, Yong; Liang, Qirong; Sheng, Jiaqi; Shang, Wei (China)
ABSTRACT:
The pallet is an important element in logistics operations and the most widely used loading platform. The quality of the pallet is the basic guarantee of the smooth operation of the supply chain. In this paper, we propose a combined AHP-LCA method to evaluate the quality of the pallet. First we propose eight evaluation factors, then we determine the weight for each factor, and finally we derive an overall quantitative formula to calculate the scores for pallet quality.
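As a hedged illustration of the final scoring step, here is a minimal sketch of a weighted-sum quality score over eight factors; the factor names and weights below are hypothetical placeholders, not the paper's AHP results.

```python
# Hypothetical factors and AHP-derived weights (must sum to 1).
weights = {
    "load_capacity": 0.20, "durability": 0.18, "material_quality": 0.15,
    "dimensional_accuracy": 0.12, "moisture_resistance": 0.10,
    "cost": 0.10, "recyclability": 0.08, "ease_of_handling": 0.07,
}

def pallet_score(ratings):
    """Overall quality score: weighted sum of per-factor ratings (0-100)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[f] * ratings[f] for f in weights)

# Example: score one pallet rated 80 on every factor.
ratings = {f: 80 for f in weights}
print(pallet_score(ratings))  # -> 80.0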
A Novel Approach for Performance Improvement in Parallel Processing
Prabhu G., Ananth; Aithal, Ganesh; Ali Ahammed, G. F.; Tale, Sarika; Basthikodi, Mustafa (India)
ABSTRACT:
With the high requirements of gene sequencing in the field of scientific research, it is essential to make the sequencing process faster. One of the main sequencing operations is performed using the Smith-Waterman algorithm. This algorithm is used in two conventional ways to evaluate the matrix elements: i) sequential processing and ii) conventional parallel processing. This work considers both of these approaches and evolves a new one so that two main objectives are met: a) it should take less time to compute the matrix elements, and b) it should optimize processor usage. Even though the latter, conventional parallel processing (O(2m)), is faster than the former, sequential processing (O(mn)), this work attempts to reduce the computation time still further, with the challenge of reaching O(m), by introducing a cross-diagonal element-wise parallel processing approach. As part of the work, processor optimization for both the conventional parallel and the cross-diagonal element-wise parallel approaches has been completed with satisfactory results. The Cross Diagonal Element Wise Parallel Processing Approach (CDEWPPA) performs better than the conventional parallel approach in terms of both query execution time and speed-up ratio.
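The cross-diagonal idea exploits the fact that Smith-Waterman cells on the same anti-diagonal have no mutual dependencies, so each anti-diagonal can be computed in parallel. A minimal sequential sketch of the anti-diagonal (wavefront) traversal, assuming linear gap penalties (an illustration of the dependency structure, not the paper's implementation):

```python
def smith_waterman_antidiagonal(a, b, match=2, mismatch=-1, gap=-1):
    """Fill the Smith-Waterman matrix one anti-diagonal at a time.
    Cells on anti-diagonal i + j = d depend only on diagonals d-1 and d-2,
    so the inner loop over i could be distributed across processors."""
    m, n = len(a), len(b)
    H = [[0] * (n + 1) for _ in range(m + 1)]
    best = 0
    for d in range(2, m + n + 1):          # anti-diagonal index i + j = d
        lo, hi = max(1, d - n), min(m, d - 1)
        for i in range(lo, hi + 1):        # independent cells: parallelizable
            j = d - i
            s = match if a[i - 1] == b[j - 1] else mismatch
            H[i][j] = max(0, H[i-1][j-1] + s, H[i-1][j] + gap, H[i][j-1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman_antidiagonal("GATTACA", "GCATGCU"))
```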
A Review and Extension of the Visual Information Seeking Mantra (VISM)
Stauffer, Michael; Ryter, Remo; Dornberger, Rolf; Hil, Darjan (Switzerland)
ABSTRACT:
The original Visual Information Seeking Mantra (VISM) [1] has been extended to overcome known weaknesses [2], [3] regarding complex information handling. An improved framework, the so-called Visual Information Seeking Mantra 2.0 (VISM 2.0), is derived, supporting formerly missing tasks and improving existing ones. The proposed framework is furthermore embedded in a User-Centered Development (UCD) process in order to support a (front-end) developer throughout the whole development process of an information system.
A Survey of Mobile Cloud Computing Applications: Perspectives and Challenges
Oludele, Awodele; Oluwabukola, Otusile (Nigeria)
ABSTRACT:
As mobile computing has developed over decades, a new model for mobile computing, namely mobile cloud computing (MCC), has emerged from the marriage of powerful yet affordable mobile devices and cloud computing. MCC integrates cloud computing into the mobile environment and overcomes obstacles related to performance.
This paper gives a survey of MCC applications, including their definition and architecture, and speculates on future-generation mobile cloud computing applications. The challenges, along with existing solutions and approaches, are presented.
A Visual Analytics Technique for Identifying Heat Spots in Transportation Networks
Nistor, Marian Sorin; Pickl, Stefan Wolfgang; Zsifkovits, Martin (Germany)
ABSTRACT:
The public transportation system, as part of urban critical infrastructure, needs to increase its resilience. To do so, decision makers need to understand the network itself and be aware of critical nodes. To this end, we identified analysis tools for biological networks as an adequate basis for visual analytics. In the paper at hand we therefore adapt such techniques for transportation systems and demonstrate the benefits based on the Munich subway network. Here, visual analytics is used to identify vulnerable stations from different perspectives. The applied technique is presented step by step. We propose a network-of-networks analysis for the multiple vulnerable areas of the transportation system. Furthermore, the key challenges in applying this technique to transportation systems are identified. Finally, we propose the implementation of the presented features in a management cockpit, integrating the visual analytics mantra for advanced decision support on transportation systems.
A Web-Based System for Analyzing Electrical Impulses in Human Brain
Early, Christopher; Chan, Alexander; Garza, Jonathan; Schreiber, Gregor; Lin, Hong (United States)
ABSTRACT:
While many advances have been made towards an understanding of the human brain, its inner workings are still not well understood, and many mysteries remain. The primary objective of this project is to shed light on some of these mysteries by creating a model that can be applied to Electroencephalographic (EEG) brainwave data, with the goal of predicting what a person is doing or what is happening to them. The dependent variables of this study are the five major brainwave frequency ranges, and the independent variable is the activity being performed by the subject in question. In this paper we describe the creation of an environment wherein we can seamlessly capture and analyze EEG brainwave data using various custom-developed tools, as well as off-the-shelf software and hardware components. An essential component of this environment is a website that we developed to facilitate the collection, review, and analysis of collected EEG data. The types of analysis that can currently be performed on the data stored on the web server are wave analysis, statistical analysis, and categorical classification using a number of well-established machine learning algorithms.
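As a hedged sketch of the wave analysis step, the five major bands can be computed from raw EEG with Welch's method; the band edges below follow common convention, and the sampling rate is an assumption, not a detail from the paper.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs=256):
    """Absolute power in each major EEG band via Welch's PSD estimate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])  # integrate the PSD
    return powers

# Example with synthetic data: 10 s of noise plus a 10 Hz (alpha) tone.
t = np.arange(0, 10, 1 / 256)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(band_powers(signal))  # alpha power should dominate
```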
An Example of the Contradiction between Dynamic Optimization and the Traditional Maximization of Future Expectation
Yoshimura, Jin; Ito, Hiromu (Japan)
ABSTRACT:
In economics and monetary engineering, the optimal option is usually selected based on the expectation (arithmetic mean: AM) of future wealth or its expected utility (EU). Recently, Yoshimura et al. (2013) proposed dynamic utility optimization based on the maximization of a stochastic process, applying the optimality principle of dynamic programming. This theory indicates that the long-term optimal choice is to maximize the geometric mean (GM) of the growth rate, which is equal to the log growth rate of future wealth, i.e., a dynamic utility function. We here compare the traditional AM and EU with the GM used in dynamic utility optimization using a simple example. Our results indicate that both the traditional AM and EU criteria often result in bankruptcy, while the GM secures the long-term growth of monetary wealth. We suggest that the GM criterion should be used for long-term capital management in economics and monetary engineering.
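A worked example of the contradiction (a hedged sketch, not the paper's exact gamble): consider a repeated bet that with equal probability multiplies wealth by 1.8 or by 0.4. The AM of the growth factor is 1.1 > 1, so expectation maximization favors betting every period, yet the GM is sqrt(1.8 * 0.4) ≈ 0.85 < 1, so typical wealth almost surely decays towards ruin.

```python
import random

def simulate(n_steps=1000, n_runs=1000, up=1.8, down=0.4):
    """Median final wealth under repeated betting: AM > 1 but GM < 1."""
    finals = []
    for _ in range(n_runs):
        w = 1.0
        for _ in range(n_steps):
            w *= up if random.random() < 0.5 else down
        finals.append(w)
    finals.sort()
    return finals[len(finals) // 2]  # median (typical) outcome

am = (1.8 + 0.4) / 2        # 1.1: the expectation grows each step
gm = (1.8 * 0.4) ** 0.5     # ~0.849: the typical wealth shrinks each step
print(am, gm, simulate())   # median final wealth is essentially zero
```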
Automatic Parallelization Tool: Classification of Program Code for Parallel Computing
Basthikodi, Mustafa; Ahmed, Waseem (India)
ABSTRACT:
Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphics Processing Units, have broadly enhanced parallelism. A number of compilers have been updated to address the developing challenges of synchronization and threading. Appropriate program and algorithm classification can greatly benefit software engineers by exposing opportunities for effective parallelization. In the present work we investigate current species for the classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen that matches the structure of different issues and performs the given tasks. We have tested these algorithms utilizing existing automatic species extraction tools along with the Bones compiler. We have added functionalities to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants and mathematical functions. With this, we can retain significant data which is not captured by the original species of algorithms. We implemented the new theories into the tool, enabling automatic characterization of program code.
Benchmarking the Operating Efficiency of U.S. Regional Banks
Malhotra, D. K.; Malhotra, Rashmi; Mendoza, Ruben A. (United States)
ABSTRACT:
Banks strike a balance by achieving high profitability levels while undertaking the risks involved. Thus, understanding the factors that drive an efficiently operating bank is an important issue. This study illustrates the use of the data envelopment analysis (DEA) methodology to benchmark the operating efficiency of thirty-four U.S. regional banks during the period 2009 to 2013. DEA is a linear programming technique that determines the financial strength of a decision-making unit (bank) by comparing the relative efficiencies of all banks. Thus, using DEA to perform a comparison analysis of a bank with other banks in the industry, we can assess the standing of the bank amongst its peers. In addition, we also use DEA's slack analysis to understand the factors responsible for the poor performance of a bank. Finally, we also investigate the factors contributing to the performance of the banking industry by covering a time period spanning the start of the economic crisis and the consequent passing of a new law to regulate the financial services industry.
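For reference, a hedged sketch of the standard input-oriented CCR formulation of DEA that such benchmarking typically solves for each bank k (the paper's exact input and output choices are not reproduced here): bank k's efficiency θ is the optimum of a linear program over intensity weights λ_j across the 34 banks,

```latex
\min_{\theta,\;\lambda}\; \theta
\quad \text{s.t.}\quad
\sum_{j=1}^{34} \lambda_j x_{ij} \le \theta\, x_{ik} \;\;\forall\,\text{inputs } i,
\qquad
\sum_{j=1}^{34} \lambda_j y_{rj} \ge y_{rk} \;\;\forall\,\text{outputs } r,
\qquad
\lambda_j \ge 0,
```

where x_{ij} and y_{rj} are the inputs and outputs of bank j; a bank is efficient when θ = 1 with zero slacks.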
Call and Bandwidth Quality in Small Cell Mobile Network from End User Perspective
Aburas, Akram; Al-Mashouq, Khalid (Saudi Arabia)
ABSTRACT:
Call quality and bandwidth quality are the primary services for which mobile operators are struggling to provide the best experience. The major issue for data network quality is handling the unprecedented increase in data volume on the backhaul network, which sits between the BTS and the mobile core network. To overcome the challenge of providing high bandwidth in urban areas, mobile operators are using small cell technology. In this research, the previously proposed call quality computation is summarized, and a bandwidth quality parameter is proposed and incorporated into QMeter®, which operators can utilize to gauge the quality of the small cell network and its backhaul from the end user's perspective.
Comparative Study of Private Information Retrieval Protocols
Eltarjaman, Wisam; Annadata, Prasad (United States)
ABSTRACT:
Private Information Retrieval (PIR) techniques allow users to query and retrieve records from a database without revealing the query to the database. These techniques gain importance as more and more users rely on online services while at the same time demanding better privacy. For example, users increasingly rely on location-based services, but prefer not to reveal their exact location to the service providers. Computational PIR (cPIR) is a category of PIR that uses mathematical techniques to achieve its goals. As new mathematical techniques such as fully homomorphic encryption emerge, so does their adoption into cPIR. The increased privacy achieved using these techniques is balanced by added computational and communication costs. The effort in the research community has been trending towards making cPIR efficient enough for practical adoption. There have been several notable cPIR proposals in recent years. We picked the most promising of these proposals, implemented them and compared their performance. In this paper we present a brief survey of PIR techniques as well as our findings.
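A minimal sketch of the classic additively homomorphic cPIR pattern (a toy with deliberately tiny, insecure parameters, not one of the surveyed protocols): the client sends an encrypted selection vector, and the server returns the homomorphic inner product with the database, so it never learns which index was queried. Requires Python 3.8+ for modular inverses via pow().

```python
import math, random

# --- Toy Paillier cryptosystem (insecure parameters, illustration only) ---
p, q = 1000003, 1000033
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (L * mu) % n

# --- One cPIR round: retrieve db[target] without revealing target ---
db = [42, 7, 1337, 99]                       # server's database
target = 2
query = [encrypt(1 if j == target else 0) for j in range(len(db))]

# Server: homomorphic inner product E(sum_j q_j * db_j) = E(db[target]).
resp = 1
for c, x in zip(query, db):
    resp = (resp * pow(c, x, n2)) % n2

print(decrypt(resp))                         # -> 1337
```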
Comparison between Dynamic Utility Optimization and Dynamic Programing
Yoshimura, Jin; Ito, Hiromu (Japan)
ABSTRACT:
Dynamic programing was developed by Richard Bellman as the first model of dynamic optimization. Its core concept is the principle of optimality. Dynamic programing provides a computer algorithm to solve for the optimal strategy numerically in a given setting. Because of this limitation to numerical solutions, we cannot deduce an analytical (mathematical) theory from the solutions of dynamic programing. Recently we developed dynamic utility optimization by applying the principle of optimality to a stochastic process. Dynamic utility optimization provides an analytical solution for the problem of behavioral optimization under uncertainty. Therefore, we can develop a mathematical (analytical) theory of decision-making in humans and other animals. This looks like a large advantage over dynamic programing. However, the optimal strategy derived from dynamic utility optimization is rather simple, suggesting some differences from dynamic programing. In this report, we compare dynamic utility optimization and dynamic programing, focusing especially on the characteristics of the solutions derived from these models.
Computing and Network Systems Administration, Operations Research, and System Dynamics Modeling: A Proposed Research Framework
Totaro, Michael W. (United States)
ABSTRACT:
Information and computing infrastructures (ICT) involve levels of complexity that are highly dynamic in nature. This is due in no small measure to the proliferation of technologies, such as: cloud computing and distributed systems architectures, data mining and multidimensional analysis, and large scale enterprise systems, to name a few. Effective computing and network systems administration is integral to the stability and scalability of these complex software, hardware and
communication systems. Systems administration involves the design, analysis, and continuous improvement of the performance or operation of information and computing systems. Additionally, social and administrative responsibilities
have become nearly as integral for the systems administrator as the technical demands that have been imposed for decades. The areas of operations research (OR) and system dynamics (SD) modeling offer system administrators a rich array of analytical and optimization tools that have been developed in diverse disciplines, including the industrial, scientific, engineering, economic and financial fields, to name a few. This paper proposes a research framework by which OR and SD modeling techniques, including linear programming, network analysis, integer programming, nonlinear optimization, Markov processes, queueing modeling, simulation, decision analysis, heuristic techniques, and system dynamics modeling, may prove useful to computing and network systems administration.
Conditioning and Control of an Evaporator of Double Effect at Pilot Scale
Lorenzo, Carla; Suárez, Graciela; Caamaño, Florencia (Argentina)
ABSTRACT:
The present work aims to determine the type of instrumentation needed to automatically control a double-effect concentrator at pilot scale, belonging to the Faculty of Engineering of the National University of San Juan. The tasks executed in this work were carried out to accomplish the objectives of a research project developed in the faculty. These tasks included disassembling the equipment to obtain its technical specifications, from which the values of each process flow were calculated through mass and energy balances. From this information, the type of sensor and its working range were finally decided. In the end, all the information was laid out on a flow sheet.
Enhanced Arabic Semantic Information Retrieval System Based on Arabic Text Classification
Nazmy, T.; Elsehemy, A. (Egypt)
ABSTRACT:
Not available.
Enterprise Level Security – Basic Security Model
Foltz, Kevin E.; Simpson, William R. (United States)
ABSTRACT:
Building a secure information sharing system is challenging. Maintaining, updating, and modifying such a system based on changing enterprise needs and advancing technology is even more challenging. Decisions and informal rules that were made and enacted in the initial build are often lost, forgotten, or
ignored when changes are needed. When the original system designers have moved on, the system is entrusted to an administrator who understands how the system works but not why it was designed to work that way. Without this higher-level understanding, the secure system devolves into a collection of
loosely integrated partial solutions with security vulnerabilities at the seams and edges. This work presents a method of documenting the design logic of a secure enterprise information system, from basic principles to implementable requirements. Important design decisions are captured, along with the logic
supporting them. Before changes to the system are made, an assessment is made against the core design decisions to ensure the original security goals are maintained. This provides clarity to the system owner and administrators to help guide future changes, and it provides a way to convey security goals to product vendors in a structured and logical way, which can help to reduce the back-and-forth arguing over whether a product meets security requirements. The Enterprise Level Security (ELS) architecture is used as an example of the application of this method to a real-world security system.
Environmental Assessment in Health Care Organizations
Carnero, María Carmen (Spain)
ABSTRACT:
Health Care Organizations should set standards in encouraging environmental sustainability, since protection of the environment implies the development of preventive measures in the area of health. Despite the copious literature analysing environmental matters, in the case of Health Care Organizations there are scarcely any models to assess environmental sustainability. This paper presents a model, developed using the Fuzzy Analytic Hierarchy Process and utility theory, to assess environmental sustainability in Health Care Organizations.
Foresight: A New Paradigm in Organizational Strategy
Klakurka, Jan A. C.; Irwin, Bill (Canada)
ABSTRACT:
In this paper, the authors address the issue of strategic foresight or 'futuring', from their perspective as management professors charged with educating a new generation of ambidextrous, creative, and complexity-savvy future leaders. While foresight is not a new practice, and they point to examples of firms utilizing this technique since the 1960s, they postulate why it is not a mainstream strategy practitioner competency nor a curriculum topic at many business schools and management programs. In the process, they define strategic foresight from the strategists' perspective, illustrating key principles and deliverables attributed to the exercise. Each author shares a strategy practitioner-and-academic perspective, informing consideration of lingering disconnects between traditional strategy work and strategic foresight, and how this might change in practice going forward. Along the way, the authors postulate on sources of potential resistance to the teaching of this practice, ranging from a misunderstanding of its worth to entrenched paradigms unwilling to try something out of the 'norm'. They conclude by posing a series of questions regarding the future of strategic foresight as an embedded key component of corporate culture, its wider merit up and down the corporate hierarchy, and its potential introduction into the academy as a core teachable.
Functional Resonances in Complex Sociotechnical Systems – Industrial Planning from a Systemic Point of View
Waefler, Toni; Kohli, Bjoern; Cerny, Noemi (Switzerland)
ABSTRACT:
Industrial planning, scheduling, and control (PSC) matches dynamic market demands with potentially unstable production resources. To do so, PSC operates as a complex sociotechnical system of information flow and distributed decision-making. The present case study models an SME's PSC system by means of the Functional Resonance Analysis Method (FRAM). Results show that performance failures (e.g. wrong or late delivery of components) can emerge from the interaction of distributed decisions, each of which in itself may be perfectly reasonable from a local perspective. From this, a generic framework for improvement was developed that ensures the compatibility of distributed decisions. This generic framework refers to (a) a consistent system of integrated objectives, (b) ensuring that objectives can be influenced by the respective decision-maker, and (c) ensuring feedback regarding the progress and outcome of objective achievement.
GDV Based Imaging for Health Status Monitoring: Some Innovative Experiments and Developments
Bandyopadhyay, Asok; Chaudhuri, Amit; Mondal, Himanka Sekhar; Mukherjee, Bhaswati (India)
ABSTRACT:
New techniques using IT-based instruments are being tried everywhere to extend the reach and reliability of healthcare services. Computational bio-electro-photography based on the gas discharge visualization (GDV) technique is a novel approach for monitoring the health status of individuals. Increasing urbanization and the accompanying changes in lifestyles are leading to an escalating epidemic of several chronic diseases, including diabetes, which is among the most common diseases worldwide. The present research has established that a GDV imaging system can be used as an efficient tool in non-invasive diabetic screening. Medical imaging based systems are being deployed in various medical applications to achieve particular tasks or to work around difficulties in older technologies. In this paper a GDV-based imaging system is reported that acquires and analyzes GDV images for the prognosis of diabetic patients. Computational models and physicians' perception validate the imaging tasks, and the concepts may be used directly in biomedical measurements. Following the development of image processing algorithms, a computational model was developed based on clinical inputs from physicians to validate the developed medical imaging system. Information gain theory based machine learning algorithms are used for feature ranking. The necessary data clustering, pattern analysis and matching were done using a Support Vector Machine (SVM). The GDV image capturing system was developed and used to perform planned GDV image acquisition from 85 subjects in a diabetic camp in India. The results of the pilot study show tremendous possibilities for the development of a non-invasive medical aid for healthcare applications such as diabetes screening.
Hypertextuality in the Alexander von Humboldt Digital Library
Doherr, Detlev; Jankowski, Andreas (Germany)
ABSTRACT:
To do justice to the legacy of Alexander von Humboldt, a 19th century German scientist and explorer, an information and knowledge management system is required that preserves the author's original intent and promotes an awareness of all his relevant works. Although all of Humboldt's works can be found on the internet as digitized papers, the complexity and internal interconnectivity of the writings is not very transparent. Humboldt's concepts of interaction cannot be adequately represented by digitized papers or scanned documents alone. The Humboldt Portal is an attempt to create a new generation of digital libraries, providing a new form of interaction and synthesis between humanistic texts and scientific observation. The digital version of his documents supplies dynamic links to sources, maps, images, graphs and relevant texts in accordance with his visions, because "everything is interconnectedness".
IDS Performance Enhancement with Intelligent Predictive Packet Inspection
Masud, Mohammad M.; Al Maleki, Mohamed Saleh; Trabelsi, Zouheir (United Arab Emirates)
ABSTRACT:
Intrusion detection systems (IDS) such as Snort apply deep packet inspection to detect intrusions. Usually these are rule-based systems, where each incoming packet is matched against a set of rules. Each rule consists of two parts, namely, the rule header and the rule options. The rule header is compared
against the packet header. The rule options usually contain a signature string that is matched against the packet content using an efficient string matching algorithm. The traditional approach to IDS packet inspection works by checking a packet against the detection rules, scanning from the first rule in the set and continuing until a match is found. If no match is found, then a default rule is applied. This approach is inefficient if the number of rules is very large and the majority of packets match rules located towards the end of the rule set. In this paper, we propose an intelligent predictive technique for packet inspection based on data mining. We consider each rule in the rule set as a class. A classifier is first trained with labeled training data. Each such labeled data point contains packet header info, packet content summary info, and the corresponding class label (i.e., the rule number with which the packet matches). Then the classifier is used to classify new incoming packets. The predicted class, i.e., rule, is checked against the packet to see if the packet really matches the predicted rule. If yes, the corresponding action (i.e., alert) of the rule is taken. Otherwise (the classifier's prediction is wrong), we fall back to the traditional way of matching rules. The advantage of this intelligent predictive packet matching is that it offers much faster rule matching. We have proved both analytically and empirically that even with millions of real network traffic packets and hundreds of rules, the classifier can achieve very high accuracy, thereby making the IDS several times faster in making matching decisions.
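A hedged sketch of the predict-then-verify loop described above, using a generic scikit-learn classifier as a stand-in; the feature extraction and the port-based rule predicate are simplified placeholders, not Snort's actual rule format.

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical rule set: rule i matches packets whose destination port
# equals PORTS[i]. Real rules also match headers and content signatures.
PORTS = [22, 25, 53, 80, 443]

def matches(rule_id, packet):
    return packet["dport"] == PORTS[rule_id]

def linear_scan(packet):
    """Traditional IDS matching: first rule that matches, else default (-1)."""
    for r in range(len(PORTS)):
        if matches(r, packet):
            return r
    return -1

# Train on labeled packets: features -> rule index found by linear scan.
train = [{"dport": p, "size": s} for p in PORTS for s in (64, 512, 1500)]
X = [[pkt["dport"], pkt["size"]] for pkt in train]
y = [linear_scan(pkt) for pkt in train]
clf = DecisionTreeClassifier().fit(X, y)

def predictive_match(packet):
    """Predict the likely rule first; verify it; fall back to a full scan."""
    guess = int(clf.predict([[packet["dport"], packet["size"]]])[0])
    if guess >= 0 and matches(guess, packet):
        return guess                       # fast path: one rule check
    return linear_scan(packet)             # slow path: traditional scan

print(predictive_match({"dport": 443, "size": 800}))  # -> 4
```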
Implementation of Fingerprint Recognition System on FPGA
Gayathri, S.; Sridhar, V. (India)
ABSTRACT:
The performance of a fingerprint recognition system depends on the minutiae present in the fingerprint image. In most applications, the acquired fingerprint images are of medium quality, so minutiae may not be extracted precisely, which pulls down the performance of the fingerprint recognition system. To enhance performance, fingerprint images have to be processed before extracting minutiae. The proposed fingerprint recognition system comprises a fingerprint recognition process with a minutiae matcher, and provides better performance than the existing system. The number of stages in the fingerprint recognition process depends on the methodology adopted by the designer to improve the quality of the fingerprint image.
One of the key contributions of the proposed system is its reduced computational complexity and processing time, due to a fast and improved thinning algorithm. Thanks to the scalable hardware resources supported by Field Programmable Gate Arrays (FPGAs), further improvement in the processing time is possible.
Validation of the fingerprint recognition system is carried out in terms of the false acceptance rate (FAR) and false rejection rate (FRR). A state-of-the-art fingerprint recognition system with an improved recognition rate is implemented on a Virtex-5 FPGA development board. The system is validated on 500 samples of medium-quality fingerprint images acquired with an optical scanner from people between the ages of 5 and 65. Further, the results obtained from the hardware design are compared with those of earlier work implemented on different platforms.
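For reference, the two validation metrics are conventionally defined as follows (a standard formulation, not specific to this paper):

```latex
\mathrm{FAR} = \frac{\text{impostor attempts accepted}}{\text{total impostor attempts}},
\qquad
\mathrm{FRR} = \frac{\text{genuine attempts rejected}}{\text{total genuine attempts}} .
```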
Increasing Potential of Valorization in Technical Universities through Internationalization
Zeps, Artūrs; Iljins, Juris; Ribickis, Leonīds (Latvia)
ABSTRACT:
Valorization, the creation of new products and services through the innovation process, is an important task for all technical universities that want to maintain a strong bond with industry and capitalize on this process. This task is especially topical for technical universities in the Baltic States, which have deepened this activity during the last decades. Since science and valorization can no longer be performed within the borders of one single university, there is a need for collaboration and internationalization. This means establishing new contacts with other leading universities, joining different networks and working jointly with international partners. The task of management is then to monitor the outcomes of valorization and the efforts of internationalization to determine whether the university reaches its aims.
This paper analyzes the importance of valorization for Baltic technical universities and indicates the importance of internationalization in promoting valorization as a process. The paper also identifies indicators that could serve as measurements of the valorization process and introduces the potential of IT systems for supporting this task.
The purpose of the study is to analyze the importance of internationalization for valorization in Baltic technical universities and to introduce a monitoring system for this process.
Influence of Intelligent Transportation System in a Road Infrastructure
Porto, Marcelo Franco; Carvalho, Izabela Ribas Vianna de; Baracho Porto, Renata Maria A. (Brazil)
ABSTRACT:
Over the past 30 years, Intelligent Transportation Systems (ITS) have been developed around the world in different ways, designed to reach new transport patterns, and with this the concept has been gaining credibility among users. With this technology, the goal is to establish communication between user, vehicle and infrastructure in an integrated manner and thus provide a more sustainable and efficient transport supply. This paper introduces the basic concepts of ITS and its architecture, and reviews the most recent studies in the area. Proper use of this system can bring gains in sustainability, mobility and security, representing a breakthrough for the transport system.
Management Cockpits: Concept, Benefits and Challenges
Frei, Michael; von Bergen, Pascal; Boernert, Eric; Dornberger, Rolf; Hil, Darjan (Switzerland)
ABSTRACT:
Management cockpits allow businesses to condense all available data in real time into strategically important information. They pre-process information and thus reduce the complexity of managing a business. Employing principles of cybernetics, managers immediately get feedback on the decisions they have taken and can manage the business accordingly. Despite the obvious potential of this approach, only a few companies have introduced a management cockpit. This paper identifies the main benefits of such a concept as well as the general requirements for successfully introducing and using the management cockpit approach in a company. The inhibiting factors that keep management cockpits from wider use are researched and discussed. A solution approach for how to integrate management cockpits in businesses and companies is presented.
Matching Composite Sketches with Images Captured from Drone
Fernandes, Steven Lawrence; Bala, G. Josemin (India)
ABSTRACT:
Pencil sketches used for matching by police officers worldwide suffer from exaggeration. A skilled police sketch artist is required to draw pencil sketches; such artists need specialized training, and matching these pencil sketches against the large number of images available in a police database is a time-consuming task. Moreover, a police database will not contain images of first-time offenders. To overcome these problems we use composite sketches. Composite sketches are generated using a computer and do not require a skilled artist, so they can be produced very quickly from the eyewitness's description. Since a police database cannot contain images of first-time offenders, we use images captured by an unmanned aerial vehicle in the area where the offender is likely to be present. The images captured by the unmanned aerial vehicle (a Phantom 3 Professional), also known as a drone, are matched using Multiscale Circular Feature Extraction and Boosting. In this method, local information around fiducial features remains unchanged, but there may be variation between digital face images and composites.
Mathematical Theory of Development (presentation)
Korzhova, V. N.; Ivanov, V. V. (United States)
ABSTRACT:
The paper presents the main notions, mathematical models, and applications of a novel branch of mathematics called the Mathematical Theory of Development.
Maximum Period Non-Binary Key Sequence Generation for Image Encryption/Decryption
Sudeepa, K. B.; Aithal, Ganesh (India)
ABSTRACT:
A stream cipher is a binary cipher system if the operation on plaintext and key is carried out bit by bit. Word-by-word operation on key and plaintext is known as a word-oriented stream cipher system. When the encryption/decryption operation is performed on a key and plaintext that are neither word oriented nor binary, i.e., the alphabet size is not of the form 2^n, the system is known as a non-binary, non-word-oriented stream cipher system. Such a system can be defined over any finite field GF(p), where p is a prime integer, and further extended over the ring Z_M, where M is a composite integer.
The effectiveness of a non-binary stream cipher system depends on several parameters, mainly the encryption/decryption algorithm, but also key generation and its parameters. In this work, a non-binary, non-word-oriented key sequence of maximum length is generated using a four-stage feedback shift register. The generated key sequence is used in an RNS-based additive cryptosystem (encryption/decryption), and the effectiveness of the maximum-length key sequence on cipher systems is discussed.
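A minimal sketch of a four-stage feedback shift register over GF(p), with the hypothetical choice p = 5 (not the paper's parameters): it searches for feedback taps whose state cycle has the maximum period p^4 - 1 = 624, i.e., taps corresponding to a primitive feedback polynomial, and then emits the non-binary key sequence.

```python
from itertools import product

P, STAGES = 5, 4
MAX_PERIOD = P ** STAGES - 1          # 624 nonzero states

def period(taps, seed=(0, 0, 0, 1)):
    """Length of the state cycle of the LFSR with the given feedback taps."""
    state, start, count = seed, seed, 0
    while True:
        fb = sum(t * s for t, s in zip(taps, state)) % P
        state = (fb,) + state[:-1]    # shift in the feedback symbol
        count += 1
        if state == start:
            return count

# Brute-force search for taps giving a maximum-length sequence.
for taps in product(range(P), repeat=STAGES):
    if taps[-1] != 0 and period(taps) == MAX_PERIOD:
        break
print("maximum-length taps:", taps)

def keystream(taps, seed=(0, 0, 0, 1), length=10):
    """Non-binary key symbols in GF(5) produced by the register."""
    state, out = seed, []
    for _ in range(length):
        out.append(state[-1])
        fb = sum(t * s for t, s in zip(taps, state)) % P
        state = (fb,) + state[:-1]
    return out

print(keystream(taps))
```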
Measuring Human Emotions with Modular Neural Networks
Albu, Veaceslav (Moldova)
ABSTRACT:
In this paper, we propose a hybrid architecture for the detection of human emotions. The architecture represents an effective tool for real-time processing of customers' behaviour for distributed on-land systems, such as kiosks and ATMs. The proposed approach combines recent biometric techniques with a neural network (NN) approach for real-time emotion and behavioural analysis. The architecture of the system combines radial basis function neural networks with self-organised maps (RBF-SOM).
Monitoring Heart Health and Structural Health: mDFA Quantification
Yazawa, Toru (Japan)
ABSTRACT:
The aim of this study was to develop a method for the early detection of malfunction, e.g., abnormal vibration/fluctuation in recorded signals. We conducted experiments in heart health and structural health monitoring, collecting natural signals such as heartbeat fluctuation and mechanical vibration. For the analysis, we used the modified detrended fluctuation analysis (mDFA) method that we have developed recently. mDFA calculates the scaling exponent (SI) from time series data, e.g., the R-R interval time series obtained from electrocardiograms. In the present study, peaks were identified by our own method. In every single mDFA computation, we identified ~2000 consecutive peaks from a data set; 2000 was the number necessary to conduct mDFA. mDFA was able to distinguish between normal and abnormal behaviors: normal healthy hearts exhibited an SI around 1.0, a phenomenon comparable to 1/f fluctuation. Job-related stressful hearts and extrasystolic hearts both exhibited a low SI, such as 0.7. A normally running car's vibration (recorded as steering wheel vibration) exhibited an SI around 0.5, which is white-noise-like fluctuation. Normally spinning ball bearings (BB) exhibited an SI around 0.1, which belongs to anti-correlation phenomena. A malfunctioning BB showed an increased SI; at an SI value over 0.2, an inspector must check the BB's correct functioning. Here we propose that healthiness in various cyclic vibration behaviors can be quantitatively analyzed by mDFA.
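For orientation, a minimal sketch of conventional DFA computing the scaling exponent; the authors' mDFA modifies this procedure, and its details are not reproduced here.

```python
import numpy as np

def dfa_exponent(intervals, windows=(4, 8, 16, 32, 64)):
    """Scaling exponent (SI) via detrended fluctuation analysis:
    the slope of log F(n) vs log n, where F(n) is the RMS of the
    linearly detrended integrated series over windows of size n."""
    x = np.asarray(intervals, dtype=float)
    y = np.cumsum(x - x.mean())                # integrated profile
    fluct = []
    for n in windows:
        k = len(y) // n
        f2 = []
        for i in range(k):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear fit
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(windows), np.log(fluct), 1)
    return slope

# White noise should give SI ~ 0.5 (car-vibration-like in the study).
print(dfa_exponent(np.random.randn(2000)))
```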
Multi-User Virtual Reality Therapy for Post-Stroke Hand Rehabilitation at Home
Tsoupikova, Daria; Triandafilou, Kristen; Thielbar, Kelly; Rupp, Greg; Preuss, Fabian; Kamper, Derek (United States)
ABSTRACT:
Our paper describes the development of a novel multi-user virtual reality (VR) system for post-stroke rehabilitation that can be used independently in the home to improve upper extremity motor function. This is the pre-clinical phase of an ongoing collaborative, interdisciplinary research project at the Rehabilitation Institute of Chicago involving a team of engineers, researchers, occupational therapists and artists. This system was designed for creative collaboration within a virtual environment to increase patients’ motivation, further engagement and to alleviate the impact of social isolation following stroke. This is a low-cost system adapted to everyday environments and designed to run on a personal computer that combines three VR environments with audio integration, wireless Kinect tracking and hand motion tracking sensors. Three different game exercises for this system were developed to encourage repetitive task practice, collaboration and competitive interaction. The system is currently being tested with 15 subjects in three settings: a multi-user VR, a single-user VR and at a tabletop with standard exercises to examine the
level of engagement and to compare resulting functional performance across methods. We hypothesize that stroke survivors will become more engaged in therapy when training with a multi-user VR system and this will translate into greater gains.
Parallel Implementation of Modified Apriori Algorithm on Multicore Systems
Maheshappa, Shruthi; Basthikodi, Mustafa; Prabhu G., Ananth (India)
ABSTRACT:
Data mining is the process of automatically exploring humongous stores of data to discover patterns and trends that go beyond simple analysis. Data mining uses sophisticated mathematical algorithms to segment the data and assess the probability of future actions. The Apriori algorithm is a dominant algorithm for mining frequent itemsets for Boolean association rules. Apriori uses a bottom-up approach, where frequent sets are extended one item at a time, and with it we can find frequent itemsets. The algorithm is applied to supermarket transactions: this paper is an attempt to extract frequent itemsets from a supermarket using the Hadoop framework.
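A minimal single-machine sketch of the Apriori bottom-up pass (the paper distributes this work over Hadoop; the basket data below is hypothetical):

```python
from itertools import combinations

def apriori(transactions, min_support=2):
    """Frequent itemsets by extending candidates one item at a time;
    any (k+1)-candidate with an infrequent k-subset is pruned."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}
    frequent = {}
    while current:
        counts = {c: sum(c <= t for t in transactions) for c in current}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Candidate generation: union pairs, keep size k+1, prune by subsets.
        keys = list(level)
        k = len(next(iter(keys))) if keys else 0
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        current = {c for c in current
                   if all(frozenset(s) in level for s in combinations(c, k))}
    return frequent

basket = [["milk", "bread"], ["milk", "diapers"],
          ["milk", "bread", "diapers"], ["bread", "beer"]]
print(apriori(basket))
```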
Putting on the Troll Suit: Governmental Strategies in the Cyber-Surveillance of MMOs
Salihu, Flurije (United States)
ABSTRACT:
In this paper, I address the current governmental processes of gathering and disseminating information collected online, especially in MMOs (Massively Multiplayer Online games). The lack of cohesion and clarity in the relevant regulations, as well as the ineptness displayed by agencies surveilling MMOs, have resulted in both anxiety and ridicule from the general public. What I suggest is that the members of the Intelligence Community 1) clarify and publicize their information-gathering policies, 2) recruit gaming insiders to educate them on the structure and capabilities of the communities they inhabit and 3) refocus their attention from just hacking attempts and monitoring isolated conversations between players to the real communicative power of MMOs: the networked relationships that evolve among players.
Q-Learning Multi-Objective Sequential Optimal Sensor Parameter Weights
Cohen, Raquel; Rahmes, Mark; Fox, Kevin; Lemieux, George (United States)
ABSTRACT:
The goal of our solution is to deliver trustworthy decision-making analysis tools which evaluate situations and the potential impacts of decisions through acquired information, and add efficiency to continuing mission operations and analyst information. We discuss the use of cooperation in modeling and simulation and show quantitative results for design choices in resource allocation. The key contribution of our paper is to combine remote sensing decision making with Nash Equilibrium for sensor parameter weighting optimization. By calculating all Nash Equilibrium possibilities per period, optimization of sensor allocation is achieved for overall higher system efficiency. Our tool provides insight into the most important or optimal weights for sensor parameters and can be used to tune those weights efficiently.
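As background for the title's Q-learning component, a generic sketch of the standard tabular update (the paper's state and action encoding of sensor-parameter weights is not reproduced; states, actions and rewards here are placeholders):

```python
import random
from collections import defaultdict

ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2   # learning rate, discount, exploration
Q = defaultdict(float)              # Q[(state, action)] -> value

def choose(state, actions):
    """Epsilon-greedy choice over candidate sensor-weight adjustments."""
    if random.random() < EPS:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state, actions):
    """Q-learning: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next
                                   - Q[(state, action)])
```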
Recognition of Japanese Sign Language Words Represented by Both Arms Using Multi-Stream HMMs
Igari, Shinnpei; Fukumura, Naohiro (Japan)
ABSTRACT:
We have been studying a Japanese Sign Language (JSL) recognition system. In our previous research, we focused only on JSL words performed by the movement of the dominant arm. We showed that the recognition rate improved when the number of HMM states was set to the number of via-points extracted from the trajectory data using the minimum jerk model. In this study, we deal with the recognition of JSL words performed with both arms. First, we classified the JSL movements into three categories. Because the recognition results for the two arms' movements have to be integrated, we use multi-stream HMMs, which are commonly suited to the recognition of multi-modal data. We tested the stream weight of the multi-stream HMM for each JSL word category and the number of HMM states for the left arm movement. In summary, the recognition rate reached 90.2% by cross-validation on 160 JSL words measured with a three-dimensional electromagnetic tracking system from 15 experienced signers. These results suggest that suitable stream weights of a multi-stream HMM should be set in consideration of the properties of the data. Furthermore, a synchronous multi-stream HMM is effective when there is strong correlation between the data to be integrated.
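The multi-stream combination referred to here is conventionally expressed by weighting each stream's observation likelihood (a standard formulation, with the stream weights being those tested per word category):

```latex
b_j(\mathbf{o}_t) \;=\; \prod_{s=1}^{S} \bigl[\, b_{js}(\mathbf{o}_{st}) \,\bigr]^{\gamma_s},
\qquad \gamma_s \ge 0,
```

with S = 2 streams here, one per arm; raising a stream's weight γ_s increases its influence on the combined state likelihood.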
Risk Management in Medical Software
Taliga, Miklos; Balla, Katalin (Hungary)
ABSTRACT:
In the following article, we aim to examine and analyze the efficacy of product-related risk management directives in the development of medical devices, along with the resource requirements and the reliability of risk management. We will also inspect the development methodologies suggested for keeping errors at a low level, detailing measures which help in analyzing and eliminating the causes of errors and in achieving a higher rate of error detection. We focus our attention on a real case and on the definition of concepts laid out in the related standards. In doing so, we will analyze the adaptability of these concepts, and of processes based on them, to the various development methodologies used in the development of medical software. Through the concepts of risk and hazard, and using related formulas, we will inspect whether the time and resources allocated to risk management and assessment are proportional to the expected final quality level of a given product. We will summarize the results and propose an optimization algorithm. We will also suggest a way of reducing risk even before development, during the design phase, and we investigate how much a possible risk adds to the required resources during the development and testing period.
Securing the Cloud through Utility Virtual Machines
Denz, Robert; Taylor, Stephen (United States)
ABSTRACT:
The advent of hypervisors has revolutionized the computing industry with new technologies for malware prevention and detection, secure virtual machine managers, and cloud resilience. However, this has resulted in a disjointed response to handling known threats rather than preventing unknown zero-day threats. This paper introduces an alternative paradigm for cloud computing, utility virtual machines, that leverages the rapid advancement of multi-core processors with virtualization hardware. Utilizing hardware-supported guest-virtual isolation, we introduce hypervisor designs that blur or eliminate the distinction between the hypervisor and the guest operating system's kernel. Initial performance evaluations of our prototypes are presented together with a summary of how the methods compare to other systems.
SIGMATA: Storage Integrity Guaranteeing Mechanism against Tampering Attempts for Video Event Data Recorders
Kwon, Hyuckmin; Kim, Seulbae; Lee, Heejo (South Korea)
ABSTRACT:
The usage and market size of video event data recorders (VEDRs), also known as car black boxes, are rapidly increasing. Since VEDRs can provide more visual information about car accident situations than any other device currently used for accident investigations (e.g., closed-circuit television), the integrity of VEDR contents is important to any meaningful investigation. Researchers have focused on file system integrity or photographic approaches to integrity verification. However, unlike other general data, the video data in VEDRs exhibit a unique I/O behavior in that the videos are stored chronologically. In addition, the owners of VEDRs can manipulate unfavorable scenes after accidents to conceal their recorded behavior. Since prior art does not consider the time relationship between frames and fails to discover frame-wise forgery, a more detailed integrity assurance is required. In this paper, we focus on the development of a frame-wise forgery detection mechanism that resolves the limitations of previous mechanisms. We introduce SIGMATA, a novel storage integrity guaranteeing mechanism against tampering attempts for VEDRs. We describe its operation, demonstrate its effectiveness for detecting possible frame-wise forgery, and compare it with existing mechanisms. The result shows that the existing mechanisms fail to detect any frame-wise forgery, while our mechanism thoroughly detects every frame-wise forgery. We also evaluate its computational overhead using real VEDR videos. The results show that SIGMATA indeed discovers frame-wise forgery attacks effectively and efficiently, with an encoding overhead of less than 1.5 milliseconds per frame.
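A hedged sketch of the general idea of chronologically chained per-frame integrity (an illustration of frame-wise chaining, not SIGMATA's actual construction): each frame's tag binds both its content and the previous tag, so reordering, dropping, or altering any frame breaks verification from that point onward.

```python
import hmac, hashlib

KEY = b"device-secret-key"   # hypothetical key held inside the VEDR

def sign_frames(frames):
    """Tag frame i with HMAC(key, tag_{i-1} || frame_i): a chained MAC."""
    tags, prev = [], b"\x00" * 32
    for frame in frames:
        tag = hmac.new(KEY, prev + frame, hashlib.sha256).digest()
        tags.append(tag)
        prev = tag
    return tags

def verify_frames(frames, tags):
    """Recompute the chain; report the first frame where integrity fails."""
    prev = b"\x00" * 32
    for i, (frame, tag) in enumerate(zip(frames, tags)):
        expect = hmac.new(KEY, prev + frame, hashlib.sha256).digest()
        if not hmac.compare_digest(expect, tag):
            return i        # forgery detected at frame i
        prev = tag
    return None             # chain intact

frames = [b"frame0", b"frame1", b"frame2"]
tags = sign_frames(frames)
frames[1] = b"tampered"
print(verify_frames(frames, tags))   # -> 1
```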
Simplifying the Complexity for Vehicle Health Management System
Wahl, Harald; Naz, Erum; Kaufmann, Christian; Mense, Alexander (Austria)
ABSTRACT:
Numerous casualties are caused by traffic accidents that result from the malfunctioning of car components. Some vehicle health monitoring systems are available which diagnose faults, but these mostly work in isolation. The main challenge of this work is to provide an intelligent and integrated automatic fault diagnosis system at low cost, which can significantly reduce the accident rate. The purpose of the project is to monitor the health of vehicles, to make vehicles more reliable, and to strengthen the driver's confidence while driving, by acquiring data from the vehicle's built-in sensors as well as from additional sensor devices that are not built in.
Some Challenges in Mobile Context-Aware Applications for Courses in Academia
Voytenko, Volodymyr (Canada)
ABSTRACT:
In this paper, some challenges of writing context-aware applications for our courses in academia are considered. Different types of context-aware applications are suggested as individual case studies, with short descriptions, analysis, and characteristics.
Study Proposal on Cognitive Database Recovery Techniques
Alhayyan, Khalid N. (Saudi Arabia)
ABSTRACT:
A database may be vulnerable to various types of failures and/or errors, which may hinder its normal operations. Therefore, appropriate database recovery techniques are essential requirements for any database system. In response to many research calls, this paper introduces a research study proposal for extracting the most effective cognitive database recovery techniques that individual DBAs may utilize during the activity of recovering a failed database. Human cognitive theory is drawn upon to guide the conduct of the research. Verbal protocol methods are employed as a means of capturing the cognitive techniques utilized by individual DBAs, toward answering the research questions. The anticipated findings are reported at the end of the paper.
The Attorney’s Role in Cyber Security Compliance
Villegas, Alejandro (United States)
ABSTRACT:
Cyber security has become a predominant challenge for organizations responsible for protecting and safeguarding customer data. Attorneys serve a critical function in ensuring that companies adhere to the cyber security requirements mandated by local, national, international and industry information security frameworks. The purpose of this article is to provide an overview of the attorney's role in cyber security compliance, emphasizing the focus areas where legal counsel plays an imperative part. Attorneys can reference this paper to better understand how to perform cyber security compliance due diligence for their clients. As cyber security attacks continue to evolve into sophisticated threats, attorneys must zealously advise their clients to prevent inadvertent negligent behavior regarding cyber security compliance, which could cause long-term adverse repercussions.
The Concept of Information Sharing Behaviors in Complex Organizations: Research in Latvian Enterprises
Cekuls, Andrejs (Latvia)
ABSTRACT:
The purpose of this paper is to explore the factors influencing information sharing behaviors in complex organizations. Evaluation of previous studies on the information turnover process and on the role of organizational culture in the competitive intelligence of the Latvian business environment indicated that employees of Latvian enterprises lack the incentive to share information.
The tasks of the study were to review scientific sources and to study the aspects influencing information sharing habits in complex organizations. For this particular study, the focus group was selected as the most appropriate data collection method for high-quality research.
To find out individuals' opinions and attitudes, two focus group discussions were carried out. Members from various industries and with different employment periods were included in the discussion groups. In aggregate, the opinions of employees from 41 different companies were summarized regarding the aspects affecting the process of information sharing in organizations.
The results show that the factors that influence the sharing of information are closely related to values: interpersonal trust, organizational trust, organizational identification, support, fairness, etc. The results of the discussions showed that it is important for a manager to be aware of the factors affecting the performance of the organization. To identify the need for changes, a manager should follow events in the environment and analyze the extent to which they affect the performance of the organization.
Complexity science suggests that readiness for change emerges when the system is far from equilibrium, where tension forces changes to be accepted.
The MACOSC-IASC Collaboration Fund. A Complex Systems Framework to Address One of Mexico’s National Problems
Orozco y Orozco, Octavio (Mexico)
ABSTRACT:
The MACOSC-IASC Collaboration Fund (MICF) is a complex systems framework focused on the use of venture capital (VC) and open source software (OSS) technology to sustainably increase productivity of micro, small and medium-sized enterprises (MSMEs); to increase the production of high technology goods with high technological content created in Mexico; and to generate youth self-employment. Using Complexity Science and agent-based modeling, the MICF was designed for organizational practice, which makes use of VC services and OSS technology adoption in very small software development entities (VSEs). These endeavor to sustainably increase the productivity of MSMEs, a need clearly identified in Mexico. The MICF as a business model provides guidance to entrepreneurial postgraduates to launch their own VSE and contribute to the competitiveness of MSMEs. This would take MSMEs one step further towards the knowledge economy.
The Pseudonym on the Internet: Identity Creation and Space of Freedom
Martin, Marcienne (France)
ABSTRACT:
New information and communication technologies (ICT) are located out of time and out of space. Indeed, the permanent connectivity of people through digital (binary) interfaces is at the origin of the implementation of completely new paradigms. Anonymity and privacy are two phenomena that oppose one another with respect to both the social practices and the concepts involved; they have a huge impact on the organizational structure of the various civil societies in the world. In addition, while the digital society is rooted in civil society, it does not duplicate it. Moreover, on the Internet users identify themselves by creating and using pseudonyms in relationships without any hierarchy, while in civil society naming is subject to the law.
Copyright © 2016 by International Institute of Informatics and Systemics
Published by International Institute of Informatics and Cybernetics