Friday, February 10, 2017

DEVELOPMENT AND IMPLEMENTATION OF MOBILE GEAR FOR SKIER TRAINING SYSTEM

Author : Ayman M. Abdalla, Mohammad M. Abdallah and Mosa I. Salah
  

Volume, Issue, Month, Year :
  Vol 8, No 1, January, 2017

ABSTRACT

The purpose of this study is to develop a Skier Training System for monitoring skiing courses and analyzing skier posture using smartphones. The system is composed of three elements: first, operating servers that receive information on the skier’s exercise; second, Skier Training Apps developed using Java and C based on Android SDK 4.2 or above; and third, Data Gathering Gear embedded with gyroscope sensors and GPS for measuring skier stance. When a skier wearing our Data Gathering Gear runs down a slope, his or her stance is uploaded to the operating servers. The servers then process the recorded data into measures such as velocity and turn trajectory in order to suggest body angles that are optimized for the skier; subsequently, the skier can improve his or her skills by checking his or her records and then attempting to correct his or her stances based on the suggestions offered by the Skier Training Apps.
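As a rough illustration of the kind of server-side processing described above, the sketch below estimates velocity from consecutive GPS fixes using the haversine distance; the sample structure and field layout are assumptions for illustration, not the authors' implementation.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def velocities(fixes):
    """fixes: list of (timestamp_s, lat, lon) samples uploaded by the gear."""
    out = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        dt = t1 - t0
        if dt > 0:
            out.append(haversine_m(la0, lo0, la1, lo1) / dt)  # speed in m/s
    return out

print(velocities([(0, 46.0000, 7.0000), (1, 46.0001, 7.0001), (2, 46.0003, 7.0002)]))
```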


A BRIEF PROGRAM ROBUSTNESS SURVEY

Author : Ayman M. Abdalla, Mohammad M. Abdallah and Mosa I. Salah
  

Volume, Issue, Month, Year :
  Vol 8, No 1, January, 2017

ABSTRACT

Program robustness is now more important than ever because of the role software programs play in our lives. Many papers have defined it, measured it, and put it into context. In this paper, we explore the different definitions of program robustness and the different types of techniques used to achieve or measure it. Of the many papers about robustness, we chose those that clearly discuss program or software robustness. These papers state that program (or software) robustness indicates the absence of ungraceful failures. There are different types of techniques used to create or measure a robust program; however, there is still wide scope for research in this area.




LEVERAGING CLOUD BASED BIG DATA ANALYTICS IN KNOWLEDGE MANAGEMENT FOR ENHANCED DECISION MAKING IN ORGANIZATIONS

Author : Mohammad Shorfuzzaman
  

Volume, Issue, Month, Year :
  Vol 8, No 1, January, 2017

ABSTRACT

In the recent past, big data opportunities have gained much momentum for enhancing knowledge management in organizations. However, because of properties such as high volume, variety, and velocity, big data can no longer be effectively stored and analyzed with traditional data management techniques to generate value for knowledge development. Hence, new technologies and architectures are required to store and analyze this big data through advanced data analytics and, in turn, generate vital real-time knowledge for effective decision making by organizations. More specifically, it is necessary to have a single infrastructure which provides the common functionality of knowledge management and is flexible enough to handle different types of big data and big data analysis tasks. Cloud computing infrastructures capable of storing and processing large volumes of data can be used for efficient big data processing because they minimize the initial cost of the large-scale computing infrastructure demanded by big data analytics. This paper aims to explore the impact of big data analytics on knowledge management and proposes a cloud-based conceptual framework that can analyze big data in real time to facilitate enhanced decision making intended for competitive advantage. Thus, this framework will pave the way for organizations to explore the relationship between big data analytics and knowledge management, which are mostly deemed two distinct entities.


A SEMI-BLIND WATERMARKING SCHEME FOR RGB IMAGE USING CURVELET TRANSFORM

Author : Ranjeeta Kaushik , Sanjay Sharma and L. R. Raheja
  

Volume, Issue, Month, Year :
  Vol 7, No 1, January, 2017

ABSTRACT

In this paper, a semi-blind watermarking technique for embedding a color watermark using curvelet coefficients of an RGB cover image is proposed. The technique uses the human visual system (HVS) property that the human eye is not very sensitive to blue, so the blue color plane of the cover image is used as the embedding domain. A bit-plane method is also used: the most significant bit (MSB) plane of the watermark image is used as the embedded information. A selected scale and orientation of the curvelet coefficients of the blue channel in the cover image are used for embedding the watermark information, and the remaining bit planes of the watermark are used as a key at extraction time. The results of the watermarking scheme have been analyzed with different quality assessment metrics such as PSNR, Correlation Coefficient (CC) and Mean Structural Similarity Index Measure (MSSIM). The experimental results show that the proposed technique gives good watermark invisibility, good quality of the extracted watermark, and robustness against different attacks.
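A minimal sketch of the bit-plane handling described above is shown below: it extracts the MSB plane of a watermark and the blue plane of an RGB cover image, then performs a simple additive embedding as a spatial-domain stand-in for the curvelet step (a real implementation would modify selected curvelet coefficients instead). The array shapes and strength factor are assumptions for illustration.

```python
import numpy as np

# Hypothetical 8-bit RGB cover image and grayscale watermark (assumed shapes).
cover = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
watermark = np.random.randint(0, 256, (64, 64), dtype=np.uint8)

blue = cover[:, :, 2].astype(float)                     # blue plane: least perceptible to the HVS
msb_plane = (watermark >> 7) & 1                        # MSB plane of the watermark (embedded info)
key_planes = [(watermark >> b) & 1 for b in range(7)]   # remaining bit planes kept as the extraction key

# Stand-in for the curvelet-domain step: additively embed the MSB plane into a
# sub-block of the blue plane with a small strength factor.
alpha = 4.0
blue[:64, :64] += alpha * msb_plane
watermarked = cover.copy()
watermarked[:, :, 2] = np.clip(blue, 0, 255).astype(np.uint8)
print(watermarked.shape, int(msb_plane.sum()), len(key_planes))
```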

INFERENCE BASED INTERPRETATION OF KEYWORD QUERIES FOR OWL ONTOLOGY

Author : Noman Hasany and Batool Alwatban 
  

Volume, Issue, Month, Year :
  Vol.8, No.1, January 2017

ABSTRACT

Most of the systems presented to date deal with the RDF format, so they are limited in addressing the knowledge base features of an ontology based on OWL semantics. There is now a need for actual OWL features, i.e. rules and axioms, to be addressed in order to give precise answers to user queries. This paper presents an interface to OWL ontologies which also considers axioms and restrictions, using inference to help understand user queries and to select appropriate SPARQL queries for better interpretation and answers.
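As a minimal illustration of the final step (running a selected SPARQL query against an OWL ontology), the sketch below uses the rdflib library; the file, namespace and property names are hypothetical, and plain rdflib does not perform the OWL inference the paper relies on, so a reasoner would normally be applied first.

```python
from rdflib import Graph

g = Graph()
g.parse("university.owl", format="xml")   # hypothetical OWL ontology file (RDF/XML)

# Keyword query "professors teaching databases" interpreted as a SPARQL query.
sparql = """
PREFIX ex: <http://example.org/university#>
SELECT ?prof WHERE {
    ?prof a ex:Professor .
    ?prof ex:teaches ?course .
    ?course ex:name "Databases" .
}
"""
for row in g.query(sparql):
    print(row[0])   # each matching professor URI
```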

ANALYSIS OF POSSIBILITY OF GROWTH OF SEVERAL EPITAXIAL LAYERS SIMULTANEOUSLY IN GAS PHASES FRAMEWORK ONE TECHNOLOGICAL PROCESS. ON POSSIBILITY TO CHANGE PROPERTIES OF EPITAXIAL LAYERS

Author : E.L. Pankratov and E.A. Bulaeva
  

Volume, Issue, Month, Year :
  Vol.6, No.1, January 2017

ABSTRACT

We analyzed a nonlinear model, with coefficients varying in space and time, of the growth of epitaxial layers from the gas phase in a vertical reactor, taking natural convection into account. We formulate several conditions for increasing the homogeneity of epitaxial layers by varying the parameters of the technological process.

Tuesday, February 7, 2017

A SUCCINCT PROGRAMMING LANGUAGE WITH A NATIVE DATABASE COMPONENT

Author :  Hanfeng Chen and Wai-Mee Ching
  

Volume, Issue, Month, Year :
 Vol 7, No 1, January, 2017

ABSTRACT

ELI is a succinct interactive programming language system organized around a few simple principles. Its main data structures include arrays, lists, dictionaries and tables. In addition, it has an integrated database management component which is capable of processing a basic set of SQL statements. ELI, with a compiler covering the array portion of the language, is also an excellent tool for coding scientific and engineering solutions. Moreover, it is productive for writing complex applications as well, such as compilers and trading systems.



http://allconferencecfpalerts.com/cfp/view-paper.php?eno=871

LIGHTWEIGHT MOBILE WEB SERVICE PROVISIONING FOR THE INTERNET OF THINGS MEDIATION

Author :  Mohan Liyanage, Chii Chang and Satish Narayana Srirama
  

Volume, Issue, Month, Year :
  Vol.8, No.1, January 2017

ABSTRACT

Emerging sensor-embedded smartphones have motivated mobile Internet of Things research. With integrated embedded hardware and software sensor components and mobile network technologies, smartphones are capable of providing various environmental context information via embedded mobile device-hosted Web services (MWS). MWS enhance the capability of various mobile sensing applications such as mobile crowdsensing, real-time mobile health monitoring, mobile social networks in proximity, and so on. Although recent smartphones are quite capable in terms of mobile data transmission speed and computation power, frequent use of high-performance multi-core mobile CPUs and high-speed 3G/4G mobile Internet data transmission quickly drains the battery of the mobile device. Although numerous previous researchers have tried to overcome the resource-intensive issues in the mobile embedded service provisioning domain, most of the efforts were constrained by the underlying resource-intensive technologies. This paper presents a lightweight mobile Web service provisioning framework for mobile sensing which utilises protocols that were designed for constrained Internet of Things environments. The prototype experimental results show that the proposed framework can provide higher throughput and lower resource consumption than traditional mobile Web service frameworks.


SMARTPHONE PREVENTIVE CUSTOMIZED POWER SAVING MODES

Author :  Ahmed Sameh and Abdulla Al-Masri
  

Volume, Issue, Month, Year :
  Vol.8, No.1, January 2017

ABSTRACT

The postulate of this paper is that current smartphones’ power saving modes can be improved towards saving more power and/or gaining more user satisfaction only if they start following “preventive” and/or user-customized power saving plans. We develop a number of preventive power saving modes that save battery power without needing to spend that same battery’s power on detecting abuse. The system supports the user with a preventive plan that gives him/her an idea of what to run or not run. Another issue with current power saving modes is the “One Size Fits All” philosophy, which does not take into consideration the factors that could distinguish different smartphone users, for example, the nature of the user’s workspace (indoor/outdoor), age, gender and/or the user’s application categories of interest. The paper develops a strategy to match a smartphone power saving mode with its ideal smartphone user by classifying smartphone users into classes depending on a set of factors and having the user identify himself/herself to the smartphone before first use.
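A toy reading of that matching strategy, mapping a user profile onto one of several hypothetical power-saving modes with simple rules, might look like the following; the factor names and mode names are illustrative assumptions only, not the paper's classes.

```python
def choose_power_mode(profile):
    """profile: dict with the classification factors named in the abstract."""
    if profile["workspace"] == "outdoor" and "navigation" in profile["app_interests"]:
        return "gps-throttled"          # cap location polling, keep brightness adaptive
    if profile["age"] >= 60:
        return "simplified-preventive"  # suggest avoiding known battery-heavy apps up front
    if "games" in profile["app_interests"]:
        return "performance-burst"      # allow short high-power bursts, aggressive idle sleep
    return "balanced-preventive"

print(choose_power_mode({"workspace": "outdoor", "age": 34,
                         "gender": "f", "app_interests": {"navigation", "social"}}))
```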


DESIGN, CONSTRUCTION AND EVALUATION OF A DIGITAL HAND-PUSHED PENETROMETER

Author :  Alireza Rezaeea
  

Volume, Issue, Month, Year :
 Vol 7, No.1, January 2017

ABSTRACT

A cone penetrometer is widely used in tillage and off-road mobility research as an indicator of soil strength and density characteristics. Light-weight, manually operated units are especially useful for recording cone index determinations at remote field locations. An electronic hand-pushed soil penetrometer with a microcontroller-based data logging system was designed and fabricated to provide a portable penetrometer for determining soil resistance to penetration in tillage studies. The device consists of three main components: a cantilever-beam strain-gauge load cell held by a housing to measure penetration force, a depth measurement mechanism with a photodiode sensor, and a data logging system for amplifying, digitizing, and acquiring data. Data from the data logging system can be downloaded to a personal computer via an RS232 cable and a software program. In the evaluation stage, the performance of the developed penetrometer was compared with a commercial Eijkelkamp hand-pushed digital penetrometer under controlled soil bin conditions. No significant difference was found (p<0.05) between the two penetrometers. The penetrometer’s performance was reliable, and its mechanical and electrical parts worked well without any malfunctions. The device is very light, easy to use and more economical compared to conventional types.
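The PC-side download over RS232 mentioned above could be reproduced with a short pySerial script such as the one below; the port name, baud rate and record format are assumptions, since the abstract does not specify the logger's protocol.

```python
import serial  # pySerial

# Assumed serial settings; the actual logger parameters are not given in the abstract.
with serial.Serial(port="COM3", baudrate=9600, timeout=2) as link:
    records = []
    while True:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:                                # timeout with no data: transfer finished
            break
        depth_mm, force_n = line.split(",")         # hypothetical "depth,force" record, e.g. "120,352"
        records.append((float(depth_mm), float(force_n)))

print(f"downloaded {len(records)} depth/force samples")
```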


SPEED AND TORQUE CONTROL OF AN INDUCTION MOTOR WITH ANN BASED DTC

Author :  Fatih Korkmaz
  

Volume, Issue, Month, Year :
 Vol.7, No.1, January 2017

ABSTRACT

Due to advantages such as fast dynamic response and a simple, robust control structure, direct torque control (DTC) is a commonly used method for high-performance control of induction motors. Despite these advantages, the method has some chronic disadvantages, such as high torque and current ripples, variable switching behaviour and control problems at low speeds. On the other hand, artificial neural network (ANN) based control algorithms have become increasingly popular in recent years due to their positive contribution to system performance. The purpose of this paper is to investigate the effects of an ANN-integrated DTC method on induction motor performance through numerical simulations. For this purpose, two different ANN models were designed, trained and implemented for the same DTC model. The first ANN model was designed to select the optimum inverter switching state, and the second model was designed for use in determining the flux vector position. A Matlab/Simulink model of the proposed ANN-based DTC method was created in order to compare the conventional and proposed DTC methods. The simulation studies showed that induction motor torque ripples are reduced remarkably with the proposed method and that this approach can be a good alternative to the conventional DTC method for induction motor control.


CALIBRATION OF INERTIAL SENSOR BY USING PARTICLE SWARM OPTIMIZATION AND HUMAN OPINION DYNAMICS ALGORITHM

Author :  Vikas Kumar Sinha , Avinash Kumar Maurya
  

Volume, Issue, Month, Year :
 Vol 7, No 1, January, 2017

ABSTRACT

An Inertial Navigation System (INS) can easily track the position, velocity and orientation of a moving vehicle. Generally, deterministic errors are present in an uncalibrated Inertial Measurement Unit (IMU), which hinders accurate estimation of the navigation solution. These inertial sensors thus need to be calibrated to reduce the errors inherent in these systems. A mathematical model of the IMU, including both the accelerometer and the gyroscope, is utilized for error calibration. Particle Swarm Optimization (PSO) and Human Opinion Dynamics (HOD) based calibration techniques have been used to obtain error parameters such as bias, scale factor and misalignment errors.
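A generic particle swarm optimization loop of the kind used for this sort of error-parameter estimation is sketched below; the cost function here is a stand-in (sphere function) rather than the IMU error model, and the swarm settings are arbitrary assumptions.

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-1.0, 1.0)):
    """Minimise cost(x) over R^dim with a basic global-best PSO."""
    lo, hi = bounds
    x = np.random.uniform(lo, hi, (n_particles, dim))    # positions (candidate error parameters)
    v = np.zeros_like(x)                                  # velocities
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()                  # global best position
    for _ in range(iters):
        r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Stand-in cost: in the calibration setting this would be the residual between measured
# and modelled accelerometer/gyroscope outputs (bias, scale factor, misalignment terms).
best, best_val = pso(lambda p: float(np.sum(p ** 2)), dim=9)
print(best_val)
```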

Friday, January 27, 2017

SERVICE LEVEL AGREEMENT BASED FAULT TOLERANT WORKLOAD SCHEDULING IN CLOUD COMPUTING ENVIRONMENT

Author :  Manpreet Singh Gill and Dr. R. K. Bawa
  

Volume, Issue, Month, Year :
 Vol 7 , No 3/4 , December, 2016

ABSTRACT

Cloud computing is a concept of providing user- and application-oriented services in a virtual environment. Users can use the various cloud services dynamically as per their requirements. Different users have different requirements in terms of application reliability, performance and fault tolerance. Static and rigid fault tolerance policies provide a fixed degree of fault tolerance as well as a fixed overhead. In this research work we have proposed a method to implement dynamic fault tolerance that considers customer requirements. The cloud users have been classified into subclasses as per their fault tolerance requirements. Their jobs have also been classified into compute-intensive and data-intensive categories. A varying degree of fault tolerance, consisting of replication and input buffering, is then applied. From the simulation-based experiments we have found that the proposed dynamic method performs better than the existing methods.
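One way to read the proposed idea, choosing a replication degree and buffering policy from the user's SLA class and job type, is sketched below; the class names and numbers are illustrative assumptions, not values from the paper.

```python
def fault_tolerance_plan(user_class, job_type):
    """Pick a hypothetical replication degree and input-buffer setting per SLA class."""
    replicas = {"gold": 3, "silver": 2, "bronze": 1}[user_class]
    # Assumption: data-intensive jobs benefit more from input buffering,
    # compute-intensive jobs from extra task replication.
    use_input_buffer = (job_type == "data-intensive")
    if job_type == "compute-intensive":
        replicas += 1
    return {"replicas": replicas, "input_buffer": use_input_buffer}

print(fault_tolerance_plan("silver", "compute-intensive"))
```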


A REVIEW ON ZIGBEE, GSM AND WSN BASED HOME SECURITY BY USING EMBEDDED CONTROLLED SENSOR NETWORK

Author :  Anjali Kulkarni and Amol Patange
  

Volume, Issue, Month, Year :
 Vol 6 , No 3/4 , December, 2016

ABSTRACT

An embedded controlled sensor network is the technology needed to implement environmental solutions effectively. The wireless sensor network is used to control the respective devices and monitor environmental parameters effectively. Wireless sensor network technology is used to monitor environmental parameters such as temperature, sound and pressure in many applications. Many researchers have developed wireless sensor networks to implement real-time surveillance for many applications. Existing wired systems are bulky, very expensive and difficult to maintain. The proposed system is user friendly, low cost and controlled by an embedded controlled sensor network. In the proposed system, an ARM-based microcontroller is used for monitoring, and wireless sensors are used to control the various devices.
 

HIGH AVAILABILITY AND LOAD BALANCING FOR POSTGRESQL DATABASES: DESIGNING AND IMPLEMENTING.

Author :  Pablo Bárbaro Martinez Pedroso
  

Volume, Issue, Month, Year :
 Vol 8  , No 6 , December, 2016

ABSTRACT

The aim of this paper is to design and implement an approach that provides high availability and load balancing for PostgreSQL databases. A shared-nothing architecture and a replicated centralized middleware architecture were selected in order to achieve data replication and load balancing among database servers. Pgpool-II was used to implement these architectures. In addition, taking advantage of pgpool-II as a framework, several Bash scripts were developed for the restoration of corrupted databases. To avoid a single point of failure, Linux-HA (Heartbeat and Pacemaker) was used to monitor all processes involved in the whole solution. As a result, applications can operate with a more reliable PostgreSQL database server; the approach is most suitable for applications with a heavier load of read statements (typically SELECT) than write statements (typically UPDATE). The approach presented is only intended for Linux-based operating systems.
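From an application's point of view, using such a setup simply means connecting to pgpool-II instead of a specific PostgreSQL node; the sketch below does this with psycopg2. The host, credentials, table and port are placeholders (9999 is pgpool-II's usual default, but it depends on the deployment).

```python
import psycopg2

# Connect to the pgpool-II middleware, which load-balances read statements
# across the replicated PostgreSQL backends (host/port/credentials are assumed).
conn = psycopg2.connect(host="pgpool.example.local", port=9999,
                        dbname="appdb", user="app", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM orders;")   # read: may be routed to any replica
    print(cur.fetchone())
conn.close()
```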

DYNAMIC CLASSIFICATION OF SENSITIVITY LEVELS OF DATAWAREHOUSE BASED ON USER PROFILES

Author :  Amina El ouazzani , Nouria Harbi and Hassan Badir
  

Volume, Issue, Month, Year :
 Vol 8  , No 6 , December, 2016

ABSTRACT

A data warehouse stores confidential data about the privacy of individuals and important business activities, which makes access to this source a risk for the disclosure of sensitive data. Hence the importance of implementing security measures which guarantee data confidentiality by establishing an access control policy. In this direction, several proposals have been made, but none is considered a standard for access management in data warehouses. In this article, we present our approach, which first exploits the permissions defined in the data sources to help the administrator define access permissions for the data warehouse; our system then automatically generates the sensitivity level of each data warehouse element according to the permissions granted to objects in the data warehouse.
 

INTEGRATED FRAMEWORK TO MODEL DATA WITH BUSINESS PROCESS AND BUSINESS RULES

Author : Rajeev Kaula
  

Volume, Issue, Month, Year :
 Vol 8  , No 6 , December, 2016

ABSTRACT

Data modeling is an approach to model data by mapping operational tasks iteratively, while associated guidelines are either partly mapped in the data model or expressed through software applications. Since an organization is a collection of business processes, it is essential that data models utilize such processes to facilitate data modeling. Also, data models should incorporate guidelines for completing operational tasks through the concept of business rules. This paper outlines a unified framework for database modeling and design based on business process concepts that also incorporates the business rules impacting business operations. The paper focuses on the relational database and its primary mode of conceptual modeling in the form of the entity-relationship model. Concepts are illustrated through Oracle's database language PL/SQL and its Web variant, PL/SQL Server Pages.

DESIGNING A WORKING MEMORY CAPACITY TEST FOR COGNITIVE-FRIENDLY TANGIBLE MULTIMEDIA

Author : Chau Kien Tsong , Zarina Samsudin and Wan Ahmad Jaafar Wan Yahaya
  

Volume, Issue, Month, Year :
 Vol 8  , No 6 , December, 2016

ABSTRACT

A working memory capacity (WMC) test called “objects-span tri-tasks” is designed for preschoolers undergoing treatment with a new genre of multimedia, tangible multimedia, created by the authors. It tests the dual functions of the preschoolers’ working memory (WM), namely storage and manipulation capacity, which are essential in supporting academic skills. The third task in the test is an overt task engaging the long-term memory that supports the operation of WM. Tangible multimedia potentially enhances the WMC of preschoolers to a considerable extent because, firstly, it uses tangible objects that are cognitively appropriate to the “preoperational” stage of preschoolers, and secondly, it simultaneously stimulates three main sensory channels, prescribed as equally crucial to knowledge acquisition in human memory theories. A pragmatic significance of the research is that it deepens the scope of multimedia research by looking into the aspect of cognitive structure, which is rarely examined in the multimedia realm. It also demonstrates an important step forward in multimedia research by relating WMC to the newly explored tangible multimedia, which could determine the real capability and value of such a system. This paper starts off by discussing the underlying theories that contribute to the formation of the system and the test, followed by the test procedure and a brief report of a case study.

AN EXPLORATION OF THE CONCEPT OF TRANSMEDIA STORYTELLING IN THE UNITED STATES AND SOUTH KOREA: A SYSTEMATIC ANALYSIS

Author : Young-Sung Kwon and Daniel H Byun
  

Volume, Issue, Month, Year :
 Vol 8  , No 6 , December, 2016

ABSTRACT

This research study sought to investigate how the concept of transmedia storytelling has been studied in the United States and South Korea. The general objective was to discover whether researchers in the United States and South Korea have studied the concept of transmedia storytelling differently and whether they have focused on different aspects of it. Qualitative methods were used, with a systematic review as the method of data collection and analysis. The results of the study reveal that studies conducted in the United States and South Korea on the concept of transmedia storytelling show few discernible differences; the only slight differences are with respect to the use of mobile technology and the scope and nature of the channels through which transmedia is experienced.

REDUCED COMPLEXITY QUASI-CYCLIC LDPC ENCODER FOR IEEE 802.11N

Author :  Monica Manka , Gajendra Asutkar , Pravin Dakhole
  

Volume, Issue, Month, Year :
 Vol 7 , No 5/6, December, 2016

ABSTRACT

In this paper, we present low-complexity quasi-cyclic low-density parity-check (QC-LDPC) encoder hardware based on the Richardson-Urbanke lower-triangular algorithm for the IEEE 802.11n wireless LAN standard, with block length 648 and code rate 1/2. The LDPC encoder hardware implementation works at 301.433 MHz and can achieve a throughput of 12.12 Gbps. We apply the concept of multiplication by constant matrices in GF(2), which also optimizes the required hardware. The proposed QC-LDPC encoder architecture is suitable for high-speed applications. This hardwired architecture is less complex because it avoids the conventionally used block memories and cyclic shifters.
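The "multiplication by constant matrices in GF(2)" mentioned above reduces, in software terms, to binary matrix-vector products modulo 2 (XOR in place of addition); a toy illustration, not the 802.11n parity-check matrices, is shown below.

```python
import numpy as np

def gf2_matmul(A, x):
    """Matrix-vector product over GF(2): addition becomes XOR, i.e. sum modulo 2."""
    return (A @ x) % 2

# Toy constant matrix and message bits (not the 802.11n code structure).
A = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0],
              [1, 1, 0, 1]], dtype=np.uint8)
msg = np.array([1, 0, 1, 1], dtype=np.uint8)
print(gf2_matmul(A, msg))   # parity-type bits produced by the constant-matrix multiply
```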


A 500nW LOW POWER SWITCHED CAPACITOR BASED ACTIVE LOW PASS FILTER FOR BIOMEDICAL APPLICATIONS

Author :  U. Gnaneshwara chary, L.Babitha and Vandana.Ch 

Volume, Issue, Month, Year :
 Vol 7 , No 5/6, December, 2016

ABSTRACT



VLSI IMPLEMENTATION OF AREA EFFICIENT 2-PARALLEL FIR DIGITAL FILTER

Author :  L kholee phimu and Manoj kumar  

Volume, Issue, Month, Year :
 Vol 7 , No 5/6, December, 2016

ABSTRACT

This paper aims to implement an area-efficient 2-parallel FIR digital filter. Xilinx 14.2 is used for synthesis and simulation, and the parallel filters are designed using VHDL. A comparison between the basic 2-parallel FIR digital filter and the area-efficient 2-parallel FIR digital filter has been carried out. Since adders occupy less silicon area than multipliers, multipliers are replaced with adders to reduce the area of the filter. The 2-parallel FIR filter is used in digital signal processing (DSP) applications.
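To make the 2-parallel idea concrete, the sketch below is a software model (not the VHDL design) of the standard fast-FIR 2-parallel structure, in which the cross terms come from one shared sub-filter so that a multiplier-heavy sub-filter is traded for pre/post adders; the coefficients and input are arbitrary test values.

```python
import numpy as np

def parallel2_fir(h, x):
    """2-parallel (fast FIR) filtering: three half-length sub-filters instead of four."""
    h = np.append(h, 0.0) if len(h) % 2 else np.asarray(h, float)
    x = np.append(x, 0.0) if len(x) % 2 else np.asarray(x, float)
    h0, h1 = h[0::2], h[1::2]            # polyphase components of the filter taps
    x0, x1 = x[0::2], x[1::2]            # even and odd input samples (two per clock)
    a = np.convolve(h0, x0)              # sub-filter H0 * X0
    b = np.convolve(h1, x1)              # sub-filter H1 * X1
    c = np.convolve(h0 + h1, x0 + x1)    # shared sub-filter (H0 + H1) * (X0 + X1)
    y_odd = c - a - b                    # H0*X1 + H1*X0 recovered with adders only
    y_even = np.append(a, 0.0)
    y_even[1:] += b                      # H0*X0 + z^-1 (H1*X1)
    y = np.empty(len(y_even) + len(y_odd))
    y[0::2], y[1::2] = y_even, y_odd     # interleave the two output phases
    return y

h = np.array([0.25, 0.5, 0.25, 0.1])
x = np.random.randn(16)
print(np.allclose(parallel2_fir(h, x)[:len(h) + len(x) - 1], np.convolve(h, x)))  # True
```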


SIMULTANEOUS OPTIMIZATION OF STANDBY AND ACTIVE ENERGY FOR SUB-THRESHOLD CIRCUITS

Author :  Ali T. Shaheen and Saleem M. R. Taha  

Volume, Issue, Month, Year :
 Vol 7 , No 5/6, December, 2016

ABSTRACT

Continued downscaling of CMOS circuits with respect to feature size and threshold voltage has resulted in a dramatic increase in leakage current, so leakage power reduction is an important design issue for both active and standby modes as technology scaling continues. In this paper, a simultaneous active and standby energy optimization methodology is proposed for 22 nm sub-threshold CMOS circuits. In the first phase, we investigate the dual threshold voltage design for minimizing active energy per cycle. A slack-based genetic algorithm is proposed to find the optimal reverse body bias assignment for the set of non-critical-path gates, ensuring low active energy per cycle at the maximum allowable frequency and the optimal supply voltage. The second phase determines the optimal reverse body bias that can be applied to all gates for standby power optimization at the optimal supply voltage found in the first phase. Thus there are two sets of gates and two reverse body bias values for each set, and the reverse body bias is switched between these two values in response to the mode of operation. Experimental results are obtained for some ISCAS-85 benchmark circuits such as 74L85, 74283, ALU74181, and a 16-bit RCA. The optimized circuits show significant energy savings (from 14.5% to 42.28%) and standby power savings (from 62.8% to 67%).


A MINIMUM RECONFIGURATION PROBABILITY ROUTING ALGORITHM FOR RWA IN ALL-OPTICAL NETWORKS

Author :  D.Ratna kishore , Dr. M. Chandra Mohan and Dr.Akepogu. Ananda Rao

Volume, Issue, Month, Year :
 Vol 7, No 6, December, 2016

ABSTRACT

In this paper, we present a detailed study of the Minimum Reconfiguration Probability Routing (MRPR) algorithm and evaluate its performance in comparison with the Adaptive Unconstrained Routing (AUR) and Least Loaded Routing (LLR) algorithms. We minimize the effects of link and router failures in the network under changing load conditions, and we assess the probability of service and the number of lightpath failures due to link or route failure on a Wavelength Interchange (WI) network. The computational complexity is reduced by using Kalman Filter (KF) techniques. The MRPR algorithm selects the most reliable routes and assigns wavelengths to connections in a manner that uses the established lightpaths (LP) efficiently, considering all possible requests.

VISION BASED HAND GESTURE RECOGNITION USING FOURIER DESCRIPTOR FOR INDIAN SIGN LANGUAGE

Author :  D.Ratna kishore , Dr. M. Chandra Mohan and Dr.Akepogu. Ananda Rao

Volume, Issue, Month, Year :
 Vol 7, No 6, December, 2016

ABSTRACT

Indian Sign Language (ISL) interpretation is a major ongoing research effort to aid deaf and mute people in India. Considering the limitations of glove/sensor-based approaches, a vision-based approach was adopted for the ISL interpretation system. Among the different human modalities, the hand is the primary modality for any sign language interpretation system, so hand gestures were used for recognition of manual alphabets and numbers. ISL consists of manual alphabets and numbers as well as a large vocabulary with grammar. In this paper, a methodology for recognition of static ISL manual alphabets, numbers and static symbols is given. The ISL alphabet consists of single-handed and two-handed signs. The Fourier descriptor was chosen as the feature extraction method due to its invariance to rotation, scale and translation. A true positive rate of 94.15% was achieved using a nearest neighbour classifier with Euclidean distance, where sample data with different illumination changes were considered.
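A common way to obtain translation-, scale- and rotation-invariant Fourier descriptors from a boundary is sketched below; the contour here is synthetic, standing in for a segmented hand boundary, and this only illustrates the feature-extraction idea rather than the authors' pipeline. The resulting descriptor vectors can then be compared with Euclidean distance in a nearest-neighbour classifier, as in the abstract.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_coeffs=16):
    """contour_xy: (N, 2) ordered boundary points of the hand region."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # complex boundary signal
    f = np.fft.fft(z)
    f[0] = 0.0                       # drop the DC term -> translation invariance
    f = f / np.abs(f[1])             # normalise by the first harmonic -> scale invariance
    mags = np.abs(f)                 # keep magnitudes only -> rotation/start-point invariance
    return mags[1:n_coeffs + 1]

# Synthetic closed contour (a circle) used purely as a placeholder boundary.
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
contour = np.stack([50 + 20 * np.cos(t), 60 + 20 * np.sin(t)], axis=1)
print(fourier_descriptors(contour)[:5])
```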

A Novel Probabilistic Based Image Segmentation Model for Realtime Human Activity Detection

Author :  D.Ratna kishore , Dr. M. Chandra Mohan and Dr.Akepogu. Ananda Rao

Volume, Issue, Month, Year :
 Vol 7, No 6, December, 2016

ABSTRACT


A FLUXGATE SENSOR APPLICATION: COIN IDENTIFICATION

Author :  Hakan Çıtak , Mustafa Çoramık , Yavuz Ege
Volume, Issue, Month, Year :
 Vol 7, No 6, December, 2016

ABSTRACT

Today, coins are used to operate many electric devices that are open to public service. Washing machines, play stations, computers, automatic brooms, foam machines, beverage machines, telephone chargers, hair dryers and water heaters are some examples of these devices. These devices include coin recognition systems. In these systems, there are coils at two different radii, which become electromagnets when current is passed through them. The AC current supplied to the coils creates a variable magnetic field, which induces an eddy current in the coin as it passes. The magnetic field generated by the eddy current reduces the current passing through the coil. The amount of change of the current in the coil gives information about the coin: the type of metal (element) and the amount of metal (element). In this study, a new coin identification system (magnetic measurement system) is designed. In this system, the magnetic anomaly generated by the coin as a result of ap

MODEL BASED TECHNIQUE FOR VEHICLE TRACKING IN TRAFFIC VIDEO USING SPATIAL LOCAL FEATURES

Author :   Arun Kumar H. D and Prabhakar C. J

Volume, Issue, Month, Year :  Vol 3, No 4, December, 2016

ABSTRACT

In this paper, we propose a novel method for visible vehicle tracking in traffic video sequences using a model-based strategy combined with spatial local features. Our tracking algorithm consists of two components: vehicle detection and vehicle tracking. In the detection step, we subtract the background and obtain candidate foreground objects represented as a foreground mask. After obtaining the foreground mask of candidate objects, vehicles are detected using the Co-HOG descriptor. In the tracking step, a vehicle model is constructed from shape and texture features extracted from vehicle regions using the Co-HOG and CSLBP methods. After constructing the vehicle model, vehicle features are extracted from each vehicle region in the current frame, and the vehicle model is then updated. Finally, vehicles are tracked based on the similarity measure between current-frame vehicles and the vehicle models. The proposed algorithm is evaluated based on precision, recall and VTA metrics obtained on the GRAM-RTM dataset.
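The detection step described above (background subtraction producing a foreground mask of candidate vehicles) can be prototyped with OpenCV roughly as below; the video path and thresholds are hypothetical, and the Co-HOG/CSLBP descriptor and model-matching stages are not reproduced here.

```python
import cv2

cap = cv2.VideoCapture("traffic.avi")   # hypothetical traffic video file
bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                         # foreground mask of candidate objects
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove small noise blobs
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 400]
    # Each candidate region would next be described (e.g. Co-HOG/CSLBP) and matched
    # against the maintained vehicle models for tracking.

cap.release()
```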




http://allconferencecfpalerts.com/cfp/view-paper.php?eno=545  

Wednesday, January 11, 2017

MULTIMODAL BIOMETRIC AUTHENTICATION: SECURED ENCRYPTION OF IRIS USING FINGERPRINT ID

Author :  Sheena S and Sheena Mathew

Volume, Issue, Month, Year : Vol. 6, No. 3/4, December 2016

ABSTRACT


Securing data storage using biometrics is a current trend. Different physiological as well as behavioral biometrics such as face, fingerprint, iris, gait and voice are used to provide security for data. The proposed work describes a biometric encryption technique that securely generates a digital key using two biometric modalities. In this work, the iris is encrypted using a 32-bit fingerprint ID as the key. The Blowfish algorithm is used for encryption; the encrypted template is stored in the database and a copy is given to the user. At authentication time, the user inputs the template and the fingerprint. The template is then decrypted and verified against the original template taken from the database to check whether the user is genuine or an imposter. The Hamming distance is used to measure the matching of the templates. The CASIA iris database is used for experimentation, and fingerprint images are read through the R303 fingerprint reader.
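The template-matching step relies on the Hamming distance between binary iris codes; a minimal numpy version is shown below, with the template size, the simulated noise and the decision threshold all being illustrative assumptions rather than values from the paper.

```python
import numpy as np

def hamming_distance(code_a, code_b):
    """Fraction of disagreeing bits between two binary iris templates."""
    code_a, code_b = np.asarray(code_a, bool), np.asarray(code_b, bool)
    return np.count_nonzero(code_a ^ code_b) / code_a.size

enrolled = np.random.randint(0, 2, 2048)            # stored (decrypted) iris template
probe = enrolled.copy()
flip = np.random.choice(2048, 100, replace=False)   # simulate acquisition noise
probe[flip] ^= 1

THRESHOLD = 0.32                                     # assumed genuine/imposter threshold
print("genuine" if hamming_distance(enrolled, probe) < THRESHOLD else "imposter")
```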



A NEW ERA OF CRYPTOGRAPHY: QUANTUM CRYPTOGRAPHY

Author :  Sandeepak Bhandari

Volume, Issue, Month, Year : Vol. 6, No. 3/4, December 2016

ABSTRACT


Security is the first priority in today's digital world for secure communication between sender and receiver. Various cryptography techniques have been developed over time for secure communication. Quantum cryptography is one of the latest and most advanced cryptography techniques; it is different from all other cryptography techniques and more secure. It is based on quantum physics, as its name suggests, which makes it more secure than all other cryptography techniques and unbreakable. In this paper, the working, limitations and advantages of quantum cryptography are discussed.



Tuesday, January 10, 2017

ALGEBRAIC DEGREE ESTIMATION OF BLOCK CIPHERS USING RANDOMIZED ALGORITHM; UPPER-BOUND INTEGRAL DISTINGUISHER.

Author :  Haruhisa Kosuge and Hidema Tanaka

Volume, Issue, Month, Year : Vol. 6, No. 3/4, December 2016


ABSTRACT


An integral attack is a powerful method for recovering the secret key of a block cipher by exploiting a characteristic that a set of outputs has after several rounds of encryption (an integral distinguisher). Recently, Todo proposed a new algorithm to construct integral distinguishers using the division property. However, the algorithm cannot rule out the existence of an integral distinguisher that holds for additional rounds. In contrast, we take an approach that obtains the number of rounds for which an integral distinguisher does not hold (an upper bound on the integral distinguisher). The approach is based on algebraic degree estimation: we execute a random search for a term whose degree equals the number of input variables. We propose an algorithm and apply it to PRESENT and RECTANGLE, and we confirm that there exists no 8-round integral distinguisher in PRESENT and no 9-round integral distinguisher in RECTANGLE. From these facts, integral attacks on more than 11 rounds of PRESENT and 13 rounds of RECTANGLE, respectively, are infeasible.



NEW ALGORITHM FOR WIRELESS NETWORK COMMUNICATION SECURITY

Author :  Sirwan Ahmed and Majeed Nader

Volume, Issue, Month, Year : Vol. 6, No. 3/4, December 2016


ABSTRACT


This paper evaluates the security of a wireless communication network based on fuzzy logic in Matlab. A new hybrid algorithm is proposed and evaluated. We highlight the valuable assets in the design of a wireless network communication system based on the network simulator NS2, which is crucial for protecting the security of such systems. Block cipher algorithms are evaluated using fuzzy logic, and a hybrid algorithm is proposed. Both algorithms are evaluated in terms of security level. The AND logic operator is used in the modelling rules, and the Mamdani style is used for the evaluations.



LOCALITY SIM: CLOUD SIMULATOR WITH DATA LOCALITY

Author :  Ahmed H. Abase, Mohamed H. Khafagy and Fatma A. Omara

Volume, Issue, Month, Year : Vol. 6, No. 6, December 2016


ABSTRACT


Cloud Computing (CC) is a model for enabling on-demand access to a shared pool of configurable computing resources. Testing and evaluating the performance of the cloud environment with respect to allocation, provisioning, scheduling, and data allocation policies has received great attention. Using a cloud simulator saves time and money and provides a flexible environment for evaluating new research work. Unfortunately, current simulators (e.g., CloudSim, NetworkCloudSim, GreenCloud) deal with data in terms of size only, without any consideration of the data allocation policy or data locality. On the other hand, the NetworkCloudSim simulator is considered one of the most commonly used simulators because it includes different modules that support the functions needed in a simulated cloud environment, and it can be extended with new modules. In the work of this paper, the NetworkCloudSim simulator has been extended and modified to support data locality. The modified simulator is called LocalitySim. The accuracy of the proposed LocalitySim simulator has been demonstrated by building a mathematical model. Also, the proposed simulator has been used to test the performance of a three-tier data center as a case study, considering the data locality feature.