IJITCS Vol. 4, No. 4, Apr. 2012
The volume and quality of broadband services in a country are understood not only as a major parameter of that country's economic growth, but also as an indicator of how ready the country is for economic growth in the near future. With this understanding, this paper briefly outlines the main reasons why the expansion of broadband services in Slovakia is an urgent issue.[...]
Software design is one of the key activities in the system development life cycle (SDLC) that ensures the quality of software, and several key areas of design must be taken into consideration while designing software. Software design describes how the software system is decomposed into, and managed as, smaller components. The object-oriented (OO) paradigm has provided the software industry with more reliable and manageable software and designs. The quality of a software design can be measured through different metric suites, such as the Chidamber and Kemerer (CK) design metrics, the MOOD metrics, and the Lorenz and Kidd metrics. The CK suite is one of the oldest and most reliable metric suites available to the software industry for evaluating OO design. This paper presents an evaluation of the CK metrics and proposes improved CK design metric values to reduce the number of defects introduced during the software design phase. The paper also examines whether any CK design metric has a significant effect on the total number of defects per module. This is achieved by conducting a survey in two software development companies.[...]
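As a rough illustration of what the CK suite measures (a minimal Python sketch using introspection on hypothetical classes, not the instrument used in the paper), three of the CK metrics can be computed directly from a class hierarchy:

```python
# Minimal sketch of three CK metrics for plain Python classes:
# WMC (Weighted Methods per Class, with unit weights), DIT (Depth of
# Inheritance Tree) and NOC (Number of Children). The Shape/Circle
# classes are hypothetical examples, not from the paper's survey.
import inspect

def wmc(cls):
    """Weighted Methods per Class: each method weighted 1, inherited excluded."""
    return len([name for name, _ in inspect.getmembers(cls, inspect.isfunction)
                if name in cls.__dict__])

def dit(cls):
    """Depth of Inheritance Tree: distance from cls down to object."""
    return len(cls.__mro__) - 1  # object itself sits at depth 0

def noc(cls):
    """Number of Children: direct subclasses only."""
    return len(cls.__subclasses__())

class Shape:
    def area(self): ...
    def perimeter(self): ...

class Circle(Shape):
    def area(self): ...

print(wmc(Shape), dit(Circle), noc(Shape))  # 2 2 1
```

High values of these metrics (deep inheritance trees, many methods per class) are the kind of design signal the paper correlates with defect counts per module.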
The introduction of e-service solutions within the public sector has primarily been concerned with moving away from traditional information monopolies and hierarchies. E-services aim to increase the convenience and accessibility of government services and information for citizens. Providing services to the public through the Web may lead to faster and more convenient access to government services with fewer errors. It also means that governmental units may realize increased efficiency, cost reductions, and potentially better customer service. The main objectives of this work are to identify the success criteria of e-service delivery and to propose a comprehensive, multidimensional framework of e-service success. To examine the validity of the proposed framework, a sample of 200 e-service users was asked to assess their perspectives on e-service delivery in several Egyptian organizations. The results show that the proposed framework is applicable and implementable in e-service evaluation; they also show that the framework may assist decision makers and e-service system designers in considering different criteria and measures before committing to a particular choice of e-service, or in evaluating an existing e-service system.[...]
The stock market is a dynamic and unpredictable environment. Traditional techniques such as fundamental and technical analysis can provide investors with tools for managing their stocks and predicting their prices; however, these techniques cannot discover all the possible relations between stocks, so a different approach is needed that provides a deeper kind of analysis. Data mining can be used extensively in financial markets to help with stock-price forecasting. We therefore propose in this paper a portfolio management solution with business intelligence characteristics. Temporal high utility itemsets are the itemsets whose utility exceeds a pre-specified threshold in the current time window of the data stream; discovering them is an important step in mining interesting patterns, such as association rules, from data streams. We propose a novel algorithm for temporal association mining with a utility approach, which finds the temporal high utility itemsets while generating fewer candidate itemsets.[...]
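A minimal sketch of the underlying notion (toy data and a brute-force enumeration, not the paper's candidate-pruning algorithm): an itemset's utility in the current time window is the sum of quantity times external profit over the windowed transactions that contain it, and it is "high utility" when that sum meets a threshold.

```python
# Toy illustration of temporal high utility itemsets: the profit table,
# transactions and thresholds are assumed example values, not the paper's.
from itertools import combinations

profit = {'A': 3, 'B': 1, 'C': 5}           # external utility per item

# each stream element: (timestamp, {item: quantity})
stream = [
    (1, {'A': 2, 'B': 4}),
    (2, {'B': 1, 'C': 1}),
    (3, {'A': 1, 'C': 2}),
]

def high_utility_itemsets(stream, window, now, min_util):
    # keep only transactions in the current time window (now-window, now]
    recent = [t for ts, t in stream if now - window < ts <= now]
    items = sorted({i for t in recent for i in t})
    result = {}
    for k in range(1, len(items) + 1):          # brute force, for clarity
        for iset in combinations(items, k):
            u = sum(t[i] * profit[i]
                    for t in recent if all(j in t for j in iset)
                    for i in iset)
            if u >= min_util:
                result[iset] = u
    return result

print(high_utility_itemsets(stream, window=2, now=3, min_util=10))
```

The brute-force enumeration above is exponential in the number of items; the point of the proposed algorithm is precisely to reach the same itemsets while generating far fewer candidates.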
This paper presents the latency and potential of a central nervous system based intelligent computing system for detecting the shelf life of soft mouth melting milk cakes stored at 10 °C. Soft mouth melting milk cakes are an exquisite sweetmeat cuisine made from heat- and acid-thickened solidified sweetened milk. In today's highly competitive market, consumers look for good quality food products, and shelf life is an accurate indicator of food quality and safety; detecting shelf life is therefore important for achieving good product quality. The central nervous system based intelligent computing model developed here predicted a shelf life of 19.82 days, as against the experimentally determined shelf life of 21 days.[...]
In this research paper we examine an RFID based toll deduction system and how to make it more efficient and reliable. Each vehicle is equipped with a radio frequency (RF) tag, which is detected by an RF reader located at the toll plaza. The toll amount is then automatically deducted from the vehicle owner's bank account. The proposed system is scalable for implementation in the motor vehicles used today.[...]
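The deduction flow can be sketched as follows (a minimal illustration with hypothetical tag IDs, balances, and toll amount; not the authors' implementation):

```python
# Toy sketch of the toll deduction flow: reader detects a tag, the system
# looks up the linked account and deducts the toll automatically.
# Tag IDs, balances and the toll amount are assumed example values.
TOLL = 50

accounts = {'TAG-001': 500, 'TAG-002': 30}   # tag id -> account balance

def on_tag_detected(tag_id):
    balance = accounts.get(tag_id)
    if balance is None:
        return 'unknown tag - divert to manual lane'
    if balance < TOLL:
        return 'insufficient funds - divert to cash lane'
    accounts[tag_id] = balance - TOLL
    return f'deducted {TOLL}, new balance {accounts[tag_id]}'

print(on_tag_detected('TAG-001'))  # deducted 50, new balance 450
```

A deployed system would additionally need authentication between tag and reader and a transaction log, but the control flow above is the essence of automatic deduction.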
This paper presents a centrality measurement and analysis of social networks for tracking online communities. Tracking a single community in a social network is commonly done using centrality measures: their ability to determine the relative position of a node within a network has been used in previous research to track communities using the betweenness, closeness, and degree centrality measures. The paper introduces a new metric, K-path centrality, together with a randomized algorithm for estimating it, and shows empirically that nodes with high K-path centrality also have high betweenness centrality.[...]
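A minimal sketch of the randomized estimation idea (a simplified form for illustration, not the paper's exact algorithm or guarantees): simulate many random simple paths of length at most K and count how often each node is traversed; heavily traversed nodes score high, mirroring their role as intermediaries.

```python
# Simplified randomized K-path centrality estimate: repeatedly start a
# random simple path of length <= k and credit every node it passes
# through. The star graph below is an assumed toy example.
import random

def k_path_centrality(adj, k=3, trials=2000, seed=0):
    rng = random.Random(seed)           # fixed seed for reproducibility
    nodes = list(adj)
    score = {v: 0 for v in nodes}
    for _ in range(trials):
        v = rng.choice(nodes)
        visited = {v}                   # simple path: no repeated nodes
        for _ in range(k):
            nxt = [u for u in adj[v] if u not in visited]
            if not nxt:
                break
            v = rng.choice(nxt)
            visited.add(v)
            score[v] += 1
    return {v: c / (k * trials) for v, c in score.items()}

# star graph: node 0 is the hub, so it should dominate the ranking,
# just as it would under betweenness centrality
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
scores = k_path_centrality(star)
print(max(scores, key=scores.get))  # 0 -- the hub
```

On this star graph the hub lies on every path between leaves, which is why both K-path and betweenness centrality single it out; that correspondence is what the paper demonstrates empirically at scale.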
Many strategies have been proposed in the literature to hide information containing sensitive items: some use distributed databases over several sites, some use data perturbation, some use clustering, and some use data distortion. The present paper focuses on the data distortion technique. Algorithms based on this technique either hide a specific rule by altering the data or hide rules depending on the sensitivity of the items to be hidden. The proposed approach is based on data distortion in which the position of a sensitive item is altered while its support is never changed; it uses the idea of representative rules to prune the rules first and then hides the sensitive rules. Experimental results show that the proposed approach hides more rules in fewer database scans than existing algorithms based on the same data distortion technique.[...]
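A minimal sketch of the position-altering idea (toy transactions and a deliberate simplification without the representative-rule pruning, not the authors' algorithm): to hide a sensitive rule A → B, move the item B out of a transaction that supports {A, B} into a transaction containing neither A nor B, so B's overall support stays constant while the rule's confidence drops below the mining threshold.

```python
# Toy illustration of hiding a sensitive association rule by distortion:
# the item's *position* changes (which transactions hold it) but its
# support does not. Transactions and thresholds are assumed examples.
def confidence(db, lhs, rhs):
    both = sum(1 for t in db if lhs in t and rhs in t)
    lhs_count = sum(1 for t in db if lhs in t)
    return both / lhs_count if lhs_count else 0.0

def hide_rule(db, lhs, rhs, min_conf):
    while confidence(db, lhs, rhs) >= min_conf:
        src = next(t for t in db if lhs in t and rhs in t)
        # move rhs into a transaction holding neither item, if one exists
        dst = next((t for t in db if lhs not in t and rhs not in t), None)
        if dst is None:
            break                       # nowhere safe to relocate the item
        src.remove(rhs)
        dst.add(rhs)
    return db

db = [{'A', 'B'}, {'A', 'B'}, {'A'}, {'C'}]
hide_rule(db, 'A', 'B', min_conf=0.5)
print(round(confidence(db, 'A', 'B'), 2))  # 0.33 -- rule hidden
print(sum(1 for t in db if 'B' in t))      # 2 -- B's support unchanged
```

Each move lowers the rule's confidence by exactly one co-occurrence while leaving the antecedent's count and the moved item's support intact, which is the invariant the abstract describes.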