IJITCS Vol. 14, No. 3, Jun. 2022
Nowadays, computer software plays a significant role in all fields of our life. Open-source software in particular offers economic benefits to software companies, since it allows new software to be built without creating it from scratch. It is therefore widely used, and consequently the quality of open-source software is a critical issue and one of the top research directions in the literature. During the software development cycle, software reliability is an important indicator in deciding whether to release the software. Deterministic and probabilistic models are the two main categories of models used to assess software reliability. In this paper, we perform a comparative study of eight software reliability models: two deterministic models and six probabilistic models based on three different methodologies: perfect debugging, imperfect debugging, and the Gompertz distribution. We evaluate the employed models on three versions of a standard open-source dataset, the GNU's Not Unix Network Object Model Environment (GNOME) projects, using four evaluation criteria: sum of square error, mean square error, R-square, and reliability. The experimental results show that for the first version of the open-source dataset, SRGM-4, based on the imperfect debugging methodology, achieved the best reliability result, while for the last two versions, SRGM-6, based on the Gompertz distribution methodology, achieved the best results in terms of sum of square error, mean square error, and R-square.
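The goodness-of-fit criteria named in this abstract (sum of square error, mean square error, and R-square) are standard measures; a minimal sketch of how they are computed, using hypothetical observed and predicted cumulative fault counts (not the paper's data), could look like:

```python
def sse(observed, predicted):
    # sum of square error between observed and model-predicted fault counts
    return sum((o - p) ** 2 for o, p in zip(observed, predicted))

def mse(observed, predicted):
    # mean square error
    return sse(observed, predicted) / len(observed)

def r_square(observed, predicted):
    # coefficient of determination: 1 - SSE / total sum of squares
    mean_o = sum(observed) / len(observed)
    ss_tot = sum((o - mean_o) ** 2 for o in observed)
    return 1 - sse(observed, predicted) / ss_tot

def gompertz(t, a, b, c):
    # standard Gompertz growth curve m(t) = a * b**(c**t), with 0 < b, c < 1;
    # the paper's SRGM-6 is Gompertz-based, but its exact form is not given here
    return a * b ** (c ** t)

# hypothetical cumulative fault counts over five test intervals
observed = [5, 12, 20, 26, 30]
predicted = [6, 11, 19, 27, 31]
print(sse(observed, predicted))                 # 5
print(mse(observed, predicted))                 # 1.0
print(round(r_square(observed, predicted), 3))  # 0.988
```

A fitted model with lower SSE/MSE and R-square closer to 1 tracks the observed fault data more closely, which is how the eight models are ranked in the paper.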
The growth of text repositories such as biomedical, defect, or bug databases makes it difficult for web users to identify sequential key phrases and their categories in text clustering applications. In traditional document classification and clustering models, the features associated with TREC texts are complex to analyze. Finding relevant feature-based key-phrase patterns in a large collection of unstructured documents becomes increasingly difficult as the repository's size grows. The purpose of this study is to develop and implement a new hierarchical document clustering framework on a large TREC data repository. A document feature selection and clustering model is used to identify and extract MeSH-related documents from TREC biomedical clinical benchmark datasets. The efficiency of the proposed model is demonstrated experimentally in terms of computational memory, accuracy, and error rate.
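The paper's own framework is not spelled out in this abstract; as a point of reference, a minimal sketch of feature-based hierarchical (single-link agglomerative) document clustering, with illustrative documents and a hypothetical similarity threshold, could look like:

```python
import math
from collections import Counter

def tf_vector(doc):
    # term-frequency feature vector for one document
    return Counter(doc.lower().split())

def cosine(a, b):
    # cosine similarity between two term-frequency vectors
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def single_link_clusters(docs, threshold=0.3):
    # merge clusters whenever any cross-cluster document pair
    # exceeds the similarity threshold (single linkage)
    vecs = [tf_vector(d) for d in docs]
    clusters = [{i} for i in range(len(docs))]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if max(cosine(vecs[a], vecs[b])
                       for a in clusters[i] for b in clusters[j]) >= threshold:
                    clusters[i] |= clusters.pop(j)
                    merged = True
                    break
            if merged:
                break
    return clusters

docs = ["gene expression in clinical trials",
        "clinical gene therapy trials",
        "software defect reports and bug tracking",
        "bug database defect triage"]
print(single_link_clusters(docs))  # [{0, 1}, {2, 3}]
```

The two biomedical documents and the two defect-tracking documents end up in separate clusters, mirroring the category separation the framework aims at on the much larger TREC collections.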
An Intelligent Transport System (ITS) is a transport system that uses communication technologies such as cellular networks, digital video broadcasting, and ad-hoc wireless communication to link people on the road and vehicles, with the aim of solving various traffic-related issues. Vehicle-to-infrastructure (V2I) communication is an important research area for developing cooperative self-driving support systems using DSRC technology. V2I creates an environmentally friendly system that also improves fuel efficiency by establishing high-quality links between vehicles and roadside infrastructure. It helps prevent drivers from overlooking or missing red lights at junctions: V2I units along the roadside and at intersections continuously transmit traffic-signal information to vehicles, warning drivers about red lights and thus helping to prevent road-rule violations. ITS also helps prevent drivers' oversight of signals, supports right/left-turn collision avoidance, and enables timely activation of the brake system. In the proposed work, we use a three-layer V2I network architecture to collect and disseminate safety information using static and dynamic agents. These methods help quickly select high-quality, error-free links for forwarding data packets. In a highway road scenario with moderate traffic density, the proposed system gives improved performance in terms of coverage area, lossless transmission, and reduced latency. Finally, a qualitative comparison with the present V2I system shows significant improvement in its performance metrics: the proposed system improves on the existing system by 23%, 13%, and 15% in terms of end-to-end delay, communication overhead, and energy consumption, respectively.
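The abstract does not describe how its agents pick links; as an illustration only, the core idea of forwarding packets over the highest-quality vehicle-to-roadside link could be sketched as follows (RSU names and error rates are hypothetical):

```python
def select_link(links):
    # links: list of (rsu_id, packet_error_rate) tuples for candidate
    # roadside units; forward over the link with the lowest error rate
    return min(links, key=lambda link: link[1])[0]

links = [("RSU-1", 0.12), ("RSU-2", 0.03), ("RSU-3", 0.07)]
print(select_link(links))  # RSU-2
```

In practice the selection metric would combine signal quality, load, and mobility, but the ranking step itself is this simple.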
Much like other computing domains, cloud computing is not inherently secure; its security needs the same attention as any other aspect of cloud processing. The cyber world is shifting toward ontological techniques and Web 3.0, or the semantic web, for security. Cloud host servers demand ever more attention to security as the number of resources and accesses to them increases, so security measures in the cloud have to be more extensive. The semantic web is a new revolution in web science that works on a base of ontologies, and ontologies are receiving great attention in the computing domain and hence in the security domain. This review paper examines different proposed ontology-centered techniques and provides a comprehensive analysis of these tactics. It gives a critical analysis of models presented by different authors and researchers for ensuring the security of a cloud-based environment. This analysis helps cloud-technology vendors adopt one or more of these models for practical implementation in their cloud machines, whether they offer IaaS, PaaS, or SaaS. New security models using ontologies can also be proposed based on this study, as the paper gives a comprehensive comparison of previously proposed ontologies for monitoring the security state of a cloud environment as safe or malicious.