Work place: Shaheed Zulfikar Ali Bhutto Institute of Science and Technology, Islamabad, Pakistan
Research Interests: Software Engineering, Computational Engineering
M.N.A. Khan obtained a D.Phil. degree from the University of Sussex, Brighton. His research interests lie in the fields of software engineering, cyber administration, digital forensic analysis and machine learning techniques.
DOI: https://doi.org/10.5815/ijeme.2017.02.04, Pub. Date: 8 Mar. 2017
A Customer Relationship Management (CRM) system is used to manage a company's relations with its existing and prospective customers. Data mining is used in organizations for decision making and for forecasting prospective customers. We have studied the recent literature on the use of data mining techniques for CRM. Based on this review of the contemporary literature, we analyzed the different data mining techniques employed in different types of businesses, corporate sectors and organizations. We compiled a critical review table that states, for each technique reviewed in this study, the problem addressed, the proposed technique, its significance and limitations, and suggested possible improvements. This paper thus provides a critical review of the data mining techniques currently being used for CRM.
DOI: https://doi.org/10.5815/ijmecs.2016.07.06, Pub. Date: 8 Jul. 2016
Testing is a crucial step in designing and implementing software in a distributed environment. Testing distributed applications is not only difficult but also costly. This paper briefly discusses performance testing in the distributed software environment, along with other testing techniques proposed in the literature for distributed applications. Additionally, we discuss the key testing challenges faced throughout the process of testing distributed applications. Much of the focus of this paper is on intelligent agent-based testing. Agent-based testing provides a better coordination mechanism between multiple testers and exerts more controllability and observability over fault detection. In this study we have critically analyzed the testing methodologies being practiced in the distributed environment. We have studied the merits and limitations of the methodologies proposed in the contemporary literature and have identified possible improvements to make them more robust.
DOI: https://doi.org/10.5815/ijmecs.2016.06.07, Pub. Date: 8 Jun. 2016
The field of digital forensic analysis has emerged over the past two decades to counter digital crimes, investigate the modus operandi of culprits and secure computer systems. With advances in technology and the pervasive nature of computing devices, digital forensic analysis is becoming a challenging task. The easy availability of digital equipment and the popularity of the Internet have enticed criminals to carry out digital crimes. Digital forensics aims to investigate criminal activity and bring the culprits to justice. Traditionally, static analysis is used to investigate an incident, but because of the many issues concerning the accuracy and authenticity of static analysis, live digital forensic analysis of a memory dump gives an investigator a more complete picture of the system. In this paper, we introduce a module for profiling the behavior of application programs. Profiling an application is helpful in forensic analysis, as it allows a compromised system to be analyzed easily; it also helps the investigator conduct malware analysis and debug a system. The concept of our model is to trace the unique process name, loaded services and called modules of the target system and store them in a database for future forensic and malware analysis. We used VMware Workstation version 9.0 on the Windows 7 platform to obtain a detailed and clean image of the current state of the system. The profile of the target application includes the process name, modules and services that are specific to that application program.
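The core idea of the profiling module, recording each process's name, modules and services in a database for later analysis, can be sketched roughly as follows. The snapshot format, table layout and example process names are illustrative assumptions, not the paper's actual implementation:

```python
import sqlite3

def store_profile(db, snapshots):
    """Persist application snapshots (process, module, service) for later forensic analysis."""
    db.execute("CREATE TABLE IF NOT EXISTS profile (process TEXT, module TEXT, service TEXT)")
    db.executemany("INSERT INTO profile VALUES (?, ?, ?)",
                   [(s["process"], s["module"], s["service"]) for s in snapshots])
    db.commit()

def modules_of(db, process):
    """Query the stored profile for all modules loaded by a given process."""
    rows = db.execute("SELECT DISTINCT module FROM profile WHERE process = ?", (process,))
    return sorted(r[0] for r in rows)

# Example: an assumed snapshot of a target application on the imaged system
db = sqlite3.connect(":memory:")
store_profile(db, [
    {"process": "notepad.exe", "module": "kernel32.dll", "service": "Themes"},
    {"process": "notepad.exe", "module": "user32.dll", "service": "Themes"},
])
print(modules_of(db, "notepad.exe"))  # ['kernel32.dll', 'user32.dll']
```

In a live-analysis setting the snapshots would come from the memory image of the virtual machine rather than from a hand-written list.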
DOI: https://doi.org/10.5815/ijem.2016.01.02, Pub. Date: 8 Jan. 2016
Product maintenance techniques are significant because it is far more cost-effective and less time-consuming to maintain a product or software system than to replace it. Several product maintenance and support techniques exist, but they do not resolve users' and clients' bugs, issues and enhancement requests effectively and efficiently. Scrum is now widely used as a quick, flexible and holistic methodology for developing software. Scrum projects involve a high degree of customer involvement, which helps produce a user-oriented product, and users can change their requirements during the project. Many techniques have been proposed for product maintenance and support. In this paper, we present a detailed literature review of existing product maintenance techniques and propose a new model for product maintenance that uses the Scrum methodology. This Scrum-based maintenance model is designed around an analysis of client request types and their severity (priority). In our approach, the session attendees (Scrum Master, Product Owner and Team) first select and resolve the bug, issue or enhancement with an urgent type or higher-priority request, and then move on to lower-priority or non-urgent requests, thereby meeting customers' demands and facilitating clients in a timely manner. A comprehensive study of product maintenance and support has been carried out, which adds to current practices in Scrum. We found that the maintenance phase of Scrum has received little attention in the existing literature. In view of this, we propose a novel model that focuses on the maintenance phase of Scrum.
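The selection rule above, always resolving the most urgent client request first, is essentially a priority queue. A minimal sketch, with made-up request descriptions and a lower-number-means-more-urgent convention assumed for illustration:

```python
import heapq

def next_request(backlog):
    """Pop the most urgent maintenance request from the backlog.
    Entries are (priority, arrival order, description); heapq keeps the
    lowest priority number (most urgent request) at the front."""
    return heapq.heappop(backlog)[2]

backlog = []
for order, (prio, desc) in enumerate([(3, "cosmetic enhancement"),
                                      (1, "login crash (urgent bug)"),
                                      (2, "slow report generation")]):
    heapq.heappush(backlog, (prio, order, desc))

print(next_request(backlog))  # login crash (urgent bug)
```

The arrival-order field breaks ties so that equally urgent requests are handled first come, first served.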
DOI: https://doi.org/10.5815/ijmecs.2016.01.04, Pub. Date: 8 Jan. 2016
Image segmentation plays a significant role in the field of image processing because of its wide range of applications in agriculture, where it is used to identify plant diseases by classifying the different diseases. Classification is a technique for categorizing plant diseases based on different morphological characteristics; classifiers used for this purpose include Support Vector Machines (SVM), k-nearest neighbor classifiers, artificial neural networks and fuzzy logic. This paper surveys the image processing techniques that different authors have used for the early detection of plant diseases. The main focus of our work is a critical analysis of different plant disease segmentation techniques. The strengths and limitations of these techniques are discussed in a comparative evaluation of current classification techniques. The study also identifies several areas of future research in the domain of plant disease segmentation. In future work, we aim to identify the best classification techniques and fuse them to overcome the flaws of the individual techniques.
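Of the classifiers the survey names, the k-nearest neighbor classifier is the simplest to illustrate: a sample gets the majority label of its k closest training samples. The two-dimensional features below (e.g. lesion area and mean intensity) and the labels are purely illustrative assumptions:

```python
import math

def knn_classify(train, sample, k=1):
    """Classify a feature vector by majority vote among its k nearest training samples."""
    neighbors = sorted(train, key=lambda t: math.dist(t[0], sample))[:k]
    labels = [lbl for _, lbl in neighbors]
    return max(set(labels), key=labels.count)

# Assumed toy features: (lesion area fraction, mean leaf intensity) per sample
train = [((0.1, 0.9), "healthy"), ((0.2, 0.8), "healthy"),
         ((0.8, 0.2), "blight"), ((0.9, 0.1), "blight")]
print(knn_classify(train, (0.85, 0.15), k=3))  # blight
```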
DOI: https://doi.org/10.5815/ijmecs.2015.08.05, Pub. Date: 8 Aug. 2015
Cloud computing has come to play a more important role than many other fields of IT in providing data storage, data security, quality of service (QoS) and so on. In the last few years it has emerged and evolved quickly owing to the many facilities and advantages it offers organizations and end users. This fast evolution of the cloud in the IT industry has also multiplied data security concerns. Several security models and trust-establishing techniques have therefore been deployed and are in use to provide more security to data, especially sensitive data. Despite this, many of these models and techniques lack one or more security threat measures. In this paper a new model is designed and proposed that introduces a Security Aware Cloud. First, the trust of the user or organization is established on the cloud; then security is granted to the data through a privacy and encryption module. The required levels of quality of service and security are achieved under the Contract Trust layer, while authentication and key management are covered under the Internal Trust layer. For critical data, privacy and encryption are provided through a homomorphic encryption mechanism. By using the proposed trust and security model, the return on investment in the cloud can be enhanced for the data security and services it provides.
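The defining property of a homomorphic scheme is that computation can happen on ciphertexts. A toy demonstration: unpadded ("textbook") RSA is multiplicatively homomorphic, so the product of two ciphertexts decrypts to the product of the plaintexts. The tiny key below is insecure and for illustration only; the paper's model would rely on a production-grade scheme:

```python
# Classic textbook RSA parameters (p=61, q=53); never use keys this small in practice.
p, q = 61, 53
n, e, d = p * q, 17, 2753

enc = lambda m: pow(m, e, n)   # encrypt: m^e mod n
dec = lambda c: pow(c, d, n)   # decrypt: c^d mod n

a, b = 7, 9
combined = (enc(a) * enc(b)) % n   # multiply ciphertexts without ever decrypting
print(dec(combined))               # 63, i.e. a * b
```

A cloud node could thus combine encrypted values on the user's behalf while never seeing the plaintexts.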
DOI: https://doi.org/10.5815/ijmecs.2015.04.08, Pub. Date: 8 Apr. 2015
Biometrics is now commonly used for the identification and verification of humans all over the world. Biometrics relies on unique human characteristics such as the palm, fingerprints and iris. Pattern recognition and image processing are the major areas in which research on signature verification is carried out. The handwritten signature of an individual is also unique, and it is used and accepted for the identification of humans, especially in banking and other financial transactions. Because of their importance, handwritten signatures are a target of fraud. In this paper we survey different papers on techniques currently used for the identification and verification of offline signatures.
DOI: https://doi.org/10.5815/ijmecs.2015.01.06, Pub. Date: 8 Jan. 2015
Dealing with data means grouping information into a set of categories, either to learn new artifacts or to understand new domains. For this purpose researchers have always looked for hidden patterns in data that can be defined and compared with other known notions based on the similarity or dissimilarity of their attributes, according to well-defined rules. Data mining, with its tools of data classification and data clustering, is one of the most powerful techniques for handling data in such a way that it helps researchers identify the required information. As a step toward addressing this challenge, experts have used clustering techniques as a means of exploring hidden structure and patterns in the underlying data. Through the improved stability, robustness and accuracy it brings to unsupervised data classification in many fields, including pattern recognition, machine learning, information retrieval, image analysis and bioinformatics, clustering has proven itself a reliable tool. To identify the clusters in a dataset, algorithms are used to partition it into several groups based on the similarity within each group. There is no single clustering algorithm; rather, various algorithms are used depending on the domain of the data that constitutes a cluster and the level of efficiency required. Clustering techniques are categorized according to different approaches. This paper surveys five of the most common clustering techniques in data mining: K-medoids, K-means, Fuzzy C-means, Density-Based Spatial Clustering of Applications with Noise (DBSCAN) and Self-Organizing Map (SOM) clustering.
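Of the five techniques surveyed, K-means is the most compact to sketch: repeatedly assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. The toy dataset and fixed initial centroids below are assumptions chosen to make the two clusters obvious:

```python
def kmeans(points, centroids, iters=10):
    """Minimal K-means: alternate nearest-centroid assignment and centroid update."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[i].append(p)
        # Move each centroid to the mean of its cluster (keep it if the cluster is empty)
        centroids = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

points = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centroids, clusters = kmeans(points, [(0, 0), (10, 10)])
print(sorted(len(c) for c in clusters))  # [3, 3] — two clusters of three points
```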
DOI: https://doi.org/10.5815/ijmecs.2014.12.07, Pub. Date: 8 Dec. 2014
Social media networks have evolved rapidly, and people frequently use these services to communicate with others and express themselves by sharing opinions, views and ideas on different topics. Social media trend analysis is generally carried out by sifting through the corresponding or interlinked events discussed on social media websites such as Twitter and Facebook. The fundamental objective of such analyses is to determine the level of criticism or appreciation expressed in comments, tweets or blogs. Trend analysis techniques can also be systematically exploited for opinion making among the masses at large. The results of such analyses show how people think, assess, orate and opine about different issues. This paper focuses primarily on trend detection and sentiment analysis techniques and their efficacy with contextual information. We further discuss the techniques used to analyze the sentiments expressed within a particular sentence, paragraph or document. Sentiment-based analysis can pave the way for automatic trend analysis, topic recognition and opinion mining. Furthermore, the degree of positivity or negativity of opinions and sentiments can be fairly estimated from the content obtained from a particular social medium.
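Estimating the degree of positivity or negativity of a text, as described above, is often done in its simplest form with a word lexicon. A minimal sketch, where the word lists are illustrative assumptions rather than any published lexicon:

```python
# Assumed toy lexicons; real systems use large, weighted sentiment lexicons.
POSITIVE = {"good", "great", "love", "excellent"}
NEGATIVE = {"bad", "poor", "hate", "terrible"}

def sentiment_score(text):
    """Return (#positive - #negative) / #words: a rough degree of positivity in [-1, 1]."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / max(len(words), 1)

print(sentiment_score("great phone I love it"))  # 0.4
```

Scores above zero suggest appreciation and scores below zero suggest criticism, which is the kind of signal trend analysis aggregates over many tweets or comments.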
DOI: https://doi.org/10.5815/ijmecs.2014.02.04, Pub. Date: 8 Feb. 2014
In computing, software elements such as objects and components emphasize reusability through the design tools of abstraction and separation of concerns. Software architecture emerged as an initial idea for successfully developing huge, complicated and heterogeneous distributed systems. Service Oriented Architecture (SOA) combines services to build systems, and it has had a great impact on the way software systems are developed. SOA addresses the need for standards-based, loosely coupled, protocol-independent distributed computing. It is not easy to ensure the secure transaction of data when data moves through loosely coupled services. A number of techniques have been proposed in the contemporary literature to guide SOA implementation in distributed systems. These techniques offer certain benefits but also pose challenges, such as the use of metadata as frameworks and standards, contract documents, security patterns and security advisers. The objective of this research is to provide a comprehensive analysis of the various approaches used to provide application-level security to web services in SOA. These approaches are compared on a number of parameters, and we critically evaluate the different security methods used in SOA. The study also discusses some future directions in this domain.
DOI: https://doi.org/10.5815/ijmecs.2013.06.08, Pub. Date: 8 Jun. 2013
The area of cloud computing has become popular over the last decade because of its enormous benefits, such as lower cost, faster development and access to highly available resources. Alongside these core benefits come challenges such as QoS, security, trust and better resource management, which arise from the infrastructure services provided on demand by various cloud vendors. Empirical studies on cloud computing report that existing quality-of-service solutions are not sufficient and that many gaps still need to be filled. There is also a dire need for appropriate frameworks to improve the response time of clouds. In this paper, we attempt to fill this gap by proposing a framework that focuses on improving the response-time factor of QoS in the cloud environment, along with related factors such as reliability and scalability. We believe that if cloud nodes communicate effectively and are aware of the nearest and best available resource, the remaining QoS issues can be reduced to a great extent.
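The "nearest and best available resource" idea can be sketched as a simple latency-based selection step; the node names and latency figures below are made-up values for illustration, not part of the proposed framework:

```python
# Assumed measured round-trip latencies (ms) to candidate cloud nodes.
resources = {"node-eu": 120, "node-us": 45, "node-ap": 210}

def best_resource(latency_ms, available):
    """Route a request to the available resource with the lowest measured latency."""
    return min(available, key=latency_ms.__getitem__)

print(best_resource(resources, ["node-eu", "node-us", "node-ap"]))  # node-us
```

A real framework would refresh these measurements continuously and weigh in load and reliability, not latency alone.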
DOI: https://doi.org/10.5815/ijmecs.2013.02.06, Pub. Date: 8 Feb. 2013
A comprehensive requirements engineering (RE) process acts as the backbone of any successful project. RE processes are very complex because most requirements engineering documentation is written in natural languages, which are less formal and often distract the designers and developers of the system. To streamline the different phases of the software lifecycle, we first need to model the requirements document so that we can analyze and integrate the software artifacts. Designers can ensure the completeness and consistency of the system by generating models from the requirements documents. In this paper, we analyze an extreme programming-based RE approach to understand its utility in the requirements elicitation phase. Different RE process models are evaluated, and a comparison with the extreme programming technique is drawn to highlight its merits over conventional RE techniques.
DOI: https://doi.org/10.5815/ijmecs.2013.02.08, Pub. Date: 8 Feb. 2013
Radiologists use medical images to diagnose diseases precisely. However, identifying a brain tumor from medical images is still a critical and complicated job for a radiologist. Brain tumor identification from magnetic resonance imaging (MRI) consists of several stages, and segmentation is known to be an essential step in medical image classification and analysis. Segmenting brain MR images manually is a difficult task with several associated challenges: radiologists and medical experts spend a great deal of time manually segmenting brain MR images, and the task is not repeatable. In view of this, automatic segmentation of brain MR images is needed to correctly segment the White Matter (WM), Gray Matter (GM) and Cerebrospinal Fluid (CSF) tissues of the brain in a shorter span of time. Accurate segmentation is crucial, as misidentification of a disease can lead to severe consequences. Taking these challenges into account, this research highlights the strengths and limitations of the segmentation techniques proposed in the contemporary literature. Besides summarizing the literature, the paper provides a critical evaluation of the surveyed work that reveals new facets of research. Articulating a new technique, however, is beyond the scope of this paper.
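The simplest family of techniques the survey covers labels each voxel by intensity thresholds, since CSF, GM and WM occupy roughly distinct intensity ranges in a T1-weighted image. The threshold values and intensities below are illustrative assumptions only; practical pipelines are far more involved:

```python
def segment(voxels, csf_max=60, gm_max=140):
    """Label each voxel intensity as CSF, GM or WM by simple fixed thresholds."""
    def label(v):
        if v <= csf_max:
            return "CSF"   # darkest tissue in a T1-like image
        return "GM" if v <= gm_max else "WM"
    return [label(v) for v in voxels]

print(segment([30, 100, 200]))  # ['CSF', 'GM', 'WM']
```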
DOI: https://doi.org/10.5815/ijmecs.2013.01.03, Pub. Date: 8 Jan. 2013
A requirement is a capability to which a product or service should conform. Meticulous attention to requirements engineering acts as the backbone of software projects. Ambiguous and unrealistic requirements are a major source of failure in software-intensive systems. Requirements engineering processes are complex, as most requirements engineering documentation is written in natural languages, which are less formal and often distract designers and developers. Requirements management is a continuous process throughout the project lifecycle; it involves documenting, analyzing, tracing and prioritizing requirements and, finally, controlling changes. The main issues in requirements management are usually social, political and cultural. Software requirements engineers who gather the requirements generally consider such issues beyond the scope of their profession, deeming them to fall within the ambit of project management. In this study, we highlight the management issues that arise in the requirements engineering process and explore the possibilities for tackling them amicably. The study is supplemented with a critical review of existing methodologies for resolving and managing software requirements.
DOI: https://doi.org/10.5815/ijmecs.2012.03.02, Pub. Date: 8 Mar. 2012
Most software projects fail to meet the desired level of quality and standards because of the different types of defects introduced during requirements solicitation, design and development. These defects inexorably hinder the secure deployment and smooth operation of software systems. One of the key reasons for this misfortune is the lack of proper defect prevention planning when formulating the software architecture. Defect prevention needs to be a thorough and critical phase because it has a direct impact on product quality, which cannot be compromised. This paper examines different defect prevention techniques and analyses them critically. The scope of this study is restricted to identifying modern trends in defect prevention.