Rishika Chauhan

Work place: Department of Mechanical Engineering, Jaypee University of Engineering and Technology, Guna-473226, Madhya Pradesh, India

E-mail: p.dumka.ipec@gmail.com

Website:

Research Interests: Numerical Analysis, Wireless Sensor Networks, Neural Networks, Wireless Networks

Biography

Rishika Chauhan has been working with the Jaypee University of Engineering and Technology as an assistant professor in the Department of Electronics and Communication Engineering since 2011. She teaches both undergraduate and postgraduate engineering students at the University. Her areas of interest include Wireless Communication, Numerical Computations, Artificial Neural Networks, and Discrete Mathematics. Mrs. Chauhan has published a number of research papers in journals of national and international repute.

Author Articles
Application of Python in Evaluating the Volume of 3D Shapes Using Monte Carlo Simulation

By Pankaj Dumka Rishika Chauhan Dhananjay R. Mishra

DOI: https://doi.org/10.5815/ijem.2026.01.05, Pub. Date: 8 Feb. 2026

Volume estimation of three-dimensional (3D) objects is fundamental in various scientific and engineering fields. While analytical expressions exist for simple geometric shapes, they become impractical for complex or irregular structures. Monte Carlo simulation, a statistical method based on random sampling, offers an efficient numerical alternative. This research explores the application of the Monte Carlo integration method to the estimation of the volumes of three different 3D objects, viz. sphere, cylinder, and cone. The paper elaborates on the mathematical background of the simulation by presenting detailed Python implementations, and analyzes the accuracy, convergence rate, and computational efficiency of the method. The study concludes that Monte Carlo simulation, despite its probabilistic nature, provides an effective and scalable technique for volume estimation, particularly for shapes without closed-form volume expressions.

[...] Read more.
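The abstract above describes estimating volumes by random sampling. As a minimal sketch of the general idea (not the paper's actual implementation), one can sample points uniformly inside a bounding cube and count the fraction that falls inside the shape; the function name `mc_volume_sphere` and its parameters are illustrative assumptions:

```python
import random

def mc_volume_sphere(radius=1.0, n_samples=100_000, seed=42):
    """Estimate a sphere's volume by uniformly sampling points in its
    bounding cube [-r, r]^3 and scaling the cube volume by the hit ratio."""
    random.seed(seed)  # fixed seed for a reproducible estimate
    inside = 0
    for _ in range(n_samples):
        x = random.uniform(-radius, radius)
        y = random.uniform(-radius, radius)
        z = random.uniform(-radius, radius)
        if x * x + y * y + z * z <= radius * radius:
            inside += 1
    cube_volume = (2 * radius) ** 3
    return cube_volume * inside / n_samples
```

For a unit sphere the estimate converges toward the analytical value 4π/3 ≈ 4.18879 as the sample count grows; the same bounding-box counting scheme extends to cylinders and cones by swapping the inside/outside test.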
Modelling Taylor's Table Method for Numerical Differentiation in Python

By Pankaj Dumka Rishika Chauhan Dhananjay R. Mishra

DOI: https://doi.org/10.5815/ijmsc.2023.04.03, Pub. Date: 8 Dec. 2023

In this article, an attempt has been made to explain and model the Taylor table method in Python. A step-by-step algorithm has been developed, and the methodology has been presented for programming. The developed TT_method() function has been tested on four problems, and accurate results have been obtained. The developed function can handle any number of stencil points and produces results instantaneously. This eliminates the task of hand calculation, so the user can focus directly on problem solving rather than spending hours discretizing the problem.

[...] Read more.
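The Taylor table method derives finite-difference weights by matching Taylor-series expansions at each stencil point, which reduces to a small linear system. The sketch below illustrates that idea under stated assumptions; the helper name `fd_coefficients` is hypothetical and is not the paper's TT_method() implementation:

```python
import numpy as np
from math import factorial

def fd_coefficients(stencil, order):
    """Finite-difference weights for the derivative of a given order
    on an arbitrary stencil, from the Taylor-table linear system:
    row k enforces sum_j c_j * s_j**k = k! * delta(k, order)."""
    stencil = np.asarray(stencil, dtype=float)
    n = len(stencil)
    # Vandermonde matrix transposed so each row holds stencil offsets
    # raised to one power (the rows of the Taylor table).
    A = np.vander(stencil, n, increasing=True).T
    b = np.zeros(n)
    b[order] = factorial(order)  # match only the target derivative term
    return np.linalg.solve(A, b)
```

For example, the symmetric stencil [-1, 0, 1] yields the familiar central-difference weights [-1/2, 0, 1/2] for the first derivative and [1, -2, 1] for the second, which is the kind of hand calculation the article's function automates.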
Other Articles