Articles

Access the latest knowledge in applied science, electrical engineering, computer science and information technology, education, and health.

29,602 Article Results

NLP-based fraudulent biomedical news identification using LSTM-SGD deep learning algorithm

10.11591/ijict.v15i1.pp179-188
Siva Dhievaraj , Agusthiyar Ramu
Concern over biomedical fake news is rising, particularly as false information about illnesses, medical procedures, and public health regulations becomes more prevalent. Recognizing such false information is essential, and deep learning (DL) algorithms can offer a potent remedy, especially when paired with sophisticated natural language processing (NLP) methods. The proposed approach improves the model's capacity to ignore frequently used but uninformative terms and to concentrate on important terminology. The model's ability to focus on the phrases most pertinent to fake news identification is enhanced by chi-squared, a statistical test that measures the dependency between variables and aids in removing unnecessary data. The Lasso approach, a form of regression that shrinks less significant coefficients to zero, is used for feature selection, guaranteeing that the model utilizes only the most predictive features for classification. Feature extraction, which turns unprocessed text into numerical data, is a crucial step in preparing the data for DL models. After the structured data has been analyzed, algorithms such as stochastic gradient descent (SGD) and long short-term memory (LSTM) can determine whether or not an article is accurate. By fusing DL with sophisticated NLP techniques to identify biomedical fake news effectively, the authenticity and dependability of medical information shared across platforms can be ensured.
Volume: 15
Issue: 1
Page: 179-188
Publish at: 2026-03-01
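The chi-squared term selection described in the abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the tiny corpus and the terms ("miracle", "the", "vaccine") are invented for the example.

```python
def chi2_term(n11, n10, n01, n00):
    """Chi-squared statistic for a 2x2 term/class contingency table.

    n11: docs containing the term, labeled fake
    n10: docs containing the term, labeled real
    n01: docs without the term, labeled fake
    n00: docs without the term, labeled real
    """
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)
    return num / den if den else 0.0

def rank_terms(docs, labels, vocabulary):
    """Rank vocabulary terms by chi-squared dependence on the fake/real label."""
    scores = {}
    for term in vocabulary:
        n11 = n10 = n01 = n00 = 0
        for doc, y in zip(docs, labels):
            has = term in doc
            if has and y:
                n11 += 1
            elif has:
                n10 += 1
            elif y:
                n01 += 1
            else:
                n00 += 1
        scores[term] = chi2_term(n11, n10, n01, n00)
    return sorted(scores, key=scores.get, reverse=True)

# Toy corpus: each document is a set of tokens, label 1 = fake.
docs = [{"miracle", "cure", "the"}, {"miracle", "the"},
        {"vaccine", "the"}, {"study", "the"}]
labels = [1, 1, 0, 0]
ranked = rank_terms(docs, labels, ["miracle", "the", "vaccine"])
```

A term like "the" appears in every document and scores zero, while a term that only ever co-occurs with the fake class scores highest — exactly the "uninformative vs. important terminology" distinction the abstract describes.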

Development of machine learning techniques for automatic modulation classification and performance analysis under AWGN and fading channels

10.11591/ijict.v15i1.pp287-301
P. G. Varna Kumar Reddy , M. Meena
Automatic modulation classification (AMC) is essential in modern wireless communication for optimizing spectrum usage and adaptive signal processing. This study explores the use of various machine learning (ML) methods for AMC, focusing on their performance in additive white Gaussian noise (AWGN) and fading channels. It evaluates ML classifiers such as support vector machines (SVM), K-nearest neighbors (KNN), decision trees (DT), and ensemble methods on a dataset spanning signal-to-noise ratios (SNRs) from -30 dB to +30 dB. Higher-order statistical features, including moments and cumulants, are used to train the classifiers. Performance is measured in terms of classification accuracy and computational efficiency across different SNR levels. The findings show that linear SVM, fine KNN, and fine trees consistently achieved high classification accuracy, even at low SNRs; linear SVM and fine KNN achieve over 96% accuracy at 0 dB SNR. These classifiers demonstrate significant robustness, maintaining performance in challenging noise conditions. The research highlights the promise of ML techniques in improving AMC, providing a detailed comparison of classifiers and their strengths.
Volume: 15
Issue: 1
Page: 287-301
Publish at: 2026-03-01
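The higher-order cumulant features the abstract mentions are easy to illustrate. The sketch below (not the paper's code) computes the standard fourth-order cumulant C40 = M40 - 3*M20^2 of a unit-power baseband signal; its magnitude is about 2 for BPSK and about 1 for QPSK, so this single feature already separates those two schemes.

```python
import random

def c40(x):
    """Fourth-order cumulant C40 = M40 - 3*M20^2 of a complex baseband
    signal, normalized to unit power — a classic AMC feature."""
    n = len(x)
    p = sum(abs(v) ** 2 for v in x) / n
    x = [v / p ** 0.5 for v in x]          # normalize to unit power
    m20 = sum(v ** 2 for v in x) / n
    m40 = sum(v ** 4 for v in x) / n
    return m40 - 3 * m20 ** 2

rng = random.Random(0)
bpsk = [rng.choice([1 + 0j, -1 + 0j]) for _ in range(4096)]
qpsk = [rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / 2 ** 0.5
        for _ in range(4096)]
# |C40| is ~2 for BPSK and ~1 for QPSK, so any of the classifiers
# in the study could threshold this feature to tell them apart.
```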

GSM-based load monitoring system with ADL classification and smart meter design

10.11591/ijict.v15i1.pp74-83
Debani Prasad Mishra , Rudranarayan Senapati , Rohit Kumar Swain , Subhankar Dash , Raj Alpha Swain , Surender Reddy Salkuti
This paper introduces a method for the classification of activities of daily living (ADL) by utilizing smart meter and smart switch data in a synergistic approach. Through the integration of these internet of things (IoT) devices, the paper aims to enhance the application of ADL classification. Guided by recent advancements in load monitoring and energy management systems, the methodology incorporates machine learning techniques to analyze data streams from both the smart meter and smart switch. Drawing inspiration from prepaid smart meter monitoring systems, IoT-based smart energy meters for optimizing energy usage, and energy metering chips with adaptable computing engines, our design incorporates diverse perspectives. Additionally, we consider the utilization of mobile communication for prepaid meters, remote detection of malfunctioning smart meters, and an empirical investigation into the acceptance of IoT-based smart meters. We substantiate our proposed approach through experimental results, showcasing its effectiveness in accurately classifying diverse ADL scenarios. This research contributes to the field of smart home technology by offering an advanced method for ADL classification. The integration of smart meter and smart switch data provides a comprehensive understanding of energy consumption patterns, opening avenues for improved energy management and informed decision-making within smart homes.
Volume: 15
Issue: 1
Page: 74-83
Publish at: 2026-03-01

Plant disease sensing using image processing (with CNN)

10.11591/ijict.v15i1.pp93-101
Haresh Rajkumar , Harry Jakin S. , Sudhakar Thirumalaivasal Devanathan , Booapthy Kannan
Plant disease is a significant challenge for agriculture, leading to reduced yield, economic loss, and environmental impact. Leveraging digital photos of plant leaves, convolutional neural networks (CNNs) have emerged as promising tools for disease detection. The methodology involves several steps, including image pre-processing, segmentation, and feature extraction using CNNs. Crucially, a diverse dataset comprising images of both healthy and diseased leaves under varying conditions is necessary for training accurate models. Transfer learning, particularly with models pre-trained on large datasets such as ImageNet, can further enhance accuracy, allowing for better performance with fewer training samples. The proposed method demonstrates impressive results, achieving over 95% accuracy and outperforming existing state-of-the-art techniques. This system could serve as a valuable tool for farmers, facilitating timely disease identification and treatment, ultimately leading to increased agricultural yields, reduced financial losses, and the adoption of more sustainable farming practices. Beyond these practical applications, the proposed system holds promise for advancing sustainable agriculture by promoting environmentally friendly farming methods and contributing to the overall resilience and productivity of agricultural systems.
Volume: 15
Issue: 1
Page: 93-101
Publish at: 2026-03-01
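The CNN feature extraction the abstract relies on reduces, at its core, to convolution followed by a nonlinearity. A minimal pure-Python sketch — the edge kernel and toy leaf patch are invented for illustration, not taken from the paper:

```python
def conv2d_relu(img, kernel):
    """Valid 2-D cross-correlation followed by ReLU — the core
    feature-extraction operation inside one CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(img[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(max(s, 0.0))  # ReLU keeps positive activations
        out.append(row)
    return out

# A vertical-edge kernel responds where, say, a lesion border meets
# healthy tissue; here the toy grayscale patch has a bright left half.
edge = [[1.0, 0.0, -1.0]] * 3
patch = [[1.0, 1.0, 0.0, 0.0, 0.0] for _ in range(5)]
fmap = conv2d_relu(patch, edge)
```

The feature map fires along the brightness transition and is zero elsewhere; stacking many learned kernels of this kind is what lets a CNN distinguish diseased from healthy leaf texture.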

Exploring diverse perspectives: enhancing black box testing through machine learning techniques

10.11591/ijict.v15i1.pp238-246
Heba Nafez Jalal , Aysh Alhroob , Ameen Shaheen , Wael Alzyadat
Black box testing plays a crucial role in software development, ensuring system reliability and functionality. However, its effectiveness is often hindered by the sheer volume and complexity of big data, making it difficult to prioritize critical test cases efficiently. Traditional testing methods struggle with scalability, leading to excessive resource consumption and prolonged testing cycles. This study presents an AI-driven test case prioritization (TCP) approach, integrating decision trees and genetic algorithms (GA) to optimize selection, eliminate redundancy, and enhance computational efficiency. Experimental results demonstrate a 96% accuracy rate and a 90% success rate in identifying relevant test cases, significantly improving testing efficiency. These findings contribute to advancing automated software testing methodologies, offering a scalable and efficient solution for handling large-scale, data-intensive testing environments.
Volume: 15
Issue: 1
Page: 238-246
Publish at: 2026-03-01
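A minimal sketch of GA-based test case selection in the spirit of the approach above — the bit-string encoding, the coverage-under-budget fitness function, and the toy data are all assumptions for illustration, not the paper's configuration:

```python
import random

def ga_select(costs, coverage, budget, pop=30, gens=60, seed=1):
    """Toy genetic algorithm: pick a test subset that maximizes covered
    requirements under a cost budget. Genome = one bit per test case."""
    rng = random.Random(seed)
    n = len(costs)

    def fitness(bits):
        cost = sum(c for b, c in zip(bits, costs) if b)
        if cost > budget:
            return -1                       # infeasible selection
        covered = set()
        for b, cov in zip(bits, coverage):
            if b:
                covered |= cov
        return len(covered)

    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        elite = popn[: pop // 2]            # keep the best half
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)     # one-point crossover
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1    # single-bit mutation
            children.append(child)
        popn = elite + children
    return max(popn, key=fitness)

costs = [1, 1, 1, 5]                               # per-test execution cost
coverage = [{1, 2}, {2, 3}, {4}, {1, 2, 3, 4}]     # requirements covered
best = ga_select(costs, coverage, budget=3)
```

On this toy instance the GA should discover that the three cheap tests together cover everything the expensive one does, which is the redundancy-elimination effect the abstract describes.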

Securing DeFi: a comprehensive review of ML approaches for detecting smart contract vulnerabilities and threats

10.11591/ijict.v15i1.pp438-446
Dhivyalakshmi Venkatraman , Manikandan Kuppusamy
The rapid evolution of decentralized finance (DeFi) has brought revolutionary innovations to global financial systems; however, it has also revealed major security vulnerabilities, especially in smart contracts. Traditional auditing methods and static analysis tools often fail to identify sophisticated threats, including reentrancy attacks, front-running, oracle manipulation, and honeypots. This review discusses the growing role of machine learning (ML) in enhancing the security of DeFi systems. It provides a comprehensive overview of modern ML-based methods for detecting smart contract vulnerabilities, transaction-level fraud, and oracle trust issues. The paper also surveys publicly available datasets, necessary toolkits, and architectural designs used for developing and testing these models. Additionally, it outlines future directions such as federated learning, explainable AI, real-time mempool inspection, and cross-chain intelligence sharing. While full of promise, the application of ML in DeFi security is hampered by issues such as data scarcity, interoperability, and explainability. This paper concludes by highlighting the need for standardized benchmarks, shared data initiatives, and the integration of ML into development pipelines to deliver secure, scalable, and reliable DeFi ecosystems.
Volume: 15
Issue: 1
Page: 438-446
Publish at: 2026-03-01

Practice-based teaching using an AI platform to strengthen faculty competency

10.11591/ijict.v15i1.pp171-178
Angsana Phonsuk , Phakharach Plirdpring
This research aimed to i) analyze faculty members’ knowledge, understanding, and skills in using AI for practice-based teaching enhancement, ii) evaluate factors affecting faculty readiness to integrate AI into teaching processes, and iii) design and develop an AI platform to enhance faculty competency in practice-based teaching. The questionnaire, validated by five experts, was administered to 200 respondents divided into two groups: 100 faculty members from public universities and 100 from private universities. Comparative analysis revealed statistically significant differences between public and private university faculty in challenges and concerns at the .05 level, with public university faculty expressing higher concerns. Significant differences were also found in AI experience and skills, attitudes toward AI use, and challenges and concerns. However, no significant differences were observed in three other areas: AI knowledge and understanding, AI readiness, and belief in AI’s effectiveness for practice-based learning enhancement. Data from both groups were utilized in designing and developing the AI platform to enhance practice-based teaching competency in higher education. Expert evaluation showed high demand for the AI platform and high appropriateness of the technology used in its development.
Volume: 15
Issue: 1
Page: 171-178
Publish at: 2026-03-01
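Group comparisons of this kind are commonly tested with a two-sample t-test. As a sketch — the Welch variant and the Likert-scale scores below are assumptions for illustration, not the study's data or necessarily its exact test — the statistic can be computed directly:

```python
import math

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent
    samples with possibly unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)
    t = (ma - mb) / se
    # Welch-Satterthwaite approximation of the degrees of freedom:
    df = (va / na + vb / nb) ** 2 / (
        (va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical Likert-scale "concern" scores for two faculty groups:
public = [4, 5, 4, 4, 5, 4, 3, 5, 4, 4]
private = [3, 3, 4, 2, 3, 3, 4, 3, 2, 3]
t, df = welch_t(public, private)
```

The resulting t is compared against the critical value at the .05 level for df degrees of freedom to decide whether the difference between groups is significant.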

Exploratory data analysis and forecasting of dengue outbreaks in Pangasinan using the ARIMA model

10.11591/ijict.v15i1.pp46-56
Patrick Mole , Thelma Palaoag
Dengue fever remains a critical public health concern in tropical countries like the Philippines, with Pangasinan frequently experiencing outbreaks due to favorable environmental conditions for mosquito breeding. Despite ongoing efforts to control the disease, the absence of a reliable forecasting tool limits the ability of health authorities to implement proactive measures. This study developed a forecasting model using the autoregressive integrated moving average (ARIMA) technique, following an initial exploratory data analysis (EDA) to identify trends and patterns in historical dengue case data from 2019 to 2024. The ARIMA model was trained and validated using historical data, capturing seasonal variations and projecting future dengue outbreaks. The evaluation metrics, including mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE), indicated that the model achieved an accuracy of approximately 78.3%, suggesting reasonable predictive capability. Forecasts for the year 2025 indicate a potential rise in dengue cases, particularly during peak seasons, aligning with observed historical trends. These predictions offer valuable insights for local health authorities, enabling them to plan targeted interventions, allocate resources efficiently, and mitigate the impact of future outbreaks. The study demonstrates the practical application of time series analysis in public health forecasting and provides a proactive tool tailored for the needs of Pangasinan.
Volume: 15
Issue: 1
Page: 46-56
Publish at: 2026-03-01
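The evaluation metrics named in the abstract are straightforward to compute. In this sketch the monthly case counts are invented, and "accuracy" is interpreted as 100 - MAPE, one common convention (the abstract does not state which convention the authors used):

```python
def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    """Root mean squared error."""
    return (sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)) ** 0.5

def mape(y, yhat):
    """Mean absolute percentage error (requires nonzero actuals)."""
    return 100 * sum(abs((a - b) / a) for a, b in zip(y, yhat)) / len(y)

# Hypothetical monthly dengue case counts vs. model forecasts:
actual = [120, 150, 200, 340, 280, 160]
forecast = [110, 160, 210, 300, 300, 150]
accuracy = 100 - mape(actual, forecast)   # the 100 - MAPE convention
```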

Enhancing power grid reliability: a hybrid blockchain and machine learning approach

10.11591/ijape.v15.i1.pp421-429
Ravi V. Angadi , Suresh Kumar , A. K. Vijayalakshmi , G. N. Vidya Shree
Contemporary power grids are becoming more complex with the integration of renewable energy sources, distributed generation, and smart grid technologies. Conventional contingency analysis techniques, based on centralized architectures and static rule-based evaluations, tend to be inadequate for real-time fault detection, automated response, and cybersecurity. This paper proposes a hybrid approach that combines machine learning algorithms with blockchain technology to improve both the predictive intelligence and the security of contingency analysis. For the IEEE 30-bus test case, different line outage and generator failure scenarios were simulated. Machine learning models, including random forest (RF), support vector machine (SVM), and gradient boosting (GB), were trained to classify and predict these contingencies. In parallel, cryptographic primitives such as advanced encryption standard (AES), Rivest-Shamir-Adleman (RSA), and elliptic curve cryptography (ECC) were tested in a blockchain setting to secure event data and enable automatic recovery steps through smart contracts. Results show that GB achieved the highest fault classification accuracy (93.4%), and ECC provided lightweight yet robust data protection for blockchain operations. Compared with the conventional system, the proposed model improved fault response time, accuracy, and system fault tolerance. This two-layer mechanism offers a scalable, proactive, and cyber-safe framework for future power grids.
Volume: 15
Issue: 1
Page: 421-429
Publish at: 2026-03-01
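The blockchain side of the scheme rests on hash-linking event records so that any tampering is detectable. A minimal stand-in using SHA-256 chaining only — no consensus, smart contracts, or the AES/RSA/ECC layer the paper evaluates:

```python
import hashlib
import json

def _block_hash(event, prev_hash):
    """Deterministic SHA-256 digest of an event plus the previous hash."""
    payload = json.dumps({"event": event, "prev": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, event):
    """Append a grid event; each block commits to its predecessor."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"event": event, "prev": prev_hash,
                  "hash": _block_hash(event, prev_hash)})
    return chain

def verify(chain):
    """Recompute every link; tampering with any block breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != _block_hash(
                block["event"], prev):
            return False
        prev = block["hash"]
    return True

# Hypothetical contingency events from an IEEE 30-bus simulation:
chain = []
add_block(chain, {"type": "line_outage", "line": "2-5"})
add_block(chain, {"type": "generator_trip", "bus": 30})
```

Altering any recorded event invalidates every later block, which is the tamper-evidence property the paper leans on for trustworthy contingency logs.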

Preserving non-minimum phase dynamics in model order reduction of fifth-order DC-DC boost converters

10.11591/ijape.v15.i1.pp165-176
Neha Rani , Souvik Ganguli , Manjeet Singh , Sundeep Singh Saini
In this work, a unified modelling approach is developed for the model order reduction of non-minimum phase systems, and an optimization-based method is adopted to address the problem. A new optimization strategy is derived from the coordinated hunting behavior of the Cuban boa. A constrained optimization method reduces a 5th-order boost converter in the unified domain. Comparisons are carried out with multiple classical techniques as well as several widely known nature-inspired algorithms. The step and Bode responses of the proposed method are closer to the original responses than those of the existing techniques. The pole-zero mapping reveals the non-minimum phase nature of the reduced system, and the Nyquist plot confirms its stability. A second-order proportional-integral-derivative (PID) controller is also synthesized using approximate model matching and the Cuban boa snake optimization algorithm (CBSOA), demonstrating superior transient performance, minimal steady-state error, and enhanced robustness.
Volume: 15
Issue: 1
Page: 165-176
Publish at: 2026-03-01
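The PID loop synthesized in the paper can be illustrated with a toy discrete-time simulation. The first-order plant, forward-Euler integration, and gains below are all chosen for the sketch, not taken from the paper (the actual plant there is a 5th-order boost converter):

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    """Discrete PID controller driving a first-order plant
    tau * dy/dt = -y + u, integrated with forward Euler."""
    tau = 0.5
    y = integ = prev_err = 0.0
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt                 # integral accumulation
        deriv = (err - prev_err) / dt     # backward-difference derivative
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        y += dt * (-y + u) / tau          # plant update
    return y

# With integral action the loop should settle at the setpoint,
# i.e. zero steady-state error, as the abstract claims for CBSOA tuning.
final = simulate_pid(kp=2.0, ki=1.0, kd=0.05)
```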

Raindrop and bit drop effects on millimeter wave network performance: a critical review

10.11591/csit.v7i1.p83-92
Victor Dela Gordon , Amevi Acakpovi , George Kwamena Aggrey , Michael Gameli Dziwornu
This PRISMA guided review examines how rain precipitation degrades 5G millimeter wave (mmWave) network performance, with emphasis on rain induced bit drop and its impact on end-to-end quality of service (QoS). From an initial corpus of 13,317 publications screened across IEEE Xplore, ACM Digital Library, ScienceDirect, Google Scholar, and ELICIT, 18 peer reviewed studies published between 2018 and 2024 met the inclusion criteria. Findings show that rainfall significantly weakens mmWave signals, with specific attenuation ranging from approximately 4 to 45 dB/km at 100 mm/h, particularly in tropical regions. When QoS outcomes are reported, these losses manifest as increased bit error rates, rain driven bit drop along the link, higher packet loss and delay, and reduced throughput. Key deficiencies identified include limited empirical validation of attenuation models against packet level QoS, lack of standardized propagation datasets for short range links, and weak treatment of bit level impairments within QoS analysis. To address these gaps, the review recommends enhancing ITU R P.530 and Mie scattering models with region specific measurements, implementing rain aware adaptive protocols, and adopting standardized benchmarking frameworks that link rain attenuation, bit drop, and QoS. This synthesis offers guidance for building climate aware mmWave systems and positions bit drop as a practical metric for precipitation resilience assessment.
Volume: 7
Issue: 1
Page: 83-92
Publish at: 2026-03-01
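The specific-attenuation figures quoted above come from power-law models of the ITU-R P.838 form, gamma = k * R^alpha. In the sketch below the coefficients are placeholders chosen only to land inside the review's 4-45 dB/km range at 100 mm/h; real designs interpolate the tabulated ITU-R P.838 constants for their exact frequency and polarization:

```python
def specific_attenuation(rain_rate, k, alpha):
    """ITU-R P.838 power-law model: gamma = k * R^alpha, in dB/km,
    where R is the rain rate in mm/h."""
    return k * rain_rate ** alpha

# Placeholder coefficients for a mmWave link (assumed, not tabulated):
k, alpha = 0.2, 1.0
for rate in (10, 50, 100):                 # rain rates in mm/h
    gamma = specific_attenuation(rate, k, alpha)
    loss_1km = gamma * 1.0                 # total loss over a 1 km hop
```

With these placeholder values a 100 mm/h downpour costs 20 dB over a single kilometer, which is why short hops and rain-aware adaptation matter so much at mmWave frequencies.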

Development and performance evaluation of a CNN model for seagrass species classification in Bintan, Indonesia

10.11591/csit.v7i1.p20-29
Nurul Hayaty , Hollanda Arief Kusuma
This study presents the development and evaluation of a convolutional neural network (CNN) model for automated seagrass species classification in Bintan, Indonesia. The objective of this research is to examine how different train-validation data split ratios affect model accuracy and generalization performance. The CNN was trained under four configurations (60:40, 70:30, 80:20, and 90:10) to analyze the influence of training data volume on learning convergence and predictive capability. The results indicate that all configurations achieved high validation accuracy, with the best performance reaching 98.53% when using the 90:10 split. Evaluation on unseen data demonstrated that the 60:40 configuration provided the most consistent and reliable generalization. Performance variations were also affected by the morphological similarity between the classified species, which increases the challenge in correctly distinguishing certain classes. Overall, the findings confirm the effectiveness of CNN-based classification for supporting marine biodiversity monitoring and underline the importance of dataset composition in achieving optimal performance. Future improvements will focus on expanding data variability to enhance robustness in real-world scenarios.
Volume: 7
Issue: 1
Page: 20-29
Publish at: 2026-03-01

Review on patch antennas for 5G networks at Ka-band

10.11591/csit.v7i1.p102-110
Md. Nurullah Al Nasib , Md. Sohel Rana
This paper thoroughly examines microstrip antennas for Ka-band wireless applications. The design of microstrip patch antennas for 5G wireless applications has become an established research topic. Patch antennas are made in various shapes, such as rectangles, circles, triangles, and rings, and many substrate materials are used in their design. Microstrip antennas are used across wireless communication technologies, including television broadcasting, microwave ovens, mobile phones, wireless local area networks (LANs), Bluetooth, global positioning systems (GPS), and two-way radios. This article examines the geometric configurations of antennas, the methods used to analyze antenna attributes, antenna dimensions, the challenges antennas face, and potential solutions to those challenges, together with the characteristics and materials from which antennas are constructed. It also reviews the return loss (S11), bandwidth, voltage standing wave ratio (VSWR), gain, directivity, and efficiency reported in prior studies. In the future, a novel patch antenna can be designed for 5G wireless applications.
Volume: 7
Issue: 1
Page: 102-110
Publish at: 2026-03-01
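The patch dimensions such reviews compare typically come from the textbook transmission-line design equations. A sketch for a Ka-band design point — the 28 GHz frequency, substrate permittivity, and height below are assumptions for the example, not values from any reviewed paper:

```python
import math

C = 3e8  # speed of light, m/s

def patch_dimensions(f, er, h):
    """Transmission-line design equations for a rectangular microstrip
    patch: width, radiating length, and effective permittivity.

    f:  resonant frequency (Hz), er: substrate relative permittivity,
    h:  substrate height (m).
    """
    w = C / (2 * f) * math.sqrt(2 / (er + 1))
    e_eff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h / w) ** -0.5
    # Fringing-field length extension (Hammerstad):
    dl = 0.412 * h * ((e_eff + 0.3) * (w / h + 0.264)) / (
         (e_eff - 0.258) * (w / h + 0.8))
    l = C / (2 * f * math.sqrt(e_eff)) - 2 * dl
    return w, l, e_eff

# Assumed Ka-band example: 28 GHz, er = 2.2, 0.254 mm substrate.
w, l, e_eff = patch_dimensions(28e9, 2.2, 0.254e-3)
```

At 28 GHz the patch comes out a few millimeters on a side, which is why Ka-band patches are attractive for compact 5G arrays.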

Towards efficient fog computing in smart cities: balancing energy consumption and delay

10.11591/ijict.v15i1.pp332-342
Ida Syafiza Md Isa , Nur latif Azyze Mohd Shaari Azyze , Haslinah Mohd Nasir , Vigneswara Rao Gannapathy , Ashwini Jayadevan Naidu
In this work, we propose a fog-based energy-delay optimization (F-EDO) approach and benchmark its performance against the cloud-based energy-delay optimization (C-EDO) method, focusing on energy consumption and delay. Unlike previous studies that optimize energy or delay separately, F-EDO minimizes both metrics simultaneously, achieving up to 52.2% energy savings with near-zero delay. Additionally, increasing the number of users also leads to energy savings, owing to the optimized placement of fog servers at the access layer, which reduces network energy consumption compared to C-EDO. F-EDO also reduces delay significantly, with negligible delay compared to C-EDO, because fog servers are placed closer to the users, minimizing transmission distances. The results further show that the energy saving of F-EDO over C-EDO increases as the capacity of the processing server increases, while the delay remains minimal. Overall, F-EDO proves to be a more energy-efficient and lower-delay solution for IoT networks, offering a better alternative to cloud-based offloading.
Volume: 15
Issue: 1
Page: 332-342
Publish at: 2026-03-01
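A toy energy-delay model conveys the intuition behind F-EDO's advantage — fewer network hops to an access-layer fog server than to a remote cloud. Every constant below is an assumption invented for the sketch, so the resulting saving is illustrative only and is not the paper's 52.2% figure:

```python
def offload_cost(n_users, hops, e_per_hop=0.2, bits=1e6,
                 cycles_per_bit=100, e_per_cycle=1e-9,
                 link_rate=1e8, cpu_rate=3e9):
    """Toy per-task energy (J, summed over users) and delay (s) for
    offloading: network transport over `hops` plus server processing.
    All constants are illustrative assumptions."""
    e_net = hops * e_per_hop                         # transport energy
    e_proc = bits * cycles_per_bit * e_per_cycle     # compute energy
    delay = bits / link_rate * hops + bits * cycles_per_bit / cpu_rate
    return n_users * (e_net + e_proc), delay

# Fog: 1 hop to an access-layer server; cloud: 5 hops to the core.
fog_e, fog_d = offload_cost(100, hops=1)
cloud_e, cloud_d = offload_cost(100, hops=5)
saving = 100 * (cloud_e - fog_e) / cloud_e
```

Even this crude model shows both effects the abstract reports: the fog path uses less energy and has lower delay, because the transport term dominates and shrinks with hop count.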

Classification and regression tree model for diabetes prediction

10.11591/ijict.v15i1.pp207-216
Farah Najidah Noorizan , Nur Anida Jumadi , Li Mun Ng
Diabetes mellitus is characterized by excessive blood glucose that occurs when the pancreas malfunctions while producing insulin. High blood glucose levels can cause chronic damage to organs, particularly the eyes and kidneys. Diabetes prediction models traditionally use a variety of machine learning (ML) algorithms, combining data from glucose levels, patient health parameters, and other biomarkers. Prior research on diabetes prediction using algorithms such as support vector machine (SVM) and decision tree (DT) models demonstrates an accuracy rate of approximately 70%, which is relatively modest. Therefore, in this study, a classification and regression tree (CART) multiclassifier model is proposed to improve the accuracy of diabetes prediction across three classes: non-diabetic, pre-diabetic, and diabetic. The study involved data preprocessing, hyperparameter tuning, and evaluation of performance metrics. The model achieved 97% accuracy using 5 leaves per node, a maximum of 10 splits, and deviance as the split criterion, which also yielded a precision of 98%, recall of 97%, and F1-score of 98%, showing that the proposed multiclassifier model can accurately predict diabetes. In conclusion, the proposed CART model with the best hyperparameter setting achieves the highest accuracy in predicting diabetes classes.
Volume: 15
Issue: 1
Page: 207-216
Publish at: 2026-03-01
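The node-splitting step at the heart of CART, using the deviance criterion the study selected, can be sketched directly. The feature values and class labels below are invented for the example (they are not the study's data):

```python
import math

def deviance(labels):
    """Multiclass deviance (cross-entropy) impurity of a label list."""
    n = len(labels)
    imp = 0.0
    for c in set(labels):
        p = labels.count(c) / n
        imp -= p * math.log(p)
    return imp

def best_split(values, labels):
    """Best single threshold on one feature by weighted child deviance —
    what a CART node evaluates when growing the tree."""
    order = sorted(zip(values, labels))
    xs = [v for v, _ in order]
    ys = [c for _, c in order]
    n = len(ys)
    best_thr, best_score = None, float("inf")
    for i in range(1, n):
        if xs[i] == xs[i - 1]:
            continue                       # no threshold between ties
        thr = (xs[i] + xs[i - 1]) / 2      # midpoint candidate
        score = (i * deviance(ys[:i]) + (n - i) * deviance(ys[i:])) / n
        if score < best_score:
            best_thr, best_score = thr, score
    return best_thr, best_score

# Hypothetical glucose-like values; classes 0 = non-diabetic,
# 1 = pre-diabetic, 2 = diabetic.
vals = [5.0, 5.2, 5.9, 6.1, 6.6, 7.2, 8.0]
labs = [0, 0, 0, 1, 1, 2, 2]
thr, score = best_split(vals, labs)
```

The chosen threshold cleanly separates the non-diabetic class from the rest, which is exactly the greedy per-node decision that, repeated recursively under the leaf and split limits the study tuned, builds the full tree.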

Discover Our Library

Embark on a journey through our expansive collection of articles and let curiosity lead your path to innovation.
