Articles

Access the latest knowledge in applied science, electrical engineering, computer science and information technology, education, and health.

29,602 Article Results

Design and development of machine learning-based web application for oil palm yield prediction

10.11591/ijict.v15i1.pp228-237
Yuhao Ang , Helmi Shafri , Yang Ping Lee , Shahrul Azman Bakar , Hwee San Lim , Rosni Abdullah , Yusri Yusup , Mohammed Mustafa AL-Habshi
The prediction of crop yields is influenced by various factors such as weather conditions, agronomic practices, and management strategies. Accurately predicting oil palm yield is crucial for sustainable production, as it plays a significant role in global food security. Challenges such as climate change and nutrient deficiencies have adversely affected yields, highlighting the necessity for a specialized web application tailored to the oil palm industry. This study presents a machine-learning-based web application that utilizes a deep learning model to estimate oil palm yields by integrating key parameters, including weather, agronomy, and satellite data. The application features a user-friendly interface and a dashboard for comparing predicted and actual yields, enhancing user engagement and facilitating collaboration among stakeholders. By deploying this tool on the cloud, plantation managers can make informed decisions early in the yield prediction process, ultimately improving plantation management and profitability. This web application is designed to provide valuable insights to stakeholders, contributing to effective decision-making in the oil palm sector.
Volume: 15
Issue: 1
Page: 228-237
Publish at: 2026-03-01

FPGA implementation of a coprocessor architecture for random data generation and encryption

10.11591/ijres.v15.i1.pp21-30
Manoj Kumar
Coprocessors are designed to perform specific tasks to enhance system performance and speed. Information security is the main focus in internet of things (IoT), cryptography, and cybersecurity applications. In this work, a coprocessor architecture is designed to generate 4 bits of random data and perform encryption. The coprocessor architecture uses true random number generator (TRNG) and pseudo-random number generator (PRNG) architectures to generate random data. A modified linear feedback shift register (LFSR)-based PRNG and modified transition effect ring oscillator (TERO) and ring oscillator-based TRNG architectures are designed and implemented for performing encryption. A serial-in-parallel-out (SIPO) shift register circuit is used to generate the 4-bit random data. A 15-bit instruction word is assigned to the coprocessor architecture to perform its task. The coprocessor architecture is designed using VHSIC hardware description language (VHDL) and implemented on an Artix-7 field programmable gate array (FPGA). All simulation and synthesis results of the proposed coprocessor architecture are obtained with the Xilinx Vivado 2015.2 tool. The coprocessor architecture's efficiency (throughput (Mbps)/LUTs) is 2.31, and it operates at a 100 MHz clock.
Volume: 15
Issue: 1
Page: 21-30
Publish at: 2026-03-01
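The LFSR-based PRNG and SIPO collection described in the abstract above can be illustrated with a minimal Python sketch: a Fibonacci LFSR emits a serial bit stream, and a SIPO-style collector groups it into 4-bit values. The 15-bit width, tap positions, and seed here are assumptions chosen for the sketch, not the paper's actual hardware configuration.

```python
def lfsr_stream(seed, taps=(15, 14), width=15):
    """Fibonacci LFSR: XOR the tap bits, shift the feedback in at the LSB.

    Yields one pseudo-random bit per clock step.
    """
    state = seed
    while True:
        fb = 0
        for t in taps:                       # taps are 1-indexed bit positions
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & ((1 << width) - 1)
        yield state & 1


def next_nibble(bits):
    """SIPO-style: shift 4 serial bits into one parallel 4-bit value."""
    n = 0
    for _ in range(4):
        n = (n << 1) | next(bits)
    return n


bits = lfsr_stream(seed=0b101010101010101)
sample = [next_nibble(bits) for _ in range(4)]  # four 4-bit pseudo-random values
```

Because the LFSR is deterministic for a given seed, the same seed reproduces the same 4-bit sequence, which is the property that distinguishes a PRNG from the paper's TERO/ring-oscillator TRNG.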

Heart disease prediction using hybrid deep learning and medical imaging with wavelet-based feature extraction

10.11591/ijres.v15.i1.pp183-193
Chairmadurai Palanisamy , Kavitha Pachamuthu , Arun Kumar Ramamoorthy
Heart disease prediction is based on patient medical information, which can include medical images as well as the results of an electrocardiogram (ECG) conducted to determine the risk of developing heart disease. Hybrid deep learning (DL) algorithms are developed using past data to identify trends related to cardiovascular diseases (CVDs). This paper offers a new method of heart disease prediction that combines high-quality image processing and hybrid DL to enhance prediction effectiveness and avoid the shortcomings of current approaches. First, medical images such as ECG images are pre-processed with a Butterworth adaptive 2D wavelet filter, which reduces noise while preserving spatial and frequency information. A Gabor wavelet-based feature extraction technique is applied to extract meaningful patterns, including both spatial and frequency domain information, which is essential for detecting heart-related anomalies. The resulting features are then classified using a hybrid of convolutional neural networks (CNN) and long short-term memory (LSTM) to make reliable and precise predictions of heart disease. Performance indicators including accuracy (92.4%), precision (91.2%), recall (93.5%), and F1-score (91.0%) are reported. The model yields significant levels of reliability and generalization compared to traditional approaches.
Volume: 15
Issue: 1
Page: 183-193
Publish at: 2026-03-01

Deployment and evaluation of facial expression recognition on Android and Temi V3 in controlled settings

10.11591/ijres.v15.i1.pp42-53
Mohamad Hariz Nazamid , Rozita Jailani , Nur Khalidah Zakaria , Anwar P. P. Abdul Majeed
Facial expression recognition (FER) is vital for improving human-robot interaction (HRI). This study presents the deployment and evaluation of an optimized FER model on Android devices, specifically tested on the Temi V3 robot in controlled environments. Trained using FER+ and CK+ datasets and optimized with TensorFlow Lite (TFLite) and MobileNetV2, the model achieved a validation accuracy of 92.32%. Its performance was assessed on the Temi V3 robot and a Samsung A52 smartphone, focusing on CPU usage, memory, and power consumption. Cross-device compatibility and real-time performance challenges were addressed through model quantization and thread optimization. Real-time testing on the Temi V3 showed an overall accuracy of 82.28%, with emotion-specific accuracies ranging from 46.19% to 92.28%. This study offers practical insights for optimizing FER systems across Android platforms, with potential applications in education, healthcare, and customer service. The results support the feasibility of implementing FER models as backends in Android applications, enabling more intuitive and responsive HRI. Future work will focus on improving model efficiency for lower-end devices and exploring on-device learning techniques to boost accuracy in diverse real-world environments.
Volume: 15
Issue: 1
Page: 42-53
Publish at: 2026-03-01

Enhancing power grid reliability: a hybrid blockchain and machine learning approach

10.11591/ijape.v15.i1.pp421-429
Ravi V. Angadi , Suresh Kumar , A. K. Vijayalakshmi , G. N. Vidya Shree
Contemporary power grids are becoming more complex with the integration of renewable energy sources, distributed generation, and smart grid technologies. Conventional contingency analysis techniques, based on centralized architectures and static rule-based evaluations, tend to be inadequate for real-time fault detection, automated response, and cybersecurity. This paper proposes a hybrid approach that combines machine learning algorithms with blockchain technology to improve both the predictive intelligence and the security of contingency analysis. For the IEEE 30-bus test case, different line outage and generator failure cases were simulated. Several machine learning models, such as random forest (RF), support vector machine (SVM), and gradient boosting (GB), were trained to classify and predict these contingencies. In parallel, cryptographic primitives such as the advanced encryption standard (AES), Rivest-Shamir-Adleman (RSA), and elliptic curve cryptography (ECC) were tested in a blockchain setting to secure event data and enable automatic recovery steps through smart contracts. The results show that GB achieved the highest fault classification rate (93.4%), and ECC provided lightweight yet robust data protection for blockchain operations. Compared with the conventional system, the designed model improved fault response time, accuracy, and system fault tolerance. This two-layer mechanism presents a scalable, proactive, and cyber-safe approach for future power grids.
Volume: 15
Issue: 1
Page: 421-429
Publish at: 2026-03-01

IoT cloud integration with EfficientNet-B7 for real-time pest monitoring and leaf-based classification

10.11591/ijres.v15.i1.pp150-158
Sabapathi Shanmugam , Vijayalakshmi Natarajan
The increasing prevalence of pest infestations poses a significant threat to global agricultural productivity, often resulting in substantial yield losses and economic damage. To address this challenge, this paper proposes an intelligent, cloud-enabled pest detection and classification framework leveraging state-of-the-art deep learning techniques. The proposed system integrates YOLOv8 for rapid and accurate pest detection with EfficientNet-B7 for fine-grained species-level classification. The framework is trained and evaluated using the Pestopia dataset, which contains annotated images representing diverse pest species. To enhance data diversity, robustness, and model generalization, data augmentation techniques such as center cropping and horizontal flipping are applied during preprocessing. YOLOv8 is employed to detect and localize pest instances within images, while EfficientNet-B7 extracts high-level discriminative features from detected regions to enable precise species identification. Furthermore, the system incorporates cloud-based real-time monitoring through Adafruit IO, enabling scalable, remote access to pest information for timely decision-making. The performance of the proposed framework is evaluated using standard metrics, including accuracy, precision, recall, and F1-score, achieving values of 97.8%, 98.9%, 98.4%, and 98.9%, respectively. The experimental results demonstrate the effectiveness and reliability of the proposed approach for real-time pest management. The cloud-integrated architecture facilitates proactive pest control strategies, supporting smarter, data-driven agricultural practices, and improved crop protection.
Volume: 15
Issue: 1
Page: 150-158
Publish at: 2026-03-01

Energy-efficient reconfigurable architectures for Edge AI in healthcare IoT: trends, challenges, and future directions

10.11591/ijres.v15.i1.pp1-20
Tole Sutikno , Aiman Zakwan Jidin , Lina Handayani
The integration of Edge artificial intelligence (AI) with internet of things (IoT) technologies is transforming healthcare applications, including wearable monitoring, telemedicine, and implantable medical devices, by enabling low-latency and intelligent data processing close to patients. However, stringent requirements on energy efficiency, reliability, real-time responsiveness, and data privacy continue to hinder scalable and long-term deployment in resource-constrained healthcare environments. Energy-efficient reconfigurable architectures—such as field-programmable gate arrays (FPGAs), coarse-grained reconfigurable arrays (CGRAs), and emerging memory-centric and heterogeneous platforms—have emerged as promising solutions to address these challenges by balancing flexibility, adaptability, and power efficiency. This review systematically examines recent advances in reconfigurable Edge AI architectures for healthcare IoT, highlighting key trends in hardware–software co-design, AI-assisted design automation, memory-centric optimization, and domain-specific overlays. It further identifies critical challenges, including energy–performance trade-offs, runtime reconfiguration overheads, security and privacy vulnerabilities, limited standardization, and reliability concerns in dynamic clinical settings. Finally, future research directions are outlined, emphasizing self-optimizing and context-aware architectures, secure and trustworthy reconfiguration mechanisms, unified frameworks for heterogeneous healthcare workloads, and sustainable, carbon-aware edge computing. Collectively, this review positions energy-efficient reconfigurable architectures as a foundational enabler for next-generation Edge AI in IoT-enabled healthcare systems.
Volume: 15
Issue: 1
Page: 1-20
Publish at: 2026-03-01

FPGA implementation and bit error rate analysis of the forward error correction algorithms in voice signals

10.11591/ijres.v15.i1.pp86-96
Ramjan Khatik , Afzal Shaikh , Shraddha Sawant , Pritika Patil
Viterbi codes are widely used in wireless communication systems because of the relatively low complexity of decoding the transmitted message. This paper develops a performance analysis of the decoder by means of bit error rate (BER) examination. The Galois field-based decoder algorithm is commonly used in communication systems. The Viterbi-based decoder is implemented using a field programmable gate array (FPGA) and MATLAB, and the paper compares the performance of both algorithms. The reconfigurable MicroBlaze processor on a Spartan-3E FPGA is used for this purpose. MATLAB code is used to perform the BER analysis on the FPGA implementation output.
Volume: 15
Issue: 1
Page: 86-96
Publish at: 2026-03-01
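The BER metric at the center of the analysis above is simple to sketch: assuming two equal-length bit sequences, BER is the fraction of positions where they disagree. The bit patterns below are illustrative, not data from the paper.

```python
def bit_error_rate(sent, received):
    """BER = number of differing bits / total number of bits compared."""
    if len(sent) != len(received):
        raise ValueError("sequences must be the same length")
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)


tx = [1, 0, 1, 1, 0, 0, 1, 0]
rx = [1, 0, 0, 1, 0, 1, 1, 0]   # two bits flipped by the channel
ber = bit_error_rate(tx, rx)    # 2 errors over 8 bits -> 0.25
```

In practice a BER curve is built by repeating this count at several signal-to-noise ratios, which is what a MATLAB-side analysis of FPGA output amounts to.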

Inquisitive biometric feature analysis and implementation for recognition tasks using camouflaged segmentation with AI and IoT

10.11591/ijres.v15.i1.pp119-129
Mahesh Shankarrao Patil , Harsha J. Sarode , Abhijit Banubakode , Prakash Tukaram Patil , Nutan Patil , Vijayakumar Varadarajan , Deshinta Arrova Dewi
Human activity recognition (HAR) plays a vital role in reconfigurable and embedded systems deployed in smart environments and healthcare monitoring applications. However, the potential leakage of sensitive user attributes raises serious privacy issues, because data collected from end devices must be transmitted to more powerful platforms for inference. Addressing this challenge is particularly crucial for resource-constrained embedded systems, where energy efficiency is a chief design requirement. This paper presents an energy-aware, privacy-preserving HAR framework suitable for low-power embedded platforms. A machine learning-based camouflaged signal segmentation technique is proposed to transform the collected sensor data by eliminating sensitive information while preserving activity-relevant features. Model parameters are extensively tuned to characterize the trade-off between energy consumption and recognition accuracy. Experimental evaluations demonstrate that the method significantly reduces the inference of sensitive attributes such as gender, age, height, and weight, with minimal impact on HAR accuracy. Furthermore, the system supports configurable trade-offs between energy usage and classification performance, making it suitable for implementation on low-power embedded devices.
Volume: 15
Issue: 1
Page: 119-129
Publish at: 2026-03-01

Remote procedure call communication and control of autonomous mobile robot for indoor smart waste monitoring

10.11591/ijra.v15i1.pp89-98
Ashaari Yusof , Abdullah Man , Azmi Ibrahim , Mohamed Ashraf Husni Zai , Md. Jakir Hossen
The integration of autonomous mobile robots (AMRs) and Internet of Things (IoT) technology has revolutionized various industries, including smart waste management (SWM). In this paper, the implementation of a customized remote procedure call (RPC) methodology is demonstrated. This methodology facilitates control and monitoring of AMRs for smart indoor waste management to collect and dispose of waste, monitor bin threshold levels, and report relevant parameters to a cloud-based platform. Key operational parameters from the AMR and the smart bins are presented on an assembled user dashboard, ensuring seamless monitoring for indoor waste management. Our findings underscore the relevance of RPC in advancing smart waste management technologies, contributing to operational efficiency and sustainability.
Volume: 15
Issue: 1
Page: 89-98
Publish at: 2026-03-01

Car selection in games using multi-objective optimization by ratio analysis based on player achievement

10.11591/csit.v7i1.p30-45
Caesar Nafiansyah Putra , Fresy Nugroho , Mochamad Imamudin , Dwi Pebrianti , Jehad Abdelhamid Hammad , Tri Mukti Lestari , Dian Maharani , Alfina Nurrahman
The selection menu in some racing games uses a random system for vehicle selection. However, this random feature generally randomizes the selection index without considering factors that reflect the player's abilities. Therefore, this study aims to develop a racing game that can suggest vehicles adjusted to the player's performance. Vehicle recommendations are made using the multi-objective optimization on the basis of ratio analysis (MOORA) method. The MOORA calculation ranks vehicles based on criteria such as mileage, fuel efficiency, speed, agility, and others collected in previous games. The results of this study show the effectiveness of the MOORA method in recommending vehicles that match the player's skills, thereby improving the overall player experience. In addition, the usability test produced a system usability scale (SUS) score of 82.4, placing it in the very good category.
Volume: 7
Issue: 1
Page: 30-45
Publish at: 2026-03-01
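The MOORA ranking described above can be sketched in a few lines: vector-normalize each criterion column, then score each alternative as the sum of its benefit-criterion ratios minus the sum of its cost-criterion ratios, and rank by score. The car attributes below are hypothetical values, not the paper's data.

```python
import math


def moora_rank(matrix, benefit):
    """Rank alternatives (rows) over criteria (columns) with MOORA."""
    ncrit = len(matrix[0])
    # vector norm of each criterion column
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncrit)]
    scores = []
    for row in matrix:
        s = 0.0
        for j, x in enumerate(row):
            ratio = x / norms[j] if norms[j] else 0.0
            s += ratio if benefit[j] else -ratio  # benefit adds, cost subtracts
        scores.append(s)
    order = sorted(range(len(matrix)), key=lambda i: scores[i], reverse=True)
    return order, scores


# hypothetical cars: [mileage, fuel efficiency, top speed], all benefit criteria
cars = [[120, 8.0, 180], [150, 6.5, 200], [100, 9.0, 160]]
order, scores = moora_rank(cars, benefit=[True, True, True])  # best car first
```

With these sample numbers the second car's strong mileage and speed outweigh its weaker fuel efficiency, so it ranks first.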

Analysis of congestion management using generation rescheduling with augmented Mountain Gazelle optimizer

10.11591/ijict.v15i1.pp57-65
Chidambararaj Natarajan , Aravindhan Karunanithy , S. Jothika , R. P. Linda Joice
This study presents an original congestion management approach using generation rescheduling with the augmented mountain gazelle optimizer (AMGO). Inspired by the adaptability of mountain gazelles, AMGO is applied to optimize generation schedules for a realistic power system scenario. The method effectively mitigates congestion while taking into account operational constraints, market dynamics, and uncertainties. Simulation results demonstrate AMGO's robustness, competitiveness, and efficiency in comparison with existing methods. In addition to its robustness in congestion management, AMGO introduces a state-of-the-art adaptive feature, inspired by the agility of mountain gazelles, enabling real-time adjustment to evolving power system conditions, and is compared with genetic algorithms and PSO. The study contributes to advancing optimization techniques for congestion management, offering a promising tool for improving power system reliability and efficiency.
Volume: 15
Issue: 1
Page: 57-65
Publish at: 2026-03-01

Enhancing intellectual property rights management through blockchain integration

10.11591/ijict.v15i1.pp111-119
Raghavan Sheeja , Sherwin Richard R. , Shreenidhi Kovai Sivabalan , Srinivas Madhavan
Technological advancement has significantly transformed several industries, and the area of intellectual property rights (IPR) is no exception. IPRs, being as important as they are, need to be securely managed. Blockchain, with its decentralized and immutable nature, offers a promising answer for enhancing the management of intellectual property (IP). This paper explores the strategic integration of blockchain technology for the management of IPR. The proposed system comprises a complete pipeline, from registration and validation to predictive evaluation and royalty distribution, all facilitated through smart contracts. The use of zero-knowledge proofs guarantees the security and confidentiality of sensitive information. The paper discusses the advantages and future implications of implementing such a system.
Volume: 15
Issue: 1
Page: 111-119
Publish at: 2026-03-01

Classification and regression tree model for diabetes prediction

10.11591/ijict.v15i1.pp207-216
Farah Najidah Noorizan , Nur Anida Jumadi , Li Mun Ng
Diabetes mellitus is characterized by excessive blood glucose that occurs when the pancreas malfunctions while producing insulin. High blood glucose levels can cause chronic damage to organs, particularly the eyes and kidneys. Diabetes prediction models traditionally use a variety of machine learning (ML) algorithms that combine data on glucose levels, patient health parameters, and other biomarkers. Prior research on diabetes prediction using various algorithms, such as support vector machine (SVM) and decision tree (DT) models, demonstrates an accuracy rate of approximately 70%, which is relatively modest. Therefore, in this study, a classification and regression tree (CART) multiclassifier model is proposed to improve the accuracy of diabetes prediction across three classes: non-diabetic, pre-diabetic, and diabetic. The study involved data preprocessing, hyperparameter tuning, and evaluation of performance metrics. The model achieved 97% accuracy using 5 leaves per node, a maximum of 10 splits, and deviance as the split criterion, which also resulted in a precision of 98%, recall of 97%, and F1-score of 98%, showing that the proposed multiclassifier model can accurately predict diabetes. In conclusion, the proposed CART model with the best hyperparameter setting achieves the highest accuracy in predicting diabetes classes.
Volume: 15
Issue: 1
Page: 207-216
Publish at: 2026-03-01
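The split search at the heart of a CART classifier can be illustrated with a small Gini-impurity sketch over a single feature: try each candidate threshold and keep the one minimizing the weighted impurity of the two children. The glucose values and three-class labels below are invented for illustration and are not the study's data.

```python
def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())


def best_split(xs, ys):
    """Scan candidate thresholds on one feature; return the split that
    minimizes the weighted Gini impurity of the two children."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        w = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if w < best[1]:
            best = (t, w)
    return best


# hypothetical readings; labels 0=non-diabetic, 1=pre-diabetic, 2=diabetic
glucose = [90, 95, 110, 125, 140, 160]
label = [0, 0, 1, 1, 2, 2]
threshold, impurity = best_split(glucose, label)
```

A full CART grows a tree by applying this search recursively, subject to limits such as the study's maximum leaf and split counts.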

Reputation-enhanced two-way hybrid algorithm for detecting attacks in WSN

10.11591/ijict.v15i1.pp428-437
Divya Bharathi Selvaraj , Veni Sundaram
Wireless sensor networks (WSNs) are susceptible to a variety of attacks, such as data tampering, blackhole, and grayhole attacks, that can affect the reliability of communication. We propose a reputation-enhanced two-way hybrid algorithm (RCHA) that uses cryptographic hash functions and reputation-based trust management to detect and de-escalate attacks accurately. The RCHA algorithm implements two hash functions, RACE integrity primitives evaluation message digest (RIPEMD) and secure hash algorithm 3 (SHA-3), to perform integrity checks on every packet sent across the network. Every node in the WSN tracks a reputation score for each neighbor it is connected to, and this score is dynamically updated based on the neighbor's behavior. If a neighboring node's reputation drops below a threshold, it is flagged as malicious. The detecting node then broadcasts an alert message to its neighboring nodes and begins to reroute its data through one of its trusted neighbors to ensure reliable communication. Simulation results show that the RCHA algorithm improved the attack detection rate and the number of packets delivered compared to traditional attack detection methods. The RCHA algorithm maintained low computational and energy overhead, making it an attractive option for resource-constrained WSN applications. Given the trend toward more collaborative networks, the reputation mechanism in RCHA improves the overall reliability and capability of the WSN in the presence of adversaries.
Volume: 15
Issue: 1
Page: 428-437
Publish at: 2026-03-01
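A minimal sketch of the reputation-tracking idea described above: each integrity-check outcome nudges the neighbor's score, and a score below a threshold flags the node as malicious. The exponential moving average update rule, the alpha value, and the threshold are illustrative assumptions, not RCHA's actual parameters.

```python
MALICIOUS_THRESHOLD = 0.3  # assumed cutoff below which a neighbor is flagged


def update_reputation(rep, ok, alpha=0.2):
    """Exponential moving average of observed behavior: 1 for a packet
    that passed the hash integrity check, 0 for a failed check."""
    return (1 - alpha) * rep + alpha * (1.0 if ok else 0.0)


def is_malicious(rep):
    """A neighbor whose score falls below the threshold is flagged."""
    return rep < MALICIOUS_THRESHOLD


rep = 0.5  # neutral starting reputation for a new neighbor
for ok in [False, False, False, False, False]:  # repeated tampered packets
    rep = update_reputation(rep, ok)
# after five failed integrity checks the neighbor is flagged as malicious
```

A smoothed update like this tolerates isolated packet corruption while still converging quickly on a neighbor that tampers repeatedly, at which point rerouting through trusted neighbors would kick in.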