Journal of Autonomous Intelligence (2024) Volume 7 Issue 5
doi: 10.32629/jai.v7i5.1622

Original Research Article

Deep learning techniques for predicting the customer lifetime value to improve customer relationship management

Nabeel S. Alsharafa1, P. Madhubala2, Sree Lakshmi Moorthygari3, K. N. Rajapraveen4, B. R. Kumar5, Sudhakar Sengan6,*, Pankaj Dadheech7

1 Department of Information Technology, College of Science, University of Warith Al-Anbiyaa, Karbala 56003, Iraq
2 Department of Computer Science and Engineering, Bharathiyar Institute of Engineering for Women, Tamil Nadu 636112, India
3 Department of Business Management, Mahatma Gandhi University, Telangana 508003, India
4 Department of CSE, JAIN (Deemed-to-be University), Karnataka 560069, India
5 Department of MBA, Andhra Loyola College, Vijayawada 520008, India
6 Department of Computer Science and Engineering, PSN College of Engineering and Technology, Tamil Nadu 627451, India
7 Department of Computer Science and Engineering, Swami Keshvanand Institute of Technology, Management & Gramothan (SKIT), Rajasthan 302017, India

* Corresponding author: Sudhakar Sengan,

[email protected]

ABSTRACT
Deep Learning (DL) is used in the current research to forecast Customer Lifetime Value (CLV) and optimise CRM. ML models can be adapted and used alongside CRM methods to recognise anomalies in customer behaviour amid numerous customer relationships, heterogeneous statistics, and time-sensitive data. This technique allows companies to retain customers and improve profit, advertising, and confidence across income segments. First, the study recommends a multi-output Deep Neural Network (DNN) model for predicting CLV. The suggested framework was evaluated against multi-output Decision Tree (DT) and multi-output Random Forest (RF) techniques on the same dataset. The study presents a multilayer supervised DL-based CLV prediction technique that enhances features on limited data, outperforming the baseline techniques in marketing effectiveness and customer lifetime value. The research explores using CLV prediction in personalised customer experiences, highlighting its potential to enhance CRM strategies by incorporating dynamic variables and current data for improved accuracy. The Deep Neural Network model achieves acceptable overall error rates, with a MAPE of 10.3%, MSE of 11.6%, and RMSE of 12.29%.

Keywords: deep learning; customer lifetime value; accuracy; CRM; MAPE; MSE; RMSE

ARTICLE INFO
Received: 29 February 2024
Accepted: 18 March 2024
Available online: 17 May 2024

COPYRIGHT
Copyright © 2024 by author(s). Journal of Autonomous Intelligence is published by Frontier Scientific Publishing. This work is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). https://creativecommons.org/licenses/by-nc/4.0/

1. Introduction
Both research and business have investigated Customer Relationship Management (CRM) since the 1980s because it can increase business profits and market benefits.
Businesses can classify customers by consumption behaviour, lifetime value, and other parameters to calculate customer value for CRM. Successful marketing approaches demand empirical studies on Customer Lifetime Value (CLV)[1–5]. CLV is an integral measure for organizations because it predicts future customer value. Due to the accessibility of enormous customer data sets, particularly from digital service providers, CLV forecasting is increasing in demand. Organizations use this data, namely consumer buying behaviour and product usage data, to predict future buying behaviour. Online shopping is now an everyday activity, and attending to customer requirements reduces the time it demands. Studying customer feedback to obtain a general idea or mean sentiment about a good or service enables better and more effective online purchasing[6–10]. This has resulted in an essential rise in the volume of individuals making online purchases, resulting in a boom in the number of firms that sell items and provide products online[11–15]. The market has grown increasingly competitive as additional e-commerce businesses enter it. Therefore, to compete successfully with other businesses in e-commerce, firms must maximise the effectiveness of their marketing ideas. This process is carried out to find profitable customers and those more likely to stay with the company. The methodologies of categorization and grouping can be employed to generate clusters of customers with similar features based on customers' individual and historical information. This research designs models that predict the CLV in order to retain existing customers. The CLV gives the business a method to measure how significant a customer is to the enterprise. When companies are more aware of the value their customers provide, they can allocate their financial resources properly.
To boost economic viability, the preliminary aim of the process is to increase the number of transactions involving customers. Among the strategies employed are market basket analysis, response analysis, and cross-selling. The framework was divided into four objectives to address this research hypothesis:
1) According to the research, predicting CLV with additional variables could enhance CLV prediction and classification.
2) What model is optimum for forecasting output features over CLV?
3) What is the most accurate approach to analysing the findings of the DL model?
4) Identify future profitable customers for multi-level e-commerce businesses.
This paper presents a detailed review of background studies on related research in Section 2; details the proposed methodology, models used, data, explainability, and use of result measures for user segmentation and CLV design in Section 3; applies ARIMA to CLV forecasting in Section 4; presents results in Section 5; and concludes in Section 6 with future work recommendations.

2. Related works
CLV prediction research and hybrid approaches to prediction models support our investigations. CLV is a crucial advertising statistic for maintaining customers and funds. It also increases value for investors. The authors observed that acquiring fresh consumers is costlier than retaining existing ones, prompting decades of study on CLV. The above approach considers one-time purchases and long-term customer value[16–20]. Binary classifier and regression approaches were employed to predict CLV in a multi-level model. They divided the CLV model into AOV and Freq regression models. Marketing theory suggests CLV comprises customer retention and sales subprocesses. By predicting these factors, marketers may recognise customers with a low profit risk but excellent expected purchasing and target them as high-value CLV customers[21–25]. The authors developed an integrated CLV analysis method combining a deep neural network with a zero-inflated lognormal distribution.
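As a toy illustration of the multi-level decomposition just described, where CLV is assembled from sub-model outputs such as a retention term, average order value (AOV), and purchase frequency, the sketch below multiplies stand-in sub-model predictions; all numbers are illustrative assumptions, not outputs of the cited systems.

```python
import numpy as np

# Stand-in sub-model outputs for five customers: probability the customer
# stays active, Average Order Value (AOV), and expected purchase frequency.
p_active = np.array([0.9, 0.4, 0.7, 0.95, 0.2])
aov = np.array([50.0, 120.0, 80.0, 30.0, 200.0])  # currency units per order
freq = np.array([4.0, 1.0, 2.5, 6.0, 0.5])        # expected orders per period

# Final CLV as the product of the sub-model outputs, as in the
# multi-level modelling described above.
clv = p_active * aov * freq
print(clv)  # customer 0: 0.9 * 50 * 4 = 180.0
```

In a real system each factor would come from its own trained model (e.g. a churn classifier and two regressors) rather than fixed arrays.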
Investors assumed low CLV implied customer loss. The platform informs business choices with churn, CLV predictions, average order value, and frequency. Multiple-stage modelling was employed, with the final CLV being the product of three sub-model outputs[26–30]. Assuming customers have a fixed churn risk after making a purchase, the authors substitute the Pareto distribution with a Beta-geometric distribution. Markov Chains are valid CLV techniques since they are model-free[31–35]. According to the researchers, Random Forest (RF) and Deep Neural Networks (DNNs) have proven more adaptable and efficient for fitting variables to data, resulting in more reliable CLV prediction models. RF methods were superior to BTYD models[36–40]. During the 1980s, CLV became common in direct marketing research and business processes. Summaries of CLV studies classify prediction models as computational, customer base evaluation, and normative. Investigations utilising CLV to comprehend problems and how chief executives may influence them fall into the next group[41–45]. Conventional business marketing ignored client support in favour of short-term sales. CRM systems and techniques assist businesses in serving, fulfilling, motivating, and keeping customers[46–50]. Developing and maintaining trustworthy, profitable clients is its primary goal. Numerous companies in both manufacturing and service have employed CRM since it was first introduced. Multiple techniques have been designed for classifying current customers or consumer groups[51–55]. CRM involves databases, and their functionality and architecture determine advertising strategies. A database contains customer and target data obtained over time, impacting every advertising campaign[55–60]. Across goods, pricing, advertising, methods, new client acquisition, customer service, sales force, client relationship maintenance, and marketing research, CRM can be operational or strategic[61–65].

3. Proposed methodology
CLV is the total present value of future projected income for each customer in the literature-based model. Historically, CLV prediction requires estimating each active customer's renewal rate and projected revenue. This approach predicts future revenue flow in a specified time frame, including discount rates and net present value, which renders it beneficial for numerous uses. The researchers invented the RFM model in 1994 for calculating CLV. The dimensions are R (recency), F (frequency), and M (monetary value). R is the most recent time the customer purchased the good, F is the number of purchases, and M is the total amount purchased over the historical period. Lower R values imply recent purchases and warrant greater promotional attention, while high R values indicate declining engagement. Higher F indicates better client retention, value, and sales relations. M reflects a customer's overall spending over the period; larger purchases suggest less consumption of competing products or services, longer-term customers, and greater value. The K-means clustering method was employed in building this model. This technique minimises the distance between the points in a cluster and their mean as far as possible. Moreover, it is a centroid- and distance-based technique. Points are assigned to a cluster based on their distance from the selected centroids. In the K-means algorithm, each cluster is linked to its centroid. Figure 1 demonstrates the procedure.

Figure 1. Flow process of customer segmentation.

The datasets were imported from the UCI repository. The fields contain the following information: User_ID, bag add list of items, bag add specifics, click on the shopping bag icon, filter by, image selecting, click on the account information page, click on the advertisement label, add detail request, and so on.
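The segmentation flow described above (compute R, F, and M per customer, then cluster with K-means) can be sketched as follows; the transaction table, its column names, and the cluster count are illustrative assumptions, not the paper's actual schema.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative transaction log; column names are assumptions.
tx = pd.DataFrame({
    "Customer_ID": [1, 1, 2, 2, 2, 3],
    "Date_of_Invoice": pd.to_datetime(
        ["2023-01-05", "2023-03-10", "2023-02-01",
         "2023-02-20", "2023-03-15", "2023-01-25"]),
    "Amount": [120.0, 80.0, 40.0, 55.0, 60.0, 300.0],
})

snapshot = tx["Date_of_Invoice"].max() + pd.Timedelta(days=1)
rfm = tx.groupby("Customer_ID").agg(
    R=("Date_of_Invoice", lambda d: (snapshot - d.max()).days),  # recency in days
    F=("Date_of_Invoice", "count"),                              # frequency
    M=("Amount", "sum"),                                         # monetary total
)

# K-means minimises the distance of each point to its cluster centroid,
# as described above; features are standardised first.
X = StandardScaler().fit_transform(rfm)
rfm["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(rfm)
```

Standardising R, F, and M before clustering keeps the monetary column from dominating the distance computation.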
The RF and DT algorithms were trained on this problem's dataset (Figure 2).

Figure 2. Flow process of customer weakness.

Kaggle is the source of the raw data used for this model. This dataset is identical to the one described in the customer segmentation module. Figure 3 shows a flow diagram demonstrating how the CLV is estimated.

Figure 3. Flow process for CLV.

4. ARIMA for CLV
Applying an appropriate analytical model for CLV forecasting may drastically affect a business's capacity to make intelligent choices, manage resources, and boost CRM operations. The Autoregressive Integrated Moving Average (ARIMA) methodology is a typical CLV forecasting approach[66–70]. The ARIMA technique is accurate for forecasting and modelling time-series data, which renders it suitable for predicting and analysing buyer behaviour and lifetime value[71–75]. The ARIMA technique has been selected for CLV forecasting for multiple valid reasons. First, ARIMA exploits historical data, which is essential for CLV study because consumer relationships and purchases fluctuate over time. This approach can accommodate changes in buyer behaviour, fluctuations in demand, and cyclical patterns by identifying time-dependent relationships and habits in the data being analysed. ARIMA assists firms in forecasting CLV ratios with high accuracy[76–80]. This is necessary for intelligent choices and long-term strategy. With reliable CLV forecasts, businesses can make more effective resource allocation, recruitment, and advertising choices[81–85]. The ARIMA technique is used for time series prediction in economics, banking, and CRM. Applying ARIMA to forecast CLV and support CRM involves several steps. After receiving and pre-processing historical client purchase, relationship, and lifetime data to deal with incorrect values, anomalies, and conflicts, the data is aggregated into suitable periods.
Exploratory Data Analysis (EDA) applies mathematical methods for stationarity assessment and graphical techniques to find time series trends, patterns, and periodicity[86–90]. The variance ratio is evaluated, and ACF and PACF assessment is applied to identify the most appropriate autoregressive and moving average parameters when selecting the framework. Seasonal ARIMA is also considered. Leveraging analytical files, the ARIMA model is fitted to the collected CLV data. The residual analysis technique verifies the model integrity using MAE, MSE, and RMSE. Prediction time frames analyse forecasting risk and develop CLV forecasts[91–95]. CRM applications using ARIMA predictions enhance client selection, loyalty, and allocation of resources. Personalised advertising strategies, incentive programmes, and client segmentation schemes employing projected CLV values are also implemented.

ARIMA model: The CLV data model was measured after data were added to the application's database. The letter 'f', included in the parameters, can be utilised to recognise this data. Based on the findings of specific authors, the ARIMA model is an example of time-series analysis that depends on prediction methods.

Stationarity testing for the ARIMA model: The Dickey-Fuller test is a standard statistical approach to identify the stationarity of an autoregressive model. It analyses the null hypothesis that an autoregressive model has a unit root against stationarity or trend-stationarity. In 1979, researchers David Dickey and Wayne Fuller developed the test. It examines developments within sections with predictable swings and increasing associations. The OLS method of practical estimation requires stationary time series variables. First, study a variable's time-series graph to assess its behaviour. A unit root test may identify a random walk pattern in a time series. Assume the first-order autoregressive process, Equation (1):

y_t = p y_{t−1} + u_t,  t = 1, …, N  (1)

The subsequent hypothesis investigates whether the time series has a unit root, Equation (2):

H_0: p = 1;  H_1: p < 1  (2)

An autoregressive (AR) model relates a series' present value to its previous values. In an AR(p) model, a variable's future value is determined by a linear sum of its 'p' last data points, a random error term, and a fixed factor. The mathematical description of AR(p) is Equation (3):

P_t = c + Σ_{i=1}^{p} φ_i P_{t−i} + ε_t  (3)

Here P_t and ε_t are respectively the actual value and the random error at time 't', φ_i (i = 1, 2, …, p) are model parameters, and 'c' is a constant. The value of 'p' is identified as the order of the model. Historical errors impact present deviations from the mean in time series analysis. An AR model of order 'p' forecasts the next value by regressing on the series' past values. In contrast, a Moving Average (MA) model of order 'q' employs past series errors as explanatory variables. The mathematical illustration of the MA(q) model is Equation (4):

P_t = μ + Σ_{j=1}^{q} θ_j ε_{t−j} + ε_t  (4)

In Equation (4), 'μ' specifies the series mean, θ_j (for j = 1, 2, …, q) are the model's parameters, and 'q' specifies the model order. Moving average models are linear regressions of the present data against unpredictable shocks from previous periods. Stationarity, or consistent variation near a fixed level, is essential for Box-Jenkins time series modelling. A stationary process achieves statistical balance with a fixed probability distribution p(x_t) at all periods 't'. Equations (5) and (6) apply Regular Differencing (RD) to fix a nonstationary series:

∇P_t = (1 − BS) P_t = P_t − P_{t−1}  (5), (6)

The backward shift operator is symbolised as 'BS'. More than two regular differences are rarely required.

Deep neural network model
Given the input patterns, the deep NN adapts its settings to generate more precise results while minimising faults.
It statistically approximates the desired unknown function using neuron functions, with expansion weights corresponding to network weights[96–100]. Essential processing units linked by weighted connections form an ANN that may identify variable relationships. Four key factors shape the Deep NN model:
- Input parameters: the forecasting method uses exchange value, volume, and recent exchange as input parameters alongside CLV data. It selected 500 customers.
- Hidden layer count: the research uses two hidden layers.
- Activation functions: exponential and sigmoid tangent activation functions are employed in the hidden layers, with a linear transfer in the output layer.
- Training algorithms: due to their popularity, this study uses growth, Levenberg-Marquardt, gradient descent, step, delta-bar-delta, and conjugate gradient techniques.
Multilayer perceptron networks with error backpropagation algorithms are standard in this technology. The system has three layers of neurons, with 80% of the data used for training and 20% for testing. The neuron count was varied from 1 to 50 to create an efficient two-hidden-layer NN. Using company information, model outcomes were reported individually. The investigation demonstrated that a hyperbolic tangent transfer function in the first hidden layer, a sigmoid tangent transfer function in the second, a linear output transfer, and the Levenberg-Marquardt learning function generate the ideal state of the ANN; the number of neurons in the first and second hidden layers is then examined. A high Pearson correlation between R, F, and M indicates a significant relationship between the dependent variable and all independent variables. A significant correlation (R = 0.946; F = 0.91; M = 0.93) suggests that each can predict the dependent variable. This demonstrates a simple, powerful connection among the variables. The ARIMA model predicts CLV with a MAPE error of 38.4%.
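A minimal sketch of the two-hidden-layer network with an 80/20 split described above, using scikit-learn on synthetic stand-in data: MLPRegressor applies one activation to all hidden layers and offers no Levenberg-Marquardt solver, so tanh throughout and the Adam solver stand in for the paper's mixed transfer functions and LM training.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
# Synthetic stand-in inputs (e.g. exchange value, volume, recent exchange)
# for 500 customers, with a noisy target playing the role of CLV.
X = rng.random((500, 3))
y = 2.0 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.05, 500)

# 80/20 train/test split as in the paper; two hidden layers.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)
nn = MLPRegressor(hidden_layer_sizes=(20, 10), activation="tanh",
                  solver="adam", max_iter=2000, random_state=1)
nn.fit(X_tr, y_tr)

rmse = mean_squared_error(y_te, nn.predict(X_te)) ** 0.5
print(f"test RMSE: {rmse:.3f}")
```

The hidden-layer widths (20, 10) are placeholders for the 1-to-50 neuron sweep mentioned above.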
MAPE values suggest that the Deep NN model in the present investigation predicts CLV far more accurately than the ARIMA model.

5. Results and discussion
The dataset comprises several attributes providing detailed information about transactions. The 'Invoice_Number' attribute represents a six-digit integral number assigned to each transaction, with codes starting with 'c' indicating cancellations. 'Stock_Number' is a nominal attribute of five-digit integral numbers assigned uniquely to distinct products. The 'Product_Description' feature encompasses nominal values representing the labels of the products. 'Quantity' is a numeric value signifying the quantity of each product per transaction. 'Date of Invoice' is a numeric value recording the transaction date. 'Cost_Per_Unit' denotes the unit price of products in sterling (£) as a numeric value. 'Customer_ID' represents the customer number, a nominal five-digit primary identifier unique to each customer. Finally, 'Country' is a nominal value giving the name of the country where the customer resides. These attributes provide complete insight into transactional data and customer behaviour within the dataset. Analysing graphs provides several perspectives of the dataset. The CLV histogram in Figure 4 illustrates size, bill date, and costs, which indicate consumer buying behaviour, sales patterns, and pricing models. The data distribution in Figure 5 demonstrates the allocation of the data collection, assisting in recognising deviations, anomalies, and layouts. The item distribution in Figure 6 displays item popularity, guiding stock control and promotion. The analysis of MAE, MSE, and RMSE in Figure 7 measures the forecast performance of the models, which is crucial in selecting a highly accurate approach. These statistics present an in-depth view of the data, allowing businesses to reach smart marketing, advertising, forecasting, and stock control strategies.
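The error measures used throughout this comparison can be computed directly; the actual and predicted values below are illustrative stand-ins, not the paper's results.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def mse(y_true, y_pred):
    """Mean Squared Error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean((y_true - y_pred) ** 2))

def rmse(y_true, y_pred):
    """Root Mean Squared Error: square root of the MSE."""
    return mse(y_true, y_pred) ** 0.5

# Illustrative CLV values for three customers.
actual = [100.0, 250.0, 80.0]
predicted = [110.0, 240.0, 90.0]
print(mape(actual, predicted))  # ≈ 8.833 (percent)
print(mse(actual, predicted))   # 100.0
print(rmse(actual, predicted))  # 10.0
```

Note that MAPE is undefined when an actual value is zero, which matters for customers with no purchases in the evaluation window.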
The Deep Neural Network framework achieves MAPE, MSE, and RMSE errors of 10.3%, 11.6%, and 12.29%, respectively. These reflect acceptable cumulative error rates for the Deep Neural Network.

Figure 4. CLV histogram.
Figure 5. Data distribution. (a) number of items; (b) date of invoice; (c) cost_per_unit.
Figure 6. Item distribution.
Figure 7. Comparison of MAE, MSE, and RMSE.

6. Conclusion and future work
Companies forecast Customer Lifetime Value (CLV) and discover lucrative clients using mathematical models and Neural Networks (NN), fresh data collection techniques developed over the previous two decades. The precision of such predictions is unclear. This paper proposes a novel Deep Learning (DL) business-to-business CLV forecasting technique. Buyers are segmented by recency, frequency, and CLV. CLV forecasts the retention of clients. Business growth raises the number of clients through product marketing, statistical analysis, and review measurement. The data-reliant Autoregressive Integrated Moving Average (ARIMA) estimation model calculates the CLV and assists businesses in making recommendations. Innovative options and CRM improvements are made possible by precisely forecasting periodic relationships, changes, and intervals in CLV data. ARIMA deals with data migration and inadequate past data to enhance marketing purposes, client retention, and allocation of resources. ARIMA models enable businesses to forecast CLV, evaluate buyer behaviour, and increase income. Data-driven approaches attempt to boost CLV gradually. In summary, DL algorithms enhance versatility and forecast the cost of services. The Deep NN system has acceptable MAPE, MSE, and RMSE errors of 10.3%, 11.6%, and 12.29%. ARIMA's integration with cutting-edge algorithms such as ML and DL will be significant in future developments. Hybrid mathematical models enhance CLV forecasts.
Author contributions
Conceptualization, SS; methodology, SS; software, NSA and SLM; validation, PM and SS; formal analysis, SS and PD; investigation, SS and PD; resources, KNR; data curation, BRK; writing—original draft preparation, SS; writing—review and editing, SS and PD; visualization, PM; supervision, NSA; project administration, SLM; funding acquisition, NSA and KNR. All authors have read and agreed to the published version of the manuscript.

Conflict of interest
The authors declare no conflict of interest.

References
1. Luo Y. Decision making of customer retention based on customer identification. In: Proceedings of the 2011 Eighth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD).
2. Lestari PT, Fernando Y, Ikhsan RB, et al. A Survey on Electronic Dialogue, Risk Assessment, Customer Access, and Customers Relationship Lifetime. In: Proceedings of the 2022 International Conference on Information Management and Technology (ICIMTech).
3. Rahmadianti R, Dhini A, Laoh E. Estimating Customer Lifetime Value using LRFM Model in Pharmaceutical and Medical Device Distribution Company. In: Proceedings of the 2020 International Conference on ICT for Smart Society (ICISS).
4. Xue C. Modeling Customer Lifetime Value in Buyer-Seller Relationships. In: Proceedings of the 2009 International Conference on Information Management, Innovation Management and Industrial Engineering. Published online 2009. doi: 10.1109/iciii.2009.38
5. Myburg M, Berman S. Customer Lifetime Value Prediction with K-means Clustering and XGBoost. In: Proceedings of the 2022 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM).
6. Tsai TY, Lin CT, Prasad M. An Intelligent Customer Churn Prediction and Response Framework. In: Proceedings of the 2019 IEEE 14th International Conference on Intelligent Systems and Knowledge Engineering (ISKE).
7. Bosnjak Z, Grljevic O.
Credit users segmentation for improved customer relationship management in banking. In: Proceedings of the 2011 6th IEEE International Symposium on Applied Computational Intelligence and Informatics (SACI). Published online May 2011. doi: 10.1109/saci.2011.5873033
8. Fernando AMP, Adhikari AMTT, Wijesekara WHATK, et al. Planning Marketing Strategies in Small-Scale Business Using Data Analysis. In: Proceedings of the 2022 3rd International Informatics and Software Engineering Conference (IISEC).
9. Umair A, Alamgir Z. Product recommendation using typicality based collaborative filtering and churn analysis. In: Proceedings of the 2016 Sixth International Conference on Innovative Computing Technology (INTECH).
10. Fernando Y, Mergeresa F, Wahyuni-TD IS, et al. Business Based Smart Operations and Digital Supply Chain Performance. In: Proceedings of the 2021 3rd International Conference on Cybernetics and Intelligent System (ICORIS).
11. Alnuaim AA, Zakariah M, Shukla PK, et al. Human-Computer Interaction for Recognizing Speech Emotions Using Multilayer Perceptron Classifier. Journal of Healthcare Engineering. 2022.
12. Gupta A, Pawade P, Balakrishnan R. Deep Residual Network and Transfer Learning-based Person Re-Identification. Intelligent Systems with Applications. 2022; 16: 200137. doi: 10.1016/j.iswa.2022.200137
13. Lazar AJP, Sengan S, Cavaliere LPL, et al. Analysing the User Actions and Location for Identifying Online Scam in Internet Banking on Cloud. Wireless Personal Communications. 2021.
14. Naik A, Satapathy SC. A comparative study of social group optimization with a few recent optimization algorithms. Complex & Intelligent Systems. 2020; 7(1): 249-295. doi: 10.1007/s40747-020-00189-6
15. Naik A, Satapathy SC. Past present future: A new human-based algorithm for stochastic optimization. Soft Computing. 2021; 25(20): 12915-12976. doi: 10.1007/s00500-021-06229-8
16. Naik A, Satapathy SC, Abraham A.
Modified Social Group Optimization—a meta-heuristic algorithm to solve short-term hydrothermal scheduling. Applied Soft Computing. 2020; 95: 106524. doi: 10.1016/j.asoc.2020.106524
17. Sampath Dakshina Murthy A, Karthikeyan T, Vinoth Kanna R. Gait-based person fall prediction using deep learning approach. Soft Computing. 2021; 26(23): 12933-12941. doi: 10.1007/s00500-021-06125-1
18. Sujith AVLN, Qureshi NI, Dornadula VHR, et al. A Comparative Analysis of Business Machine Learning in Making Effective Financial Decisions Using Structural Equation Model (SEM). Journal of Food Quality. 2022; 2022: 1-7. doi: 10.1155/2022/6382839
19. Vijetha A, Shashirekha T, Raju K, et al. A robust fake currency detection model using ESVM machine learning technique. Journal of Advanced Research in Dynamical and Control Systems. 2020; 12(6): 170–179.
20. Hazarika BB, Gupta D. 1-Norm random vector functional link networks for classification problems. Complex & Intelligent Systems. 2022; 8(4): 3505-3521. doi: 10.1007/s40747-022-00668-y
21. Fernandes B, Mannepalli K. Speech Emotion Recognition Using Deep Learning LSTM for Tamil Language. Pertanika Journal of Science and Technology. 2021; 29(3). doi: 10.47836/pjst.29.3.33
22. Durga BK, Rajesh V. A ResNet deep learning-based facial recognition design for future multimedia applications. Computers and Electrical Engineering. 2022; 104.
23. Venkateswarlu B, Shenoi VV, Tumuluru P. CAViaR-WS-based HAN: Conditional autoregressive value at risk-water sailfish-based hierarchical attention network for emotion classification in COVID-19 text review data. Social Network Analysis and Mining. 2022; 12(1).
24. Ghuge CA, Chandra Prakash V, Ruikar SD. Weighed query-specific distance and hybrid NARX neural network for video object retrieval. Computer Journal. 2020; 63(7): 1738–1755.
25. Banchhor C, Srinivasu N. Analysis of Bayesian optimization algorithms for big data classification based on Map Reduce framework. Journal of Big Data. 2021; 8(1).
doi: 10.1186/s40537-021-00464-4 26. Banchhor C, Srinivasu N. FCNB: Fuzzy Correlative Naive Bayes Classifier with MapReduce Framework for Big Data Classification. Journal of Intelligent Systems. 2020; 29(1): 994–1006. 27. Banchhor C, Srinivasu N. Holoentropy based Correlative Naive Bayes classifier and MapReduce model for classifying the big data. Evolutionary Intelligence. 2019; 15(2): 1037-1050. doi: 10.1007/s12065-019-00276-9 28. Paul C, Bora P. Detecting Hate Speech using Deep Learning Techniques. International Journal of Advanced Computer Science and Applications. 2021; 12(2): 619–623. 29. Paul C, Sahoo D, Bora P. Aggression in social media: Detection using machine learning algorithms. International Journal of Scientific and Technology Research. 2020; 9(4): 114–117. 30. Murty CSVVSN, Varma G, Satyanarayana C. Content-Based Collaborative Filtering with Hierarchical Agglomerative Clustering Using User/Item based Ratings. Journal of Interconnection Networks. 2022; 22. 31. Srihari D, Kishore PVV, Kumar EK, et al. A four-stream ConvNet based on spatial and depth flow for human action classification using RGB-D data. Multimedia Tools and Applications. 2020; 79(17-18): 11723-11746. doi: 10.1007/s11042-019-08588-9 32. Rajesh Kumar E, Rama Rao KVSN, Nayak SR, Chandra R. Suicidal ideation prediction in Twitter data using machine learning techniques. Journal of Interdisciplinary Mathematics. 2020; 23(1): 117–125. 33. Ahmad F, Shahid M, Alam M, et al. Levelized Multiple Workflow Allocation Strategy Under Precedence Constraints with Task Merging in IaaS Cloud Environment. IEEE Access. 2022; 10: 92809–92827. 34. Rekha G, Krishna Reddy V, Kumar Tyagi A. A novel approach for solving skewed classification problem using cluster based ensemble method. Mathematical Foundations of Computing. 2020; 3(1): 1-9. doi: 10.3934/mfc.2020001 35. Priyatharsini GS, Babu AJ, Kiran MG, et al. Self secured model for cloud-based IoT systems. Measurement: Sensors. 2022; 24. 36. 
Sravya GS, Pradeepini G. Mobile SMS spam filter techniques using machine learning techniques. International Journal of Scientific and Technology Research. 2020; 9(3): 384–389.
37. Yenduri G, Rajakumar BR, Praghash K, Binu D. Heuristic-Assisted BERT for Twitter Sentiment Analysis. International Journal of Computational Intelligence and Applications. 2021; 20(3).
38. El-Wahed Khalifa HA, Kumar P, Smarandache F. On optimizing neutrosophic complex programming using lexicographic order. Neutrosophic Sets and Systems. 2020; 23: 330–343.
39. Yadla HK, Rao PVRDP. Machine learning based text classifier centered on TF-IDF vectoriser. International Journal of Scientific and Technology Research. 2020; 9(3): 583–586.
40. Pradeep IK, Bhaskar MJ, Satyanarayana B. Data science and deep learning applications in the e-commerce industry: A survey. Indian Journal of Computer Science and Engineering. 2020; 11(5): 497–509.
41. Mallika IL, Ratnam DV, Raman S, et al. A New Ionospheric Model for Single Frequency GNSS User Applications Using Klobuchar Model Driven by Auto Regressive Moving Average (SAKARMA) Method Over Indian Region. IEEE Access. 2020; 8: 54535-54553. doi: 10.1109/access.2020.2981365
42. Fernandes JB, Mannepalli K. Enhanced Deep Hierarchal GRU & BILSTM using Data Augmentation and Spatial Features for Tamil Emotional Speech Recognition. International Journal of Modern Education and Computer Science. 2022; 14(3): 45-63. doi: 10.5815/ijmecs.2022.03.03
43. Reddy JR, Pandian A, Reddy CR. An efficient learning based RFMFA technique for islanding detection scheme in distributed generation systems. Applied Soft Computing Journal. 2020; 96.
44. Saha J, Chowdhury C, Ghosh D, et al. A detailed human activity transition recognition framework for grossly labeled data from smartphone accelerometer. Multimedia Tools and Applications. 2020; 80(7): 9895-9916. doi: 10.1007/s11042-020-10046-w
45. Saha J, Ghosh D, Chowdhury C, et al.
Smart Handheld Based Human Activity Recognition Using Multiple Instance Multiple Label Learning. Wireless Personal Communications. 2020; 117(2): 923-943. doi: 10.1007/s11277-020-07903-0 46. Yadav J, Misra M, Rana NP, et al. Exploring the synergy between nano-influencers and sports community: behavior mapping through machine learning. Information Technology & People. 2021; 35(7): 1829-1854. doi: 10.1108/itp-03-2021-0219 47. Jammalamadaka K, Parveen N. Testing coverage criteria for optimized deep belief network with search and rescue. Journal of Big Data. 2021; 8(1). doi: 10.1186/s40537-021-00453-7 48. Mannepalli K, Sastry PN, Suman M. Emotion recognition in speech signals using optimization based multi-SVNN classifier. Journal of King Saud University - Computer and Information Sciences. 2022; 34(2): 384-397. doi: 10.1016/j.jksuci.2018.11.012 49. Naga Durga Saile K, Venkatramaphanikumar S, Venkata Krishna Kishore K, Bhattacharyya D. Review on the usage of deep learning models in multi-modal sentiment analysis. IEEE Transactions on Smart Processing and Computing. 2020; 9(6): 435–444. 50. Somase KP, Imambi SS. Develop and implement unsupervised learning through hybrid FFPA clustering in largescale datasets. Soft Computing. 2021; 25(1): 277–290. 51. Prasad KR, Reddy BE, Mohammed M. An effective assessment of cluster tendency through sampling based multiviewpoints visual method. Journal of Ambient Intelligence and Humanized Computing. 2021. 52. Thirugnanasambandam K, Rajeswari M, Bhattacharyya D, et al. Directed Artificial Bee Colony algorithm with revamped search strategy to solve global numerical optimization problems. Automated Software Engineering. 2022; 29(1). doi: 10.1007/s10515-021-00306-w 53. Khan MA, Khan GA, Khan J, et al. Multi-View Clustering Based on Multiple Manifold Regularized NonNegative Sparse Matrix Factorization. IEEE Access. 2022; 10: 113249-113259. doi: 10.1109/access.2022.3216705 54. Harish M, Srinivasa Rao S, Nageswara Rao B, et al. 
Specific optimal AWJM process parameters for Ti-6Al-4V alloy employing the modified Taguchi approach. Journal of Mathematical and Computational Science. 2021; 11(1): 292–311. 55. Pathak MK, Srinivasu N, Bairagi V. Support value-based fusion matching using iris and sclera features for person authentication in unconstrained environment. Journal of Engineering Science and Technology. 2020; 15(4): 2595– 2609. 56. Thota MK, Shajin FH, Rajesh P. Survey on software defect prediction techniques. International Journal of Applied Science and Engineering. 2020; 17(4): 331–344. 57. Mohammed M, Kolapalli R, Golla N, Maturi SS. Prediction of rainfall using machine learning techniques. International Journal of Scientific and Technology Research. 2020; 9(1): 3236–3240. 58. Navya Pratyusha M, Rajyalakshmi K, Apparao BV, Charankumar G. Impact of sleep on usage of the smartphone at the bedtime–A case study. Mathematics and Statistics. 2021; 9(1): 31–35. 59. Kumar MR, Pooja K, Udathu M, et al. Detection of Depression Using Machine Learning Algorithms. International Journal of Online and Biomedical Engineering. 2022; 18(4): 155–163. 60. Kantipudi MVVP, Kumar S, Jha AK. Scene text recognition based on bidirectional LSTM and deep neural network. Computational Intelligence and Neuroscience. 2021. 61. Sri Harsha NC, Anudeep YGVS, Vikash K, et al. Performance Analysis of Machine Learning Algorithms for Smartphone-Based Human Activity Recognition. Wireless Personal Communications. 2021; 121(1): 381-398. doi: 10.1007/s11277-021-08641-7 62. Sakhare NN, Sagar Imambi S. Technical Analysis Based Prediction of Stock Market Trading Strategies Using Deep Learning and Machine Learning Algorithms. International Journal of Intelligent Systems and Applications in Engineering. 2022; 10(3): 411–422. 10 63. Sakhare NN, Imambi SS, Kagad S, et al. Stock market prediction using sentiment analysis. International Journal of Advanced Science and Technology. 2020; 29(4): 1126–1133. 64. 
Tamiloli N, Venkatesan J, SampathKumar T. ANFIS based forecast and parametric investigation during processing activity of AA6082T6. Materials and Manufacturing Processes. 2021; 37(1): 99-112. doi: 10.1080/10426914.2021.1945093 65. Kimmatkar NV, Babu BV. Novel Approach for Emotion Detection and Stabilizing Mental State by Using Machine Learning Techniques. Computers. 2021; 10(3): 37. doi: 10.3390/computers10030037 66. Hema P, Sathish E, Maheswari M, et al. Robust soft sensor systems for industry: Evaluated through real-time case study. Measurement: Sensors. 2022; 24: 100542. doi: 10.1016/j.measen.2022.100542 67. Verma PK, Agrawal P, Madaan V, et al. UCred: fusion of machine learning and deep learning methods for user credibility on social media. Social Network Analysis and Mining. 2022; 12(1). doi: 10.1007/s13278-022-00880-1 68. Mishra P, Srinivas PVVS. Facial emotion recognition using deep convolutional neural network and smoothing, mixture filters applied during preprocessing stage. IAES International Journal of Artificial Intelligence (IJ-AI). 2021; 10(4): 889. doi: 10.11591/ijai.v10.i4.pp889-900 69. Patro P. Neuro Fuzzy System with Hybrid Ant Colony Particle Swarm Optimization (HASO) and Robust Activation. Journal of Advanced Research in Dynamical and Control Systems. 2020; 12(SP3): 741-750. doi: 10.5373/jardcs/v12sp3/20201312 70. Syamala Rao P, Parthasaradhi Varma G, Durga Prasad Ch. Financial time series forecasting using optimized multistage wavelet regression approach. International Journal of Information Technology. 2022; 14(4): 2231-2240. doi: 10.1007/s41870-022-00924-x 71. Srinivas PVVS, Mishra P. A novel framework for facial emotion recognition with noisy and de noisy techniques applied in data pre-processing. International Journal of System Assurance Engineering and Management. Published online July 19, 2022. doi: 10.1007/s13198-022-01737-8 72. Srinivas PVVS, Mishra P. 
An Improvised Facial Emotion Recognition System using the Optimized Convolutional Neural Network Model with Dropout. International Journal of Advanced Computer Science and Applications. 2021; 12(7). doi: 10.14569/ijacsa.2021.0120743 73. Gillala R, Vuyyuru KR, Jatoth C, et al. An efficient chaotic salp swarm optimization approach based on ensemble algorithm for class imbalance problems. Soft Computing. 2021; 25(23): 14955-14965. doi: 10.1007/s00500-02106080-x 74. Kumar R, Edalatpanah S, Mohapatra H. Note on “Optimal path selection approach for fuzzy reliable shortest path problem.” Journal of Intelligent & Fuzzy Systems. 2020; 39(5): 7653-7656. doi: 10.3233/jifs-200923 75. Setiawan R, Ponnam VS, Sengan S, et al. Certain Investigation of Fake News Detection from Facebook and Twitter Using Artificial Intelligence Approach. Wireless Personal Communications. 2021; 127(2): 1737-1762. doi: 10.1007/s11277-021-08720-9 76. Devi SA, Siva S. A Hybrid Document Features Extraction with Clustering based Classification Framework on Large Document Sets. International Journal of Advanced Computer Science and Applications. 2020; 11(7). doi: 10.14569/ijacsa.2020.0110748 77. Yasin SA, Prasada Rao PVRD. Enhanced CRNN-Based Optimal Web Page Classification and Improved Tunicate Swarm Algorithm-Based Re-Ranking. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems. 2022; 30(05): 813-846. doi: 10.1142/s0218488522500246 78. Depuru S, Nandam A, Ramesh PA, et al. Human Emotion Recognition System Using Deep Learning Technique. Journal of Pharmaceutical Negative Results. 2022; 13(4): 1031–1035. 79. Deshmukh S, Thirupathi Rao K, Shabaz M. Collaborative Learning Based Straggler Prevention in Large-Scale Distributed Computing Framework. Security and Communication Networks. 2021; 2021: 1-9. doi: 10.1155/2021/8340925 80. Ahammad SH, Kale SD, Upadhye GD, et al. Phishing URL detection using machine learning methods. Advances in Engineering Software. 2022; 173: 103288. 
doi: 10.1016/j.advengsoft.2022.103288 81. Immareddy S, Sundaramoorthy A. A survey paper on design and implementation of multipliers for digital system applications. Artificial Intelligence Review. 2022; 55(6): 4575-4603. doi: 10.1007/s10462-021-10113-0 82. Bharti SK, Varadhaganapathy S, Gupta RK, et al. Text-Based Emotion Recognition Using Deep Learning Approach. Kumar V, ed. Computational Intelligence and Neuroscience. 2022; 2022: 1-8. doi: 10.1155/2022/2645381 83. Kokate S, Chetty MSR. Credit Risk Assessment of Loan Defaulters in Commercial Banks Using Voting Classifier Ensemble Learner Machine Learning Model. International Journal of Safety and Security Engineering. 2021; 11(5): 565-572. doi: 10.18280/ijsse.110508 84. Meeravali Sk, Bhattacharyya D, Rao NT, et al. Performance analysis of an improved forked communication network model. Connection Science. 2021; 33(3): 645-673. doi: 10.1080/09540091.2020.1867064 85. Praveen SP, Murali Krishna TB, Anuradha CH, et al. A robust framework for handling health care information based on machine learning and big data engineering techniques. International Journal of Healthcare Management. Published online December 15, 2022: 1-18. doi: 10.1080/20479700.2022.2157071 11 86. Nayak SR, Sivakumar S, Bhoi AK, et al. RETRACTED: Mixed-mode database miner classifier: Parallel computation of graphical processing unit mining. International Journal of Electrical Engineering & Education. 2021; 60(1_suppl): 2274-2299. doi: 10.1177/0020720920988494 87. Rajasoundaran S, Prabu AV, Routray S, et al. Machine learning based deep job exploration and secure transactions in virtual private cloud systems. Computers & Security. 2021; 109: 102379. doi: 10.1016/j.cose.2021.102379 88. Rani S, Lakhwani K, Kumar S. Three dimensional objects recognition & pattern recognition technique; related challenges: A review. Multimedia Tools and Applications. 2022; 81(12): 17303-17346. doi: 10.1007/s11042-02212412-2 89. Roy S, Patel B, Bhattacharyya D, et al. 
Demographical gender prediction of Twitter users using big data analytics: an application of decision marketing. International Journal of Reasoning-based Intelligent Systems. 2021; 13(2): 41. doi: 10.1504/ijris.2021.114629 90. Sekar S., Solayappan A, Srimathi J., et al. Autonomous Transaction Model for E-Commerce Management Using Blockchain Technology. International Journal of Information Technology and Web Engineering. 2022; 17(1): 114. doi: 10.4018/ijitwe.304047 91. Sengan S, Vidya Sagar P, Ramesh R, et al. The optimization of reconfigured real-time datasets for improving classification performance of machine learning algorithms. Mathematics in Engineering, Science and Aerospace. 2021; 12(1): 43–54. 92. Shakeela S, Shankar NS, Reddy PM, et al. Optimal Ensemble Learning Based on Distinctive Feature Selection by Univariate ANOVA-F Statistics for IDS. International Journal of Electronics and Telecommunications. Published online December 1, 2020: 267-275. doi: 10.24425/ijet.2021.135975 93. Tatale S, Chandra Prakash V. Combinatorial test case generation from sequence diagram using optimization algorithms. International Journal of System Assurance Engineering and Management. 2022; 13(S1): 642-657. doi: 10.1007/s13198-021-01579-w 94. Chakravorti T, Satyanarayana P. Non linear system identification using kernel based exponentially extended random vector functional link network. Applied Soft Computing. 2020; 89: 106117. doi: 10.1016/j.asoc.2020.106117 95. Ganesan V, Sobhana M, Anuradha G, et al. Quantum inspired meta-heuristic approach for optimization of genetic algorithm. Computers & Electrical Engineering. 2021; 94: 107356. doi: 10.1016/j.compeleceng.2021.107356 96. Narasamma VL, Sreedevi M. Twitter based Data Analysis in Natural Language Processing using a Novel Catboost Recurrent Neural Framework. International Journal of Advanced Computer Science and Applications. 2021; 12(5). doi: 10.14569/ijacsa.2021.0120555 97. Mahalakshmi V, Kulkarni N, Pradeep Kumar KV, et al. 
The Role of implementing Artificial Intelligence and Machine Learning Technologies in the financial services Industry for creating Competitive Intelligence. Materials Today: Proceedings. 2022; 56: 2252-2255. doi: 10.1016/j.matpr.2021.11.577 98. Mannem V, Kuchibhotla S. Deep learning methodology for recognition of emotions using acoustic features. International Journal of Pharmaceutical Research. 2020; 12(4): 3386–3389. 99. Talasila V, M R N, V MM. Optimized GAN for Text-to-Image Synthesis: Hybrid Whale Optimization Algorithm and Dragonfly Algorithm. Advances in Engineering Software. 2022; 173: 103222. doi: 10.1016/j.advengsoft.2022.103222 100. Pawan YVRN, Prakash KB, Chowdhury S, et al. Particle swarm optimization performance improvement using deep learning techniques. Multimedia Tools and Applications. 2022; 81(19): 27949-27968. doi: 10.1007/s11042022-12966-1 12