Analysis of Different Supervised Machine Learning Methods for Accelerometer-Based Alcohol Consumption Detection from Physical Activity
Volume 7, Issue 4, Page No 147–154, 2022
Adv. Sci. Technol. Eng. Syst. J. 7(4), 147–154 (2022);
DOI: 10.25046/aj070419
Keywords: Artificial Intelligence, Biomedical Engineering, Data Mining, Information Technology, Machine Learning, Systems Engineering, Mobile Computing, Embedded Systems
This paper builds on the realization that since mobile devices have become a common tool for researchers to collect, process, and analyze large quantities of data, we are now entering a generation where solutions to difficult real-world problems will mostly come in the form of mobile device apps. One such relevant real-life problem is to accurately and cheaply detect the over-consumption of alcohol, since it can lead to many problems including fatalities. Today, there are several expensive and/or tedious procedures in the market that are used to test subjects’ Blood Alcohol Content (BAC). This paper explores a cheaper and more effective alternative: classifying whether subjects have consumed too much alcohol by using accelerometer data from the subjects’ mobile devices while they perform physical activity. In order to create the most accurate classification system, we conduct experiments with five different supervised machine learning methods, applying them to two features derived from the accelerometer data of two male subjects. We then share our experiment results, which support why “Decision Tree Learning” is the supervised machine learning method best suited for our mobile device sobriety classification system.
1. Introduction
This paper is an extension of work originally presented in the 2021 17th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob) [1].
1.1. Problem and Motivation
Alcohol misuse and abuse are responsible for great personal and economic harm in the United States (US) and around the world. More than 88,000 people die from alcohol-related issues each year in the US, making it the country's third leading preventable cause of death [2]. Excessive drinking has been proven to damage the heart, liver, pancreas, and immune system [3]. In addition to the detrimental health effects, alcohol misuse cost the US $223.5B in economic loss in 2006 alone [2].
The consumption of alcohol negatively affects individuals’ brains and their central nervous systems. These effects only become worse with larger alcohol concentrations in the individuals’ blood. More specifically, judgment, reaction time, balance, and psycho-motor performance start becoming compromised above a BAC of 0.02 – 0.05 [2]. These abilities are necessary to operate vehicles, so it is not surprising that almost 50% of traffic fatalities involve the (mis)use of alcohol.
Currently, there are a few different methods to test intoxication levels. These tests can be administered by drawing blood, monitoring breath (breathalyzer), or collecting urine, saliva, or short strands of hair. All of these methods try to directly measure alcohol's presence in individuals' bodies. In contrast, field sobriety tests performed by law enforcement officials typically use physical tasks to gauge individuals' levels of impairment, such as walking backwards in a straight line or maintaining a steady posture while touching one's nose with arms stretched out. The advantage of such physical tests is that they are convenient and cheap, although they are considered subjective since they depend on the officers' observation of the individuals. Devices like the breathalyzer, on the other hand, are fairly expensive, with costs typically ranging from $3,000 – $5,000 per unit, and require frequent calibration, comprehensive maintenance, and expensive repairs.
As a result, we believe that mobile device-based systems can give individuals the best of both worlds by providing low-cost portable devices that also measure individuals' sobriety levels objectively. The advantage of mobile computing comes from the fact that mobile devices can sense real-world data and respond to trends in the data and/or their surrounding environments. If mobile devices can signal to individuals that their physical responses indicate intoxication, then fatalities can be mitigated across the world.
1.2. Proposed Solution
As seen in Figure 1, we propose a multi-stage detection system that makes use of mobile devices’ accelerometers to capture real-time data from individuals. This data is then processed and fed into a data classifier, which can internally determine whether the individuals are intoxicated by comparing their data to an existing classification model that is based on historical data.

Figure 1: Block diagram of the system used for the experiments
In this paper, we take the first step towards implementing such a system by proposing robust accelerometer data features that can distinguish between sober and intoxicated individuals. We then test these features using five Supervised Machine Learning models to see their accuracy in predicting individuals’ sobriety levels: Support Vector Machines (SVM), Decision Tree Learning, Boosting, K-Nearest Neighbors (KNN), and Neural Networks (NN) [4].
2. Related Works
Several researchers have focused their prior research on movement-pattern recognition using accelerometers, especially since nearly all mobile devices today are embedded with highly accurate and precise accelerometers [5]. Many interesting mobile applications have been developed that can detect individuals' activities, such as daily exercises or crossing the street, solely by analyzing accelerometer data.
There have also been several papers that have proposed different variations of mobile systems to detect individuals’ intoxication levels:
- Detecting abnormalities in individuals’ gaits while they walk intoxicated [6–9].
- Evaluating eye (iris) movements of individuals who are intoxicated [10,11].
- Monitoring the steadiness of postures of intoxicated individuals [12].
These mobile systems can sense individuals' intoxication levels and log the location/time of the incidents. Although these papers show individual differences in step-time variance, the differences are relative to everyone's unique baseline, so they cannot be used to cleanly separate intoxicated and sober individuals in the general population.
3. Methodology
3.1. Physiological Basis
One of the first symptoms that individuals exhibit as they become intoxicated is decreased balance and motor coordination. This is because alcohol alters the brain's chemistry by changing neurotransmitter levels. Neurotransmitters act as chemical messengers, sending critical signals throughout the body, including those that control thought processes, behaviors, and emotions. It is widely believed that out of all the neurotransmitters, alcohol specifically targets the GABA neurotransmitter [13].
3.2. Physical Activity Data Collection
With these factors in mind, we planned our experiments such that we could easily distinguish between intoxicated and sober subjects based on the subjects’ abilities to balance their bodies and maintain steady postures. Our proposed mobile-based system logs subjects’ accelerations along the x-, y-, and z-axes.
In order to collect the subjects' data, we created an Android app that runs on a Motorola Moto G mobile phone. Android provides a built-in API that outputs the device's linear acceleration after negating the effect of gravity. Our Android app has a built-in button, which is used to start and stop data collection periods during our experiments.
For our experiments, we recruited two subjects, Subject 1 and Subject 2, to grip the mobile devices in their right hands, keep their right arms outstretched, and maintain those steady postures for 10 seconds. The subjects' right arms form 90-degree angles with their bodies while their left arms are kept by their sides. Additionally, their right feet are placed in front of their left feet in straight lines, so that their left toes touch their right heels. During the experiments, the subjects keep their eyes closed to better test their balance and motor skills.
We first tested both subjects in sober states, before they consumed any alcohol. After that, both subjects consumed 3, 6, and 9 drinks of alcohol over a period of 120 minutes, while we recorded their data. To keep results consistent, we defined one drink in this paper to be 1.25 oz of 80-proof liquor (i.e., vodka).
4. Results
After we finished our experiments with both subjects, we had four data points per subject, which came out to eight total data points for our initial analysis. For each of the data points, we plotted the subjects' accelerations along the x-, y-, and z-axes against time, as can be seen below in Figures 2 – 9.

Figure 2: Accelerometer reading when Subject 1 has had 0 drinks

Figure 3: Accelerometer reading when Subject 1 has had 3 drinks

Figure 4: Accelerometer reading when Subject 1 has had 6 drinks

Figure 5: Accelerometer reading when Subject 1 has had 9 drinks

Figure 6: Accelerometer reading when Subject 2 has had 0 drinks

Figure 7: Accelerometer reading when Subject 2 has had 3 drinks

Figure 8: Accelerometer reading when Subject 2 has had 6 drinks

Figure 9: Accelerometer reading when Subject 2 has had 9 drinks
5. Supervised Learning Model Results
Based on the results of our experiments, we clearly see a distinction between the sober data (0 – 3 drinks) and intoxicated data (6 – 9 drinks). Additionally, there is a relationship between the number of drinks the subjects consumed and the “unsteadiness” of their corresponding data. In order to capture this change, we examined two features: variance and highest frequency.
Table 1: Tabulation of variance, highest frequency, and BAC for Subject 1
| Number of Drinks | Estimated BAC | Variance | Highest Frequency |
| 0 | 0 | 0.0382 | 0.1227 |
| 3 | 0.06 | 0.06059 | 0.11793 |
| 6 | 0.19 | 1.749063 | 2.860169 |
| 9 | 0.28 | 9.80903 | 1.5984 |
Table 2: Tabulation of variance, highest frequency, and BAC for Subject 2
| Number of Drinks | Estimated BAC | Variance | Highest Frequency |
| 0 | 0 | 0.0556 | 0.1092 |
| 3 | 0.06 | 0.06059 | 0.11793 |
| 6 | 0.19 | 0.214869 | 2.216312 |
| 9 | 0.28 | 12.0609 | 0.92764 |
The first distinguishing feature we found was the variance in the amplitude (as can be seen in Table 1). We calculated it for each axis and then selected the maximum variance from all three axes. Since we noticed that the variance shifted between axes depending on how the subjects held the phones, taking the maximum made the feature more stable.
The second distinguishing feature we found was the highest frequency component of the time-series data (as can be seen in Table 2). However, this relationship was not as easy to detect as the previous correlation between BAC and variance, even though the highest frequency for intoxicated data was still higher than that for sober data.
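Both features can be computed directly from the raw samples. The sketch below is our NumPy illustration of the approach described above, not the paper's actual code; the 50 Hz sampling rate is an assumption, and we interpret the "highest frequency" feature as the frequency bin with the largest spectral magnitude.

```python
import numpy as np

def extract_features(accel, sample_rate_hz=50.0):
    """Compute the two features from an (N, 3) array of x/y/z samples."""
    accel = np.asarray(accel, dtype=float)

    # Feature 1: variance per axis, keeping the maximum across the three
    # axes so the feature stays stable however the phone is held.
    per_axis_var = np.var(accel, axis=0)
    variance = per_axis_var.max()

    # Feature 2: dominant frequency component, taken from the axis with
    # the largest variance (mean-subtracted to suppress the DC bin).
    signal = accel[:, np.argmax(per_axis_var)]
    signal = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    highest_freq = freqs[np.argmax(spectrum[1:]) + 1]  # skip bin 0

    return variance, highest_freq
```

A steady posture yields a small variance, while hand tremor raises both the variance and the frequency estimate, matching the trend in Tables 1 and 2.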

Figure 10: Breakdown of data in terms of datapoints across both subjects

Figure 11: Density plots of the raw data for both subjects
Both the variance analysis and the highest frequency analysis were performed on Subject 1's data and Subject 2's data. Figures 10 – 11 above show the increasing levels of variance and highest frequency that we use to estimate the BAC of Subjects 1 and 2. The BAC is calculated using the number of drinks consumed and the body mass of each subject.
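The paper does not state which formula converts drink count and body mass into an estimated BAC. A common choice is the Widmark equation, sketched below; the distribution ratio `r`, the elimination rate `beta`, and the grams of ethanol per drink are all textbook-style assumptions, not values taken from the paper.

```python
def estimate_bac(num_drinks, body_mass_kg, hours=2.0,
                 grams_per_drink=14.0, r=0.68, beta=0.015):
    """Widmark-style BAC estimate in g/dL (the usual "%" figure).

    r is the Widmark distribution ratio (~0.68 for men) and beta is the
    hourly elimination rate; both are assumed defaults, not paper values.
    """
    alcohol_grams = num_drinks * grams_per_drink
    peak_bac = alcohol_grams / (body_mass_kg * 1000.0 * r) * 100.0
    return max(peak_bac - beta * hours, 0.0)
```

Whatever the exact constants, the estimate increases monotonically with drink count, which is all the sober/intoxicated labels in this paper rely on.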
Based on these results, we tried the following Supervised Machine Learning methods to determine which methods were most effective at distinguishing between sober and intoxicated subjects, where we used Subject 1’s data as the training dataset and Subject 2’s data as the test dataset.
5.1. SVM
For the SVM implementation, we used the Python class “sklearn.svm.SVC” [14]. We used two different SVM models to see their different effects on the datasets (as shown in Table 3). First, we used the original SVM model with the values “random_state=None, kernel=poly”. Then, to see the effect of hyperparameter tuning on the original SVM model, we updated the parameters to “random_state=0, kernel=rbf” [15,16].
Table 3: SVM model statistics
| SVM Type | Accuracy (%) | Execution Time (s) |
| Default SVM | 100 | 0.18 |
| Adjusted SVM | 67 | 0.19 |
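The two configurations can be reproduced with a short script. The feature rows below are borrowed from Tables 1 and 2 purely for illustration; with only four points per subject, the resulting accuracies will not necessarily match Table 3.

```python
import time
import numpy as np
from sklearn.svm import SVC

# Rows are [variance, highest frequency]; labels 0 = sober, 1 = intoxicated.
X_train = np.array([[0.0382, 0.1227], [0.0606, 0.1179],
                    [1.7491, 2.8602], [9.8090, 1.5984]])  # Subject 1
y_train = np.array([0, 0, 1, 1])
X_test = np.array([[0.0556, 0.1092], [0.0606, 0.1179],
                   [0.2149, 2.2163], [12.0609, 0.9276]])  # Subject 2
y_test = np.array([0, 0, 1, 1])

for name, params in [("Default SVM", {"kernel": "poly", "random_state": None}),
                     ("Adjusted SVM", {"kernel": "rbf", "random_state": 0})]:
    start = time.time()
    model = SVC(**params).fit(X_train, y_train)
    accuracy = model.score(X_test, y_test) * 100.0
    print(f"{name}: {accuracy:.0f}% in {time.time() - start:.2f}s")
```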
5.2. Decision Tree Learning
For the decision tree implementation, we used the Python class “sklearn.tree.DecisionTreeClassifier” [17]. We used two different decision trees to see their different effects on the datasets (as shown in Table 4). First, we used the regular decision tree with the default values of “gini” for the impurity criterion and “None” for “max_depth”. Then, to see the effect of pruning the trees, we tested two different parameters: setting “entropy” for information gain and setting “max_depth” to “3” [18].
Table 4: Decision Tree Learning model statistics
| Decision Tree Type | Accuracy (%) | Execution Time (s) |
| Default Decision Tree | 67 | 0.71 |
| Pruned Decision Tree | 100 | 0.81 |
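The default and pruned trees differ only in their constructor arguments. The sketch below uses Subject 1's feature rows from Table 1 as an illustrative stand-in training set:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# [variance, highest frequency] rows from Table 1; 0 = sober, 1 = intoxicated.
X_train = np.array([[0.0382, 0.1227], [0.0606, 0.1179],
                    [1.7491, 2.8602], [9.8090, 1.5984]])
y_train = np.array([0, 0, 1, 1])

# Default tree: Gini impurity, unbounded depth (free to overfit).
default_tree = DecisionTreeClassifier(criterion="gini", max_depth=None,
                                      random_state=0).fit(X_train, y_train)

# "Pruned" tree: information gain (entropy) with depth capped at 3.
pruned_tree = DecisionTreeClassifier(criterion="entropy", max_depth=3,
                                     random_state=0).fit(X_train, y_train)

print(default_tree.get_depth(), pruned_tree.get_depth())
```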
5.3. Boosting
For the boosting implementation, we used the Python class “sklearn.ensemble.GradientBoostingClassifier” [19]. We used two different boosting classifiers to see their different effects on the datasets (as shown in Table 5). First, we used regular gradient boosting with the default values of “n_estimators=100, learning_rate=0.1, max_depth=3” [20]. Then, to see the effect of hyperparameter tuning on the boosted model, we updated the parameters to “n_estimators=1000, learning_rate=1.0, max_depth=1” [16,18].
Table 5: Gradient Boosting model statistics
| Gradient Boosting Type | Accuracy (%) | Execution Time (s) |
| Default Gradient Boosting | 67 | 1.13 |
| Adjusted Gradient Boosting | 67 | 1.12 |
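The two boosting configurations translate directly into constructor arguments; as before, the four Table 1 rows serve only as an illustrative training set, not the paper's actual data pipeline.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# [variance, highest frequency] rows from Table 1; 0 = sober, 1 = intoxicated.
X_train = np.array([[0.0382, 0.1227], [0.0606, 0.1179],
                    [1.7491, 2.8602], [9.8090, 1.5984]])
y_train = np.array([0, 0, 1, 1])

# Default: 100 depth-3 trees with a small learning rate.
default_gb = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                        max_depth=3, random_state=0)
# Adjusted: ten times the stages, decision stumps, full-strength updates.
adjusted_gb = GradientBoostingClassifier(n_estimators=1000, learning_rate=1.0,
                                         max_depth=1, random_state=0)

for model in (default_gb, adjusted_gb):
    model.fit(X_train, y_train)
    print(model.score(X_train, y_train))
```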
5.4. KNN
For the KNN implementation, we used the Python class “sklearn.neighbors.KNeighborsClassifier” [21]. We used two different KNN models to see their different effects on the datasets (as shown in Table 6). We created a loop to test the models on our dataset by performing hyperparameter tuning and varying the “n_neighbors” value from 2 through 8 [16,22].
Table 6: KNN model statistics
| KNN Type | Accuracy (%) | Execution Time (s) |
| Default KNN | 75 | 0.54 |
| Adjusted KNN | 80 | 0.53 |
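The sweep over “n_neighbors” can be written as a simple loop. KNN needs at least `n_neighbors` training samples, so this sketch uses a larger synthetic dataset shaped like the paper's two features rather than the actual handful of points:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Synthetic [variance, highest frequency] clusters: low-variance "sober"
# points versus high-variance "intoxicated" points (illustrative only).
rng = np.random.default_rng(0)
sober = rng.normal([0.05, 0.12], [0.02, 0.03], (20, 2))
drunk = rng.normal([6.0, 2.0], [2.0, 0.5], (20, 2))
X = np.vstack([sober, drunk])
y = np.array([0] * 20 + [1] * 20)

# Sweep k = 2 .. 8 and keep the best score (on the training data, for brevity).
best_k, best_acc = None, -1.0
for k in range(2, 9):
    acc = KNeighborsClassifier(n_neighbors=k).fit(X, y).score(X, y)
    if acc > best_acc:
        best_k, best_acc = k, acc
print(best_k, best_acc)
```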
5.5. Neural Networks
For the neural network implementation, we used the Python class “keras.Sequential” [22]. We used two different models to see their different effects on the datasets (as shown in Table 7). First, we used a regular Keras model with two layers and 1000 epochs for the number of iterations across the data. Then, to see the effect of an additional layer in our model, we tested the same Keras model with a third layer and the same “epochs” value of “1000” [13,23].
Table 7: NN model statistics
| NN Type | Accuracy (%) | Epoch |
| Default NN | 100 | 1000 |
| Adjusted NN | 67 | 1000 |
6. Analysis
6.1. SVM
This algorithm gave us some of the better results on both subjects' datasets, as can be seen in Figure 12 below. This is potentially because it is a good general-purpose classification algorithm, especially since we used the “rbf” kernel, which is known for its general and widespread use across many types of datasets [24].

Figure 12: Confusion Matrix for SVM model
The main goal of this algorithm is to divide datasets into classes in order to find a maximum marginal hyperplane (MMH). This is done in two steps: first, the Support Vector Machine iteratively generates hyperplanes that separate the classes; it then chooses the hyperplane that segregates the classes correctly with the maximum margin [25].
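With a linear kernel, the resulting maximum-margin hyperplane is easy to inspect, since scikit-learn exposes both the support vectors and the hyperplane coefficients. The toy points below are ours, not the paper's data:

```python
import numpy as np
from sklearn.svm import SVC

# Two well-separated toy classes in 2-D.
X = np.array([[0.0, 0.0], [0.1, 0.1], [0.2, 0.0],
              [2.0, 2.0], [2.1, 1.9], [1.9, 2.1]])
y = np.array([0, 0, 0, 1, 1, 1])

model = SVC(kernel="linear").fit(X, y)
print(model.support_vectors_)          # the points that pin down the margin
print(model.coef_, model.intercept_)   # w and b of the hyperplane w.x + b = 0
```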
We decided to perform hyperparameter tuning on our SVM and test between the rbf and polynomial kernels. We tested these because the polynomial kernel is typically considered more generalized and therefore less efficient and accurate, while the rbf kernel is considered one of the most preferred kernel functions in SVM [24].
Based on the average runtime for our runs, we got 0.18 seconds per run as per wall clock time.
Unlike with the other algorithms, we did not make many modifications. We only tested different kernels, since we wanted a kernel with good generalization to avoid overfitting/underfitting on the long-tailed variables. We did not show the results for the “poly” kernel because the “rbf” kernel performed better on the non-uniform datasets.
6.2. Decision Tree Learning
We generally got better training results with decision trees, as can be seen in Figures 13 – 14 below. This might be due to our pruning methods, or it could be because the bi-directional tails in some of the attributes in the dataset rendered the algorithm less effective.

Figure 13. Node breakdown of Decision Tree model
This represents a flowchart-like tree structure, in which each internal node signifies a test on an attribute, each branch represents an outcome of the test, and each leaf node (terminal node) holds a class label [26]. This algorithm performs best on non-linear datasets, which makes it a good choice (in theory) for our non-linear datasets.
We decided to prune our trees using “criterion=entropy, max_depth=3”. We chose entropy because there are features with “uncertainty”, as their values are clustered very close to each other [23]. Additionally, we did not want to overfit the model on the training data, so we limited the max_depth to 3 [23].
Based on the average runtime for our runs, we got 1.12 seconds per run as per wall clock time.
As we mentioned above, we decided to prune our trees using “criterion=entropy, max_depth=3”. The classifier's results with the default values were due to the model overfitting on the training data, which hurt its performance on the testing data.

Figure 14: Confusion Matrix for Decision Tree model
6.3. Boosting
This algorithm provided interesting results for both datasets because the larger the training data size, the better the test predictions, as can be seen in Figures 15 – 16 below. This could be because, similar to our neural network implementation, we used a large number of iterations.

Figure 15: Breakdown for Boosting model
Since we were solving a binary classification problem, we used Gradient Boosting, which builds an additive model in a forward stage-wise fashion. This allows it to optimize arbitrary differentiable loss functions, where in each stage, n_classes regression trees are fit on the negative gradient of the binomial or multinomial deviance loss function [27].
We decided to perform hyperparameter tuning on our GradientBoostingClassifier and set “n_estimators=1000, learning_rate=1.0, max_depth=1”. We chose these values because we know that Boosting algorithms are prone to overfitting, so by choosing a high number of boosting stages to perform while shrinking the contribution of each tree by learning_rate, we thought we could counteract that tendency [19].
We set “n_estimators=1000”, so there were 1000 iterations per run. We tested different values for “learning_rate” and “n_estimators” and only showed the results for “n_estimators=1000” and “learning_rate=1.0”, because this combination showed the best performance. In general, one can pair a large learning rate with fewer iterations or a small learning rate with more iterations; this combination was a good middle choice to get the best of both parameters [24].

Figure 16: Confusion Matrix for Boosting model
6.4. KNN
This algorithm gave us expected results in that as our training dataset size increased, our test performance decreased significantly, as can be seen in Figure 17 below. This is a clear case of the algorithm overfitting on the training dataset and suffering as a result on the testing dataset.

Figure 17: Confusion Matrix for KNN model
This algorithm classifies a new data point using its nearest neighbors: the data points that have the minimum distance from it in feature space. In this algorithm, “K” is the number of such data points we consider. As a result, the distance metric and the K value are two important considerations when using the KNN algorithm [14].
We decided to perform hyperparameter tuning on our KNN model and test the k values from two through eight. We tested these because we wanted to examine the underfitting vs. overfitting behavior of our model, and we found that each of the two datasets had a different best-performing k value.
Based on the average runtime for our runs, we got 0.54 seconds per run as per wall clock time.
The only tuning we did on our KNN classifier was testing the k values from two through eight. We didn’t show the results for the other values because the performance was degrading after certain “k” values for both datasets.
6.5. Neural Networks
We noticed that this algorithm gave us the most consistent results across both datasets, as can be seen in Figure 18 below. This might be due to the large number of cycles we ran the algorithm on both datasets.

Figure 18: Loss function for NN model
This represents a brain-like algorithm that has different layers of nodes, or neurons, which activate based on different parameters. In the case of classification datasets, like both of ours, the output layer classifies each example, applying the most likely label. Each node on the output layer represents one label. In turn, that node turns on or off according to the strength of the signal it receives from the previous layer's input and parameters [28].
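The output-layer behavior described above can be sketched with a single sigmoid node in plain NumPy; the weights and inputs here are illustrative stand-ins, not values from the trained Keras model.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One output node of a binary classifier: it "turns on" (label 1) when the
# weighted signal arriving from the previous layer is strong enough.
previous_layer = np.array([0.2, 0.9, 0.4])  # activations from the hidden layer
weights = np.array([1.5, 2.0, -0.5])        # learned connection strengths
bias = -1.0

activation = sigmoid(previous_layer @ weights + bias)
label = int(activation >= 0.5)
```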
We decided to perform hyperparameter tuning on our neural networks using two activation functions: “relu” and “sigmoid”. We chose sigmoid because “the use of a single Sigmoid/Logistic neuron in the output layer is the mainstay of a binary classification neural network” [27].
We set epoch to 1000 for our tests, so there were 1000 cycles per run, and we tested two different stopping criteria: epoch 200 and epoch 1000. We did not show the results for epoch 200 because accuracy was higher for the epoch-1000 neural networks, due to the higher number of cycles in which all of the data was processed [29]. Additionally, we tested with two and three activation layers, but only showed results for two activation layers, because adding the third activation layer hurt accuracy on the test dataset; we used “softsign” as our third layer, which skewed results negatively [30].
7. Conclusion
The data we collected from our two subjects confirms that the effect of alcohol consumption can be strong enough to alter accelerometer readings, and that these readings can then be used to classify individuals' sobriety levels. One limitation of our approach is that clear distinctions are mostly visible only for subjects who are well beyond the regular drinking amount; for intoxicated individuals who are not, the classifier model that we built is less successful in establishing a clear difference between sober and intoxicated states. Additionally, in order to classify “less intoxicated” vs. “more intoxicated”, we will require larger and more diverse datasets.
7.1. Effect of Cross-Validation
There is tremendous benefit to using K-fold cross-validation over standard random data splits because, by building K different models, we are able to make predictions on all of our data. This is especially helpful with smaller datasets, where it lets the algorithm recognize patterns better [24].
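As a sketch of what that would look like, scikit-learn's `cross_val_score` fits K models and returns one held-out score per fold; the synthetic two-feature dataset below is illustrative only, not the paper's data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the pooled [variance, highest frequency] data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0.05, 0.12], [0.02, 0.03], (10, 2)),
               rng.normal([6.0, 2.0], [2.0, 0.5], (10, 2))])
y = np.array([0] * 10 + [1] * 10)

# 5 folds -> 5 models; every point is used for testing exactly once.
scores = cross_val_score(DecisionTreeClassifier(max_depth=3, random_state=0),
                         X, y, cv=5)
print(scores, scores.mean())
```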
7.2. Definition of Best
For our analysis, we defined “best” as the algorithm that gave us the best balance of the highest test accuracy (not necessarily highest training accuracy) based on a training set size and the fastest execution time.
7.3. Best Classifier
Each algorithm has its own strengths and weaknesses and is therefore good/bad on different types of datasets. That being said, we saw that for datasets that contain both uniform and tailed distributions, such as our subjects’ accelerometer data, the Decision Tree Learning was the best Supervised Machine Learning method and should be used as part of our mobile device sobriety classification system.
8. Future Work
In order to build on our research, we want to address the issue of limited test subjects and expand the experiments to include a larger sample size that is comprised of individuals with varied genders, body compositions, backgrounds, etc. [31].
Additionally, we want to explore real world-use cases that can be addressed by our research results. For example, our system can be used with data from any accelerometer, such as those found in car steering wheels. This means that if our system is embedded into cars, then cars’ internal systems will potentially be able to warn drivers if their BAC levels are above the legal driving limit, which will directly reduce traffic fatalities caused by alcohol consumption.
Conflict of Interest
The authors declare no conflict of interest.
Acknowledgment
We owe our gratitude to our mentors for guiding us throughout our experiments: Professor Pei Zhang and Dr. Xinlei Chen. We also thank both of our anonymous subjects for participating in our experiments, so that we could monitor their data and use it as part of our analysis.
- D. Kumar, A. Thanikkal, P. Krishnamurthy, X. Chen, P. Zhang, “Accelerometer-Based Alcohol Consumption Detection from Physical Activity,” in 2021 17th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), 415–418, 2021, doi:10.1109/WiMob52687.2021.9606257.
- R.W. Olsen, H.J. Hanchar, P. Meera, M. Wallner, “GABAA receptor subtypes: the ‘one glass of wine’ receptors,” Alcohol, 41(3), 201–209, 2007, doi:https://doi.org/10.1016/j.alcohol.2007.04.006.
- Alcohol Facts and Statistics, National Institute on Alcohol Abuse and Alcoholism, 2021, Online: http://www.niaaa.nih.gov/alcohol-health/overview-alcohol-consumption/alcohol-facts-and-statistics.
- D. Kumar, Supervised Learning, Georgia Institute of Technology, 2022.
- R. Li, G.P. Balakrishnan, J. Nie, Y. Li, E. Agu, K. Grimone, D. Herman, A.M. Abrantes, M.D. Stein, “Estimation of Blood Alcohol Concentration From Smartphone Gait Data Using Neural Networks,” IEEE Access, 9, 61237–61255, 2021, doi:10.1109/ACCESS.2021.3054515.
- C. Nickel, C. Busch, “Classifying accelerometer data via hidden Markov models to authenticate people by the way they walk,” IEEE Aerospace and Electronic Systems Magazine, 28(10), 29–35, 2013, doi:10.1109/MAES.2013.6642829.
- B. Suffoletto, P. Dasgupta, R. Uymatiao, J. Huber, K. Flickinger, E. Sejdic, “A Preliminary Study Using Smartphone Accelerometers to Sense Gait Impairments Due to Alcohol Intoxication,” Journal of Studies on Alcohol and Drugs, 81(4), 505–510, 2020, doi:10.15288/jsad.2020.81.505.
- S. Bae, D. Ferreira, B. Suffoletto, J.C. Puyana, R. Kurtz, T. Chung, A.K. Dey, “Detecting Drinking Episodes in Young Adults Using Smartphone-Based Sensors,” in Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., Association for Computing Machinery, New York, NY, USA, 2017, doi:10.1145/3090051.
- J.A. Killian, K.M. Passino, A. Nandi, D.R. Madden, J. Clapp, “Learning to detect heavy drinking episodes using smartphone accelerometer data,” CEUR Workshop Proceedings, 2429, 35–42, 2019.
- T.H.M. Zaki, M. Sahrim, J. Jamaludin, S.R. Balakrishnan, L.H. Asbulah, F.S. Hussin, “The Study of Drunken Abnormal Human Gait Recognition using Accelerometer and Gyroscope Sensors in Mobile Application,” in 2020 16th IEEE International Colloquium on Signal Processing & Its Applications (CSPA), 151–156, 2020, doi:10.1109/CSPA48992.2020.9068676.
- Z. Arnold, D. Larose, E. Agu, “Smartphone Inference of Alcohol Consumption Levels from Gait,” in 2015 International Conference on Healthcare Informatics, 417–426, 2015, doi:10.1109/ICHI.2015.59.
- Aiello, Agu, “Investigating postural sway features, normalization and personalization in detecting blood alcohol levels of smartphone users,” in 2016 IEEE Wireless Health (WH), 1–8, 2016, doi:10.1109/WH.2016.7764559.
- W. S, How Alcoholism Works, HowStuffWorks Science, 2021, Online: http://science.howstuffworks.com/life/inside-the-mind/human-brain/alcoholism4.htm
- Sklearn.svm.SVC, Scikit, 2021, Online: https://scikit-learn.org/stable/modules/generated/sklearn.svm.SVC.html
- A Beginner’s Guide to Neural Networks and Deep Learning, Pathmind, 2021, Online: https://wiki.pathmind.com/neural-network#logistic
- R. Pramoditha, “Plotting the Learning Curve with a Single Line of Code,” Medium, Towards Data Science, 2021, Online: https://towardsdatascience.com/plotting-the-learning-curve-with-a-single-line-of-code-90a5bbb0f48a
- Sklearn.tree.decisiontreeclassifier, Scikit, 2021, Online: https://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html
- Romuald_84, “Boosting: Why Is the Learning Rate Called a Regularization Parameter?” Cross Validated, Online: https://stats.stackexchange.com/questions/168666/boosting-why-is-the-learning-rate-called-a-regularization-parameter.
- Sklearn.neighbors.kneighborsclassifier, Scikit, 2021, Online: https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KNeighborsClassifier.html.
- Baeldung, Epoch in Neural Networks, Baeldung on Computer Science, 2021, Online: https://www.baeldung.com/cs/epoch-neural-networks.
- C. Chaine, Using Reinforcement Learning for Classification Problems, Stack Overflow, Online: https://stackoverflow.com/questions/44594007/using-reinforcement-learning-for-classfication-problems.
- Htoukour, Neural Networks to Predict Diabetes, Kaggle, 2018, Online: https://www.kaggle.com/htoukour/neural-networks-to-predict-diabetes
- Alcohol’s Effects on the Body. National Institute on Alcohol Abuse and Alcoholism, NIH, 2021, Online: http://www.niaaa.nih.gov/alcohol-health/alcohols-effects-body
- Sklearn.ensemble.gradientboostingclassifier, Scikit, 2021, Online: https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.GradientBoostingClassifier.html.
- D. Shulga, 5 Reasons Why You Should Use Cross-Validation in Your Data Science Projects, Medium, Towards Data Science, 2018, Online: https://towardsdatascience.com/5-reasons-why-you-should-use-cross-validation-in-your-data-science-project-8163311a1e79.
- Decision Tree, GeeksforGeeks, 2021, Online: https://www.geeksforgeeks.org/decision-tree/
- Decision Tree Algorithm – A Complete Guide, Analytics Vidhya, 2021, Online: https://www.analyticsvidhya.com/blog/2021/08/decision-tree-algorithm/.
- Complete-Life-Cycle-of-a-Data-Science-Project, Complete Life Cycle Of A Data Science Project, 2021, Online: https://awesomeopensource.com/project/achuthasubhash/Complete-Life-Cycle-of-a-Data-Science-Project
- Machine Learning with Python – Algorithms, 2022, Online: https://awesomeopensource.com/project/achuthasubhash/Complete-Life-Cycle-of-a-Data-Science-Project
- K. Team, Keras Documentation: The Sequential Class, Keras, 2021, Online: https://keras.io/api/models/sequential/
- D. Deponti, D. Maggiorini, C.E. Palazzi, “DroidGlove: An android-based application for wrist rehabilitation,” in 2009 International Conference on Ultra Modern Telecommunications & Workshops, 1–7, 2009, doi:10.1109/ICUMT.2009.5345442.