Senior Data Scientist / Machine Learning Engineer
Implementing various solutions for machine learning projects. I could mention some but not...
In this post I will show how a simple KNN prediction model can be improved with kinetic energy and Kullback-Leibler divergence for classification tasks in general, and for image recognition tasks in particular. I added simple starter code to the GitHub repo:

```python
class KineticKnn(object):
    def __init__(self, train, test, kinetic, KLdiv, optim_percent):
        self.train = train
        self.test = test
        self.optimization_percent = optim_percent
        self.KLdivergence = KLdiv
        self.kinetic = kinetic
```

As you can see, this simple class is fed the following parameters: train and test, the data sets on which training and prediction take place; kinetic and KLdiv, boolean flags; and optim_percent, a number such as 0.2 giving the fraction of data held out so the model can be optimized on it.
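The post does not show how the KLdiv flag is used inside the class, so here is only a minimal sketch of a Kullback-Leibler distance that a KL-based KNN variant could rank neighbours by instead of Euclidean distance. The function name, the smoothing constant, and the toy histograms are illustrative assumptions, not code from the repo:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions given as arrays. A small eps avoids
    log(0) and division by zero for empty histogram bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()  # renormalize after smoothing
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Illustrative use: compare a test image's normalized pixel
# histogram against a training image's histogram; a KNN variant
# could sort training samples by this divergence.
p = [0.1, 0.4, 0.5]
q = [0.3, 0.3, 0.4]
d = kl_divergence(p, q)
```

Note that KL divergence is asymmetric (D(p||q) != D(q||p)) and zero only when the two distributions match, which is why it behaves as a dissimilarity score rather than a true metric.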
Data science in Python: 5 years experience
R, Python, machine learning, regression, classification, correlation analysis, deep learning, H2O, scikit-learn, feature selection, mutual information, feature...
Machine learning: 5 years experience
Implementing various solutions for machine learning projects. To mention some, but not all: implementing genetic algorithms, loan default predictions, ...
Natural language processing: 3 years experience
I have applied data science in more than 15 industry projects, with topics such as: image processing, machine learning, deep learning, neural networks, ...
Kinetic energy for feature engineering
Kinetic energies researched by Daia Alexandru that can be used for feature engineering, improved PCA, improved neural nets, and lots more.
Onicescu correlation coefficient in Python - Alexandru Daia
Implementing a new correlation method based on the kinetic energies I research.
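The description above does not spell out the coefficient's formula. As background, Onicescu's informational energy of a discrete distribution is E(X) = sum of p_i squared, the quantity the kinetic-energy analogy rests on. A minimal sketch of estimating it from a sample might look like this (the function name is an illustrative assumption; the actual correlation construction is in the linked repo):

```python
import numpy as np

def informational_energy(x):
    """Onicescu informational energy of a discrete sample:
    the sum of squared empirical probabilities, sum(p_i ** 2).
    Higher values indicate a more concentrated distribution."""
    _, counts = np.unique(np.asarray(x), return_counts=True)
    p = counts / counts.sum()  # empirical probabilities
    return float(np.sum(p ** 2))

# A constant sample has the maximum energy 1.0; a uniform sample
# over k distinct values has the minimum energy 1/k.
e_const = informational_energy([1, 1, 1, 1])    # 1.0
e_unif = informational_energy([1, 2, 3, 4])     # 0.25
```

The energy ranges between 1/k and 1, so, like variance for continuous data, it can serve as a normalizing ingredient when building a correlation-style coefficient between two discrete variables.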