Articles

Showing posts from February, 2018

Wine Quality Prediction (Logistic Regression) - Kaggle [Dataset]

Overview from Kaggle

Context: The two datasets are related to red and white variants of the Portuguese "Vinho Verde" wine. For more details, consult the reference [Cortez et al., 2009]. Due to privacy and logistic issues, only physicochemical (inputs) and sensory (the output) variables are available (e.g. there is no data about grape types, wine brand, wine selling price, etc.). These datasets can be viewed as classification or regression tasks. The classes are ordered and not balanced (e.g. there are many more normal wines than excellent or poor ones). This dataset is also available from the UCI machine learning repository, https://archive.ics.uci.edu/ml/datasets/wine+quality; I just shared it to Kaggle for convenience. (If I am mistaken and the public license type disallowed me from doing so, I will take this down if requested.)

Content: For more information, read [Cortez et al., 2009]. Input variables (based on physicochemical tests): 1 - fixed acidity...
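Since the post itself does not show its code, here is a minimal sketch of a logistic regression baseline on this dataset with scikit-learn. The file name winequality-red.csv, the ';' separator, and the "quality >= 7 means good wine" threshold are assumptions for illustration, not the post's actual setup.

```python
# Minimal logistic-regression sketch for the wine quality dataset (assumptions noted above).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Load the physicochemical inputs and the sensory quality score
df = pd.read_csv("winequality-red.csv", sep=";")  # assumed file name and separator

# Turn the ordered quality score into a binary "good wine" label (threshold is an assumption)
X = df.drop(columns=["quality"])
y = (df["quality"] >= 7).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Scale the features, then fit a plain logistic regression classifier
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```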

CNN - Classification: Cat or Dog?

In this project I classify whether an image shows a cat or a dog. Here is my training dataset. Next, I import the libraries and build the CNN. After fitting the model, I predict whether a given image is a cat or a dog.
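A minimal Keras sketch of the kind of CNN described here. The folder layout (data/train with one subfolder per class), the 64x64 image size, and the sample image path are assumptions, not the post's actual code.

```python
# Small CNN for binary cat-vs-dog classification (paths and sizes are assumptions).
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing import image
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Convolution/pooling stack ending in a sigmoid for the binary label
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(64, 64, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Stream training images from an assumed folder structure, one subfolder per class
train_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/train", target_size=(64, 64), batch_size=32, class_mode="binary"
)
model.fit(train_gen, epochs=10)

# Predict a single image (path is hypothetical; class 1 is "dog" only if the
# subfolders sort alphabetically as cats, dogs)
img = image.load_img("data/test/sample.jpg", target_size=(64, 64))
x = np.expand_dims(image.img_to_array(img) / 255.0, axis=0)
print("dog" if model.predict(x)[0][0] > 0.5 else "cat")
```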

House Prices Prediction - Kaggle

Competition Description

Ask a home buyer to describe their dream house, and they probably won't begin with the height of the basement ceiling or the proximity to an east-west railroad. But this playground competition's dataset proves that much more influences price negotiations than the number of bedrooms or a white-picket fence. With 79 explanatory variables describing (almost) every aspect of residential homes in Ames, Iowa, this competition challenges you to predict the final price of each home.

Practice Skills: creative feature engineering; advanced regression techniques like random forest and gradient boosting.

Here is the result: I used gradient boosting regression for this project.
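A minimal scikit-learn sketch of gradient boosting regression on this dataset. Using only the numeric columns of the Kaggle train.csv and filling missing values with the median are simplifications for illustration, not the post's actual feature engineering.

```python
# Gradient boosting regression sketch for the Ames house prices data (simplified features).
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("train.csv")  # assumed Kaggle training file

# Keep numeric columns only and fill missing values with the median (a simplification)
X = df.drop(columns=["SalePrice", "Id"]).select_dtypes(include="number")
X = X.fillna(X.median())
y = df["SalePrice"]

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

# Fit a gradient boosting regressor and report validation RMSE
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, random_state=42)
model.fit(X_train, y_train)

rmse = np.sqrt(mean_squared_error(y_val, model.predict(X_val)))
print("validation RMSE:", rmse)
```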

Titanic - Kaggle

Competition Description

The sinking of the RMS Titanic is one of the most infamous shipwrecks in history. On April 15, 1912, during her maiden voyage, the Titanic sank after colliding with an iceberg, killing 1502 out of 2224 passengers and crew. This sensational tragedy shocked the international community and led to better safety regulations for ships. One of the reasons that the shipwreck led to such loss of life was that there were not enough lifeboats for the passengers and crew. Although there was some element of luck involved in surviving the sinking, some groups of people were more likely to survive than others, such as women, children, and the upper class. In this challenge, we ask you to complete the analysis of what sorts of people were likely to survive. In particular, we ask you to apply the tools of machine learning to predict which passengers survived the tragedy.

Practice Skills: binary classification; Python and R basics.

And here is the result. I use ...
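The post's model is truncated above, so as a stand-in here is a minimal logistic-regression baseline on the Kaggle Titanic train.csv. The feature list and the missing-value handling are assumptions, not necessarily what the post actually used.

```python
# Binary classification sketch for Titanic survival (feature choices are assumptions).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("train.csv")  # assumed Kaggle Titanic training file

# Minimal feature engineering: encode sex as 0/1 and fill missing ages with the median
df["Sex"] = (df["Sex"] == "female").astype(int)
df["Age"] = df["Age"].fillna(df["Age"].median())
features = ["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]
X = df[features].fillna(0)
y = df["Survived"]

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_val, clf.predict(X_val)))
```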