What is Machine Learning?
Key Points:
What is machine learning? Even among machine learning practitioners, there is no well-accepted definition of what is and what is not machine learning.
Definition:
Here is a definition of machine learning due to Arthur Samuel. He defined machine learning as the field of study that gives computers the ability to learn without being explicitly programmed.
Tom Mitchell defines machine learning by saying that a well-posed learning problem is defined as follows: a computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E.
Types of Learning Algorithms:
There are several different types of learning algorithms. The main two types are what we call supervised learning and unsupervised learning. In supervised learning, the idea is that we are going to teach the computer how to do something, whereas in unsupervised learning, we are going to let it learn by itself. You might also hear other terms such as reinforcement learning and recommender systems; these are other types of machine learning algorithms that we'll talk about later.
The other thing this class spends a lot of time on is practical advice for applying learning algorithms. Teaching about learning algorithms is like handing over a set of tools, and equally important, or even more important, than giving you the tools is teaching you how to apply them.
What is Machine Learning?
Two definitions of machine learning are offered. Arthur Samuel described it as "the field of study that gives computers the ability to learn without being explicitly programmed." This is an older, informal definition.
Tom Mitchell provides a more modern definition: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E."
Example: playing checkers (a toy code sketch illustrating this mapping follows the list).
E = the experience of playing many games of checkers.
T = the task of playing checkers.
P = the probability that the program will win the next game.
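To make this E, T, P mapping concrete in code, here is a small, self-contained Python sketch. It is purely illustrative and is not Samuel's actual program: it substitutes single-pile Nim (take 1 to 3 stones; whoever takes the last stone wins) for checkers, and the pile size, exploration rate, and evaluation setup are arbitrary choices of mine.

# Toy sketch (illustrative only, not Samuel's program): Mitchell's E/T/P framing
# applied to single-pile Nim instead of checkers.
#   T = playing Nim
#   E = self-play games, used to estimate how good each position is
#   P = fraction of games won against a random opponent
import random

PILE = 15                                  # starting pile size (arbitrary choice)
MOVES = (1, 2, 3)                          # legal numbers of stones to take
value = {n: 0.5 for n in range(PILE + 1)}  # value[n] ~ P(win | your turn, n stones left)
value[0] = 0.0                             # your turn with 0 stones left means you already lost
counts = {n: 0 for n in range(PILE + 1)}

def greedy_move(n):
    # Leave the opponent in the position we currently estimate is worst for them.
    legal = [m for m in MOVES if m <= n]
    return min(legal, key=lambda m: value[n - m])

def random_move(n):
    return random.choice([m for m in MOVES if m <= n])

def self_play_game():
    # One game of self-play (a unit of experience E); update value estimates from the outcome.
    n, turn, visited = PILE, 0, {0: [], 1: []}
    while n > 0:
        visited[turn].append(n)
        n -= greedy_move(n) if random.random() > 0.2 else random_move(n)  # 20% exploration
        turn = 1 - turn
    winner = 1 - turn                      # the player who took the last stone wins
    for player, states in visited.items():
        outcome = 1.0 if player == winner else 0.0
        for s in states:
            counts[s] += 1
            value[s] += (outcome - value[s]) / counts[s]   # running average of outcomes

def win_rate_vs_random(games=2000):
    # Performance measure P: how often the learned policy beats a random opponent.
    wins = 0
    for g in range(games):
        n, turn = PILE, g % 2              # alternate who moves first
        while n > 0:
            n -= greedy_move(n) if turn == 0 else random_move(n)
            turn = 1 - turn
        wins += (turn == 1)                # player 0 (the learner) took the last stone
    return wins / games

for extra_games in (0, 100, 1000, 10000):
    for _ in range(extra_games):
        self_play_game()
    print(f"after {extra_games} more self-play games, P = {win_rate_vs_random():.2f}")

The printed win rate is the performance measure P, reported after increasing amounts of self-play experience E on the task T of playing the game; the exact numbers will vary from run to run.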
In general, any machine learning problem can be assigned to one of two broad groups: supervised learning and unsupervised learning.
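As a quick illustration (my own addition, not part of the original notes), the contrast between the two can be shown in a few lines of Python using scikit-learn and its built-in iris dataset; the library and dataset are assumptions I am making here, not something the course prescribes.

# Supervised learning: the computer is "taught" with labeled examples (features X, labels y).
# Unsupervised learning: the computer gets only X and has to find structure by itself.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_iris(return_X_y=True)

clf = LogisticRegression(max_iter=1000).fit(X, y)             # supervised: uses the labels y
print("supervised prediction for the first flower:", clf.predict(X[:1]))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)   # unsupervised: labels never seen
print("unsupervised cluster for the first flower:", km.labels_[0])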
Watch Video:
Machine Learning:
In this video, we will try to define what machine learning is and also try to give you a sense of when you might want to use it. Even among machine learning practitioners, there is no well-accepted definition of what is and what is not machine learning. But let me show you a couple of examples of the ways that people have tried to define it. Here is a definition of what machine learning is due to Arthur Samuel. He defined machine learning as the field of study that gives computers the ability to learn without being explicitly programmed.
Samuel's claim to fame was that back in the 1950s, he wrote a checkers-playing program, and the amazing thing about this program was that Arthur Samuel himself was not a very good checkers player. What he did was have the program play perhaps tens of thousands of games against itself, and by watching which kinds of board positions tended to lead to wins and which kinds tended to lead to losses, the checkers-playing program learned over time what are good board positions and what are bad board positions. It eventually learned to play checkers better than Arthur Samuel himself was able to. This was a remarkable result. Arthur Samuel himself turned out not to be a very good checkers player, but because a computer has the patience to play tens of thousands of games against itself, and no human has the patience to play that many games, the computer was able to gain so much checkers-playing experience that it eventually became a better checkers player than Arthur himself.
This is a somewhat informal and older definition. Here is a slightly more recent definition by Tom Mitchell, who is from Carnegie Mellon. Tom defines machine learning by saying that a well-posed learning problem is defined as follows: a computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E. I actually think he came up with this definition just to make it rhyme. For the checkers-playing example, the experience E would be the experience of having the program play tens of thousands of games against itself, the task T would be the task of playing checkers, and the performance measure P would be the probability that it wins the next game of checkers against some new opponent.
Throughout these videos, besides me trying to teach you stuff, I will sometimes ask you a question to make sure you understand the content. Here is one.
Above is a definition of machine learning by Tom Mitchell. Let's say your email program watches which emails you do or do not mark as spam. So in an email client like this, you might click the Spam button to report some email as spam but not other emails. Based on which emails you mark as spam, say your email program learns better how to filter spam email. What is the task T in this setting? In a few seconds, the video will pause, and when it does, you can use your mouse to select one of these four radio buttons to let me know which of the four you think is the right answer to this question.
Hopefully you got the right answer: classifying emails is the task T. In fact, this definition specifies a task T, a performance measure P, and some experience E. Watching you label emails as spam or not spam would be the experience E, and the fraction of emails correctly classified might be the performance measure P. So on the task T, the system's performance, as measured by P, will improve after the experience E.
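To tie the quiz back to Mitchell's definition, here is a minimal Python sketch (my own illustration, not from the lecture) that spells out E, T, and P for the spam setting using scikit-learn. The example emails, their labels, and the split into labeled and held-out sets are made up purely for illustration.

# E/T/P for the spam example, as a runnable toy with made-up emails:
#   T = classifying emails as spam or not spam
#   E = watching which emails the user labels as spam
#   P = the fraction of held-out emails classified correctly
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical emails the user has labeled so far (the experience E), oldest first.
labeled = [
    ("please review the quarterly report before the meeting", "ham"),
    ("win a free prize now, click here to claim it", "spam"),
    ("lunch meeting moved to noon tomorrow", "ham"),
    ("congratulations, you are our lucky winner", "spam"),
    ("project schedule attached for your review", "ham"),
    ("exclusive discount offer inside, limited time only", "spam"),
    ("notes from today's design review attached", "ham"),
    ("cheap meds and an exclusive discount, order now", "spam"),
]
# Held-out emails used only to measure performance P.
held_out = [
    ("agenda for the project review meeting", "ham"),
    ("claim your free prize, you are a winner", "spam"),
    ("quarterly schedule and meeting notes", "ham"),
    ("limited time discount offer, order today", "spam"),
]

def performance(n_labeled):
    # Train on the first n_labeled emails (experience E) and return accuracy (P).
    texts, labels = zip(*labeled[:n_labeled])
    vec = CountVectorizer()
    model = MultinomialNB().fit(vec.fit_transform(texts), labels)
    test_texts, test_labels = zip(*held_out)
    return model.score(vec.transform(test_texts), list(test_labels))

for n in (4, 8):
    print(f"P after labeling {n} emails: {performance(n):.2f}")

Each call to performance trains only on the emails labeled so far (the experience E) and reports the fraction of held-out emails classified correctly (the performance measure P) on the task T of classifying emails; with real data, one would hope to see P improve as E grows.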
In this class, I hope to teach you about various different types of learning algorithms. There are several different types of learning algorithms. The main two types are what we call supervised learning and unsupervised learning. I will define what these terms mean in the next couple of videos. It turns out that in supervised learning, the idea is that we are going to teach the computer how to do something, whereas in unsupervised learning, we are going to let it learn by itself. Do not worry if these two terms do not make sense yet. In the next two videos, I am going to say exactly what these two types of learning are. You might also hear other terms such as reinforcement learning and recommender systems. These are other types of machine learning algorithms that we'll talk about later, but the two most used types of learning algorithms are probably supervised learning and unsupervised learning. I will define them in the next two videos, and we'll spend most of this class talking about these two types of learning algorithms. It turns out that the other thing we'll spend a lot of time on in this class is practical advice for applying learning algorithms. This is something that I feel pretty strongly about, and something that I do not know if any other university teaches. Teaching about learning algorithms is like giving you a set of tools, and equally important, or even more important than giving you the tools, is teaching you how to apply these tools. I like to make an analogy to learning to become a carpenter. Imagine that someone is teaching you how to be a carpenter, and they say, here is a hammer, here is a screwdriver, here is a saw, good luck. Well, that is no good. You have all these tools, but the more important thing is to learn how to use these tools properly.
There is a huge difference between people who know how to use these machine learning algorithms well and people who do not. Here in Silicon Valley, where I live, when I go visit different companies, even the top Silicon Valley companies, very often I see people trying to apply machine learning algorithms to some problem, and sometimes they have been going at it for six months. But sometimes when I look at what they are doing, I say, gee, I could have told you six months ago that you should take a learning algorithm and apply it in a slightly modified way, and your chance of success would have been much higher. So what we are going to do in this class is actually spend a lot of time talking about how, if you are trying to develop a machine learning system, to make those best-practice decisions about how you build your system, so that when you apply learning algorithms, you are less likely to end up as one of those people who spends six months pursuing something that someone else could have figured out was a waste of time. I am going to spend a lot of time teaching you those kinds of best practices in machine learning and AI, how to get this stuff to work, and how the best people do it in Silicon Valley and around the world. I hope to make you one of the best people at knowing how to design and build serious machine learning and AI systems. So that is machine learning, and these are the main topics I hope to teach. In the next video, I am going to define what supervised learning is and, after that, what unsupervised learning is, and also talk about when you would use each of them.