Hello and welcome to the Foundations of Computer Science course. This is the first course in the Introduction to Computer Programming with Visual Basic specialization. Now in this first course, we're not actually rolling up our sleeves and coding; that comes in the three following courses. In this first course, we're going to talk about lots of topics that are very important to the applications you're going to write. We want to give you a firm foundation, a strong foundation, that you're going to build on. So sit back, relax, and let's have a little fun.

Let's start out and talk about algorithms and the history of computing. In this first module of the course, we want to learn what algorithms are and how problem-solving is done using algorithms. So, some objectives. When you're done with this module, you should be able to define what an algorithm is, understand algorithmic problem-solving, understand key innovations in the history of computing, and explain the three components of an algorithm.

Let's start out with a formal definition of an algorithm. We can define an algorithm as a well-ordered collection of unambiguous and effectively computable operations that, when executed, produces a result and halts in a finite amount of time. Several of these pieces are really key. That last piece: it has to be able to be completed. The algorithm needs to halt when it's done, in some measurable amount of time. It needs to be a well-ordered set of unambiguous instructions, so you can't have any question about what needs to be done at each step in the algorithm. And the operations have to be computable. Now, what is or isn't computable is a more complicated topic in computer science, so I'll leave that out of our discussion for now.

Let's drill into some details. Well-ordered collection: on completion of an operation, we always know which operation to do next.
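To make the definition concrete, here is a small sketch in Python (Python rather than Visual Basic, just for brevity; this example and its names are mine, not from the lecture) of a tiny algorithm that checks all the boxes: well-ordered, unambiguous, effectively computable, and guaranteed to halt.

```python
# A minimal example algorithm (hypothetical, for illustration):
# sum the first n positive integers.

def sum_first_n(n):
    """Well-ordered, unambiguous steps that halt after n iterations."""
    total = 0                  # step 1: initialize the running total
    for i in range(1, n + 1):  # steps run in a known order, n times, then stop
        total += i             # each operation is effectively computable
    return total               # produces an observable result, then halts

result = sum_first_n(10)
```

Every run finishes after exactly n loop iterations, so unlike a program that loops forever, this one halts in a finite amount of time.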
Unambiguous and effectively computable operations: it is not enough for an operation to be understandable; it also must be computable. What we mean by that is the computer must be able to do it. The algorithm needs to produce a result and halt in a finite amount of time, and that result needs to be observable to a user. It could be a numeric answer, a new object, or a change in the environment.

Here's an example: a hair-washing algorithm. Wet the hair, apply shampoo to the wet hair, scrub the shampoo into the hair, and then rinse the shampoo out of the hair. Now each of these steps is clear. What do you do first? What do you do next? And they are all doable. They're not necessarily done by a computer, but they're all doable by a person.

Here's an alternative hair-washing algorithm. Wet the hair, then repeat the following three steps until your hair is clean: apply shampoo to the wet hair, scrub the shampoo into the hair, rinse the shampoo out of the hair. This reminds me of an old joke I'll tell you: why did the programmer get stuck in the shower? The answer is, he read the directions on the shampoo, which said, "lather, rinse, repeat." The shampoo bottle doesn't have a "repeat until clean"; our version does, so it halts in a finite amount of time, which makes it a valid algorithm.

Some common computer science algorithm types. Searching: a lot of what we do in artificial intelligence is searching for an answer, but some of the simpler computational things we do are also searching; we'll talk about some of those. Sorting: rearranging data in a certain order, maybe alphabetical order, maybe by value in ascending order from smallest to largest. Encoding: we may encode with compression or with encryption, that sort of thing. Graphics. Minimization: trying to take a hard problem and make it smaller. Parsing: trying to take something that's in one form, like human-readable text, and turn it into another form, like binary.

That's our first lesson, so a little review here. Computer science is the study of what can be computed.
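The repeat-until version of the hair-washing algorithm can be sketched in code. This is a Python sketch of my own (the function names and the scrubs_needed stand-in for "hair is clean" are assumptions for illustration, not from the lecture); the point is that the loop has a termination condition, so the algorithm halts.

```python
# Sketch of the "repeat until clean" hair-washing algorithm.
# is_clean is modeled by a scrub counter so the loop provably terminates.

def wash_hair(scrubs_needed=2):
    """Return the ordered list of operations performed."""
    steps = ["wet the hair"]
    scrubs = 0
    # Repeat-until loop: unlike the shampoo bottle's "lather, rinse,
    # repeat", this condition guarantees the algorithm halts.
    while scrubs < scrubs_needed:  # stand-in for "until hair is clean"
        steps.append("apply shampoo to the wet hair")
        steps.append("scrub the shampoo into the hair")
        steps.append("rinse the shampoo out of the hair")
        scrubs += 1
    return steps

performed = wash_hair(scrubs_needed=2)
```

With two scrubs needed, the algorithm performs one "wet" step plus two passes of three steps each, then stops: a well-ordered, finite sequence of operations.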
That's back to what an algorithm has to be: each step has to be able to be done by a computer. Algorithms are ordered sequences of operations, and algorithms are going to complete in a finite amount of time. That's it for lesson 1. I'll see you in lesson 2.