Welcome to week three. In this week, you get to apply RNNs and LSTMs to time sequence data. Last week you applied DNNs to this data. But a time series is temporal data, so it seems like you should be applying a sequence model, like an RNN or an LSTM, to it. And that's what you'll do this week.

>> Yes. If you remember, in the previous courses we did natural language processing, and we learned a lot about RNNs and LSTMs there, and how things like the state vector and the cell state allow you to maintain context across a series. With time series data, if you're looking at maybe a 30-day window, or a 30-period window, in some series the data closer to your prediction date is likely to have a bigger impact than data further away. So being able to use RNNs and LSTMs might factor that in and give us a much more accurate prediction.

>> Yeah, that's right, looking over much bigger windows and carrying context from far away.

>> Yeah, exactly. And you know my old favorite, LSTMs, and the way they have that cell state, we should call it the L state, after me. [LAUGH] That cell state allows you to carry context across a long learning process, and in some time series data that can have a really large impact. For example, with financial data, today's closing price probably has a bigger impact on tomorrow's closing price than the closing price from 30 days ago, or 60 days ago, or 90 days ago. So being able to use recurrent networks and LSTMs should help us be much more accurate in predicting seasonal data.

>> Yeah, cool. And one of the fun things we'll see this week as well is Lambda layers.

>> Lambda layers, yeah. So as a coder, Lambda layers give me comfort.
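The idea above, feeding a fixed window of past values into an LSTM to predict the next value, can be sketched in Keras roughly like this. The series, the 30-step window, and the layer sizes are illustrative placeholders, not the course's exact setup:

```python
import numpy as np
import tensorflow as tf

# Toy univariate series; stand-in for real time series data.
series = np.sin(np.arange(200) * 0.1).astype("float32")

window_size = 30  # a 30-period window, as discussed

# Slice the series into (window, next-value) training pairs.
X = np.array([series[i:i + window_size]
              for i in range(len(series) - window_size)])
y = series[window_size:]
X = X[..., np.newaxis]  # LSTMs expect (batch, time, features)

# The LSTM's cell state carries context across the window, so
# recent steps can weigh on the prediction more than distant ones.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window_size, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="adam")
model.fit(X, y, epochs=1, verbose=0)
```

After training, `model.predict` on a batch of windows returns one predicted next value per window.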
Because one of the hardest things for me, when I first started doing neural networks, was that I'd write all this code for pre-processing, and all this code for post-processing, and then I'd define a neural network and it would do all this magic inside, but I might not have control over what's going on in there.

>> It just does whatever it wants to do.

>> Or whatever it's been designed to do. [LAUGH] But in TensorFlow, with Keras, Lambda layers allow us to write effectively an arbitrary piece of code as a layer in the neural network.

>> So rather than, for example, scaling your data with an explicit pre-processing step and then feeding that data into the neural network, you can instead have a Lambda layer, basically a lambda function, an unnamed function, but implemented as a layer in the neural network, that receives the data and scales it. And now that pre-processing step is no longer a separate and distinct step, it's part of the neural network.

>> So as a programmer, that just gives me a lot more comfort, having that kind of control. And it makes it even more fun to be able to build this kind of stuff.

>> Yeah. So in this week, you get to apply RNNs and LSTMs to sequence data. And as one of the fun features of TensorFlow, you also get to implement Lambda layers. Please go on to the next video.
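A minimal sketch of the Lambda-layer idea described above: pre-processing and post-processing folded into the model as layers via `tf.keras.layers.Lambda`. The window size, layer sizes, and the scaling factor of 100 here are illustrative assumptions, not values from the transcript:

```python
import numpy as np
import tensorflow as tf

window_size = 30  # hypothetical window length

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window_size,)),
    # Pre-processing as a layer: add a features axis so a univariate
    # window (batch, window) becomes (batch, window, 1) for the LSTM.
    tf.keras.layers.Lambda(lambda x: tf.expand_dims(x, axis=-1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
    # Post-processing as a layer: rescale the output, so no separate
    # scaling step is needed outside the model.
    tf.keras.layers.Lambda(lambda x: x * 100.0),
])

batch = np.random.rand(4, window_size).astype("float32")
print(tuple(model(batch).shape))  # (4, 1)
```

Because the scaling lives inside the model, anyone calling it later gets the same pre- and post-processing automatically, which is exactly the control-and-convenience point made above.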