1
00:00:02.040 --> 00:00:30.889
Axcel ILT: And with that said, let's kick it off. So today we are here for the Exploring Explainable AI webinar. Thank you for joining. My name is Alex; I'm gonna be the MC for today's session with our host, Chris. And before we begin, I just kinda wanted to go over some housekeeping rules first. Everyone's microphones and cameras, of course, are off and muted. However, the chat is open for discussion, so if you have any questions, please feel free to put them in the chat, and if we have time at the end of the session, we'll certainly answer them.
2
00:00:30.980 --> 00:00:40.520
So, Chris, if you don't mind going to the next slide really briefly. Depending on how everyone came here today and came across this webinar, you may or may not be familiar with
3
00:00:40.520 --> 00:01:04.319
Axcel ILT: Axcel ILT, or instructor-led training, services and our other brands. So I just wanted to take a bit of a moment and explain that a little bit further, and in the next slide I can kind of dive into what these 3 brands are. So Axcel instructor-led training, or Axcel, is a collection of 3 brands, which some of you may be familiar with: Accelebrate, ExitCertified, and Web Age Solutions.
4
00:01:04.489 --> 00:01:22.870
Axcel ILT: and the collection of these 3 brands allows us to essentially service the entire life cycle of employee technical training. So we're really excited about that, joining all 3 of them together. And with Accelebrate, you know, we provide customized courses for teams and advanced technology and software application training
5
00:01:22.900 --> 00:01:45.429
Axcel ILT: that enables businesses or teams of all skill levels to accomplish their tasks efficiently and effectively. With ExitCertified, you know, we provide authorized training from leading technology vendors, and we offer a combination of official courseware and internally developed custom materials to provide clients with comprehensive education and industry-recognized certifications.
6
00:01:45.430 --> 00:02:10.089
Axcel ILT: Now, lastly, with Web Age Solutions, you know, they create upskilling programs for entire organizations. So these unique, specialized upskilling solutions are tailored to each organization, specific to their needs and goals. So with all three of those brands, we essentially provide a full suite of services and resources to help organizations meet the growing demand for professional development.
7
00:02:10.479 --> 00:02:26.910
Axcel ILT: So with that said, I'd like to go into our next slide. I'm going to introduce our host for today, and our instructor, Chris Penick. So Chris Penick is an instructor for Web Age Solutions, and he's got over 30 years of experience in the IT industry across a variety of platforms.
8
00:02:26.910 --> 00:02:42.220
Axcel ILT: you know. During his career he has guided clients in various areas of cyber security, architecture, development, data, science, and learning. You know, he's worked independently for the government. In the military and as a civilian in the private sector
9
00:02:42.220 --> 00:02:59.589
Axcel ILT: he holds multiple certifications and security and development, and teaches AI ml terraform get kubernetes. Java, among other courses so highly skilled and very excited to introduce him to today's webinar, so with that, said, I'm gonna pass it along to Chris. Take it away.
10
00:03:00.150 --> 00:03:02.410
Chris Penick | Axcel ILT: Hey? Okay, all right, thanks, Alex.
11
00:03:02.570 --> 00:03:10.559
Chris Penick | Axcel ILT: Appreciate it. So first of all, a couple of things here: as said, we're gonna try to keep an eye on the chat to see if there's any questions as we go.
12
00:03:10.610 --> 00:03:13.630
Chris Penick | Axcel ILT: I'll make some time for questions and answers at the end of this, too.
13
00:03:13.770 --> 00:03:24.779
Chris Penick | Axcel ILT: There's a lot to cover. Broad, broad topic with explainable AI, or we'll just call it XAI for right now, because that's fewer syllables to say.
14
00:03:24.900 --> 00:03:37.910
Chris Penick | Axcel ILT: But I'm gonna go ahead and just dive in. I'll keep an eye on the chat; I'll try to watch for it. I know there's also a Q&A there. If you do happen to put something in the Q&A, we'll get to it at the end. But we go fairly quick
15
00:03:37.950 --> 00:03:39.110
Chris Penick | Axcel ILT: and
16
00:03:39.190 --> 00:03:45.549
Chris Penick | Axcel ILT: cover quite a bit here. So, it's always weird: I learn something about myself every time someone introduces me. Pretty interesting.
17
00:03:45.630 --> 00:03:48.349
Chris Penick | Axcel ILT: Alright. So first of all, there you go. So
18
00:03:48.380 --> 00:04:00.689
Chris Penick | Axcel ILT: Let's just... and I promise this is not going to be just slides, because we're going to go explore a little bit here. But you can't talk about explainable AI without mentioning
19
00:04:01.020 --> 00:04:01.740
Chris Penick | Axcel ILT: ethics,
20
00:04:02.080 --> 00:04:29.609
Chris Penick | Axcel ILT: and AI we can't go without mentioning machine learning. So we're gonna kinda explain some of these as we go. And I'm gonna assume... I don't know if you can see me on camera there that much... I'm not gonna assume that much knowledge of what I'm talking about, because there are 500-something of you registered, with vast amounts of experience in all different directions. So I don't wanna offend, and I'm never insulting your intelligence, but I wanna make sure I don't miss anything.
21
00:04:29.680 --> 00:04:31.369
Chris Penick | Axcel ILT: So I'm going to start
22
00:04:31.490 --> 00:04:50.430
Chris Penick | Axcel ILT: sort of from the basics here. But one of the things that we look for in ethical AI — and we do offer courses on that here, where everyone is trying to become an expert; we'll talk about that in a little bit — is that, you know, we're looking for human augmentation, right? We're trying to augment our decision-making process for just our daily tasks.
23
00:04:51.020 --> 00:05:01.160
Chris Penick | Axcel ILT: Part of that means that there's some bias built into the system. So bias evaluation is something we look at in ethical AI. And a simple example
24
00:05:01.300 --> 00:05:02.250
Chris Penick | Axcel ILT: is that
25
00:05:02.840 --> 00:05:08.259
Chris Penick | Axcel ILT: most of you are probably familiar with one of the biggies: OpenAI's ChatGPT. All right, do you know where Chat
26
00:05:08.690 --> 00:05:11.440
Chris Penick | Axcel ILT: GPT... by the way, the "Chat" part is the interface,
27
00:05:11.520 --> 00:05:18.509
Chris Penick | Axcel ILT: all right, and the GPT part, that's the model, that's the actual application.
28
00:05:18.680 --> 00:05:31.199
Chris Penick | Axcel ILT: So "Chat" is just the UI. And ChatGPT, where did it get its data? And so what I like to show is that if you go look, you can look up The Pile, and that's the start of it.
29
00:05:31.460 --> 00:05:34.720
Chris Penick | Axcel ILT: I saw a hand go up for a second there; real quick, we'll get to that.
30
00:05:34.870 --> 00:05:48.439
Chris Penick | Axcel ILT: And so basically, you know, things like Wikipedia, Reddit, news articles, the New York Times, of course — and that's why they're suing, and that's another story we can talk about. But all of these documents out there were fed to something like ChatGPT,
31
00:05:48.500 --> 00:05:58.530
Chris Penick | Axcel ILT: or Bard, or Claude, or Gemini, or Gemma, or whatever you want to look at right now. There's plenty of these models out there. They were fed tons and tons of data.
32
00:05:58.680 --> 00:06:05.829
Chris Penick | Axcel ILT: and then said, Make some evaluation from that, make some context, make some conclusions from that
33
00:06:05.990 --> 00:06:10.270
Chris Penick | Axcel ILT: alright. Well, in that data itself we can see that
34
00:06:10.450 --> 00:06:17.490
Chris Penick | Axcel ILT: there's some bias. People, unfortunately, are not always the nicest on the Internet, right? Just go look at Reddit for a while, and you know what I'm talking about.
35
00:06:17.640 --> 00:06:34.119
Chris Penick | Axcel ILT: So that bias evaluation is something we've built in. But the big one today — we're really only looking at one section of these 8 principles here — is explainability. You know, when a decision is made using AI, either in whole or as an assistant,
36
00:06:34.600 --> 00:06:47.630
Chris Penick | Axcel ILT: how do I explain why that decision was made? For example, you're trying to get a home loan, and the finance company uses an AI tool or machine learning tool to decide whether you're a good credit risk or not.
37
00:06:48.190 --> 00:06:54.459
Chris Penick | Axcel ILT: It would be kind of nice to know why they said yea or nay, right? That would be a good idea.
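A minimal sketch of what an explainable credit decision could look like in code. Everything here is invented for illustration — the feature names (`income_k`, `debt_ratio`, `late_payments`), weights, and threshold are not any real lender's model — but a linear score like this is inherently interpretable: the per-feature contributions say exactly why the answer was yea or nay.

```python
# Toy, hand-invented loan-scoring model: a linear score whose per-feature
# contributions explain *why* the application was approved or denied.
WEIGHTS = {"income_k": 0.04, "debt_ratio": -3.0, "late_payments": -0.8}
BIAS = -1.0
THRESHOLD = 0.0

def explain_decision(applicant):
    # Contribution of each feature = weight * value
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = BIAS + sum(contributions.values())
    decision = "approve" if score >= THRESHOLD else "deny"
    # Sort by absolute impact so the biggest reasons come first
    reasons = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return decision, score, reasons

decision, score, reasons = explain_decision(
    {"income_k": 85, "debt_ratio": 0.45, "late_payments": 2}
)
print(decision, round(score, 2))
for feature, impact in reasons:
    print(f"  {feature}: {impact:+.2f}")
```

With a black-box model, that `reasons` list is exactly what you can't get for free — which is the gap explainability techniques try to fill.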
38
00:06:54.490 --> 00:07:15.630
Chris Penick | Axcel ILT: To that point, there are laws in the US for that, for credit in particular. We're still lagging a little bit behind Europe when it comes to some of these laws on really pulling back the curtains. We also wanna make sure it's reproducible. Think about it: if any of you have experimented with ChatGPT, have you ever tried
39
00:07:16.030 --> 00:07:19.599
Chris Penick | Axcel ILT: coming back with the same prompt but at a different time.
40
00:07:20.040 --> 00:07:25.670
Chris Penick | Axcel ILT: or maybe a month or 2 later? Does it always give you the exact same results? There's some randomness involved in there,
41
00:07:25.710 --> 00:07:37.540
Chris Penick | Axcel ILT: and that randomness may or may not induce bias, too. Next is displacement. You know, the other thing that we worry about in ethics, with ethical use of AI, is how that displaces humans,
42
00:07:37.570 --> 00:07:40.409
Chris Penick | Axcel ILT: how that displaces current processes.
43
00:07:40.450 --> 00:07:43.000
Chris Penick | Axcel ILT: And of course, just accuracy
44
00:07:43.110 --> 00:07:44.910
Chris Penick | Axcel ILT: trust on this
45
00:07:44.960 --> 00:07:57.569
Chris Penick | Axcel ILT: and data risk awareness. So all of these are topics that we cover in an ethics course. But as I set up today, in this little bitty 45-minute-to-an-hour session that I get, I get to talk about explainability. So I promise to make it a little bit more interesting.
46
00:07:57.590 --> 00:08:06.559
Chris Penick | Axcel ILT: So when it comes to that explainability, then just the challenges of explainability are, you know, how do we deal with bias? Is there bias in the system? How do we reveal bias?
47
00:08:06.650 --> 00:08:12.330
Chris Penick | Axcel ILT: You know? How do we make sure that the decisions or the choices or the suggestions
48
00:08:12.770 --> 00:08:16.289
Chris Penick | Axcel ILT: are based in fairness. And are they safe?
49
00:08:16.340 --> 00:08:27.279
Chris Penick | Axcel ILT: Are they, you know, looking at something safe? For example, AI is used in image recognition, and image recognition is part of the tools used
50
00:08:27.560 --> 00:08:36.420
Chris Penick | Axcel ILT: for self-driving cars, among other things. There are lidar-based systems, and so on; there are other types of ways to sense that something's around us.
51
00:08:36.530 --> 00:08:41.829
Chris Penick | Axcel ILT: right? Those systems, they themselves could be compromised, or they could be based on
52
00:08:41.940 --> 00:08:43.429
Chris Penick | Axcel ILT: bad assumptions.
53
00:08:43.500 --> 00:08:56.370
Chris Penick | Axcel ILT: So a lot of it goes in there. So the components, right? It's a tough task to ensure that an algorithm... and that's our first buzzword, okay? "Algorithm": anytime you hear me say that, you just think "fancy recipe."
54
00:08:56.540 --> 00:09:10.789
Chris Penick | Axcel ILT: A recipe to calculate something, right? Even when you're using ChatGPT — and this is the 900,000-foot view, not the 80,000-foot view or the 10,000-foot view — your words are essentially turned into numbers,
55
00:09:10.850 --> 00:09:12.809
Chris Penick | Axcel ILT: big, long strings of numbers.
56
00:09:12.970 --> 00:09:14.460
Chris Penick | Axcel ILT: and
57
00:09:14.500 --> 00:09:19.590
Chris Penick | Axcel ILT: what GPTs — generative pre-trained transformers — are doing is they are
58
00:09:20.940 --> 00:09:35.959
Chris Penick | Axcel ILT: doing big math calculations of what's the next likely, probable word that I could use to give you a reply that you want, right? So that decision... if you think about all that data going in there, safety, where that data came from...
59
00:09:36.020 --> 00:09:42.820
Chris Penick | Axcel ILT: it's very difficult to decide if AI is reliable or not without evaluating
60
00:09:43.100 --> 00:09:45.890
Chris Penick | Axcel ILT: the process of how it came to its conclusion.
61
00:09:46.240 --> 00:09:55.329
Chris Penick | Axcel ILT: So unless we can see some of these things, right? How it came to whatever choice, whether that's a simple model that's giving you, "Hey, buy your stock at this price,"
62
00:09:55.790 --> 00:10:01.400
Chris Penick | Axcel ILT: or it's something more complicated like "help me write this essay." No kids ever do that, right?
63
00:10:01.870 --> 00:10:10.379
Chris Penick | Axcel ILT: Those things in there... where are they getting those? The ethics surrounding that... that's a side topic that I'd love to get into, and I wish I had a whole lot of time to talk about it.
64
00:10:10.720 --> 00:10:27.480
Chris Penick | Axcel ILT: We have a 2-day course that covers some of these: ethics, security, and explainability. But as far as right now, I'm gonna just touch on the explainable part. So let's look at this, then. Alright, so we need some components of how we decide that an AI algorithm — a "model"
65
00:10:27.610 --> 00:10:30.689
Chris Penick | Axcel ILT: is another word I'm going to use. We'll get to that there.
66
00:10:30.860 --> 00:10:36.310
Chris Penick | Axcel ILT: How do we decide that it is interpretable, right? That I can explain it.
67
00:10:36.500 --> 00:10:48.449
Chris Penick | Axcel ILT: and it's kind of like. There's a lovely video series that I see all the time where they get some expert in a particular topic, and they have them explain it at like 5 different levels of complexity
68
00:10:48.570 --> 00:10:50.950
Chris Penick | Axcel ILT: from someone who would have
69
00:10:51.310 --> 00:11:02.809
Chris Penick | Axcel ILT: a PhD on the topic, down to a 5-year-old, and, you know, just lay people like you and me somewhere in between there. And they explain it at different levels — the same topic, but at different levels.
70
00:11:02.840 --> 00:11:04.970
Chris Penick | Axcel ILT: when we think about interpretability.
71
00:11:05.380 --> 00:11:08.730
Chris Penick | Axcel ILT: that's what we need for AI. You know.
72
00:11:09.380 --> 00:11:15.290
Chris Penick | Axcel ILT: My kids roll their eyes. I have, you know, a 12- and a 16-year-old still at home.
73
00:11:15.320 --> 00:11:25.520
Chris Penick | Axcel ILT: They roll their eyes when I start to explain it to them, and they play with AI tools all the time. I mean, they have it in this thing, right in their little phone. They have it, you know; they can talk to a
74
00:11:26.600 --> 00:11:30.810
Chris Penick | Axcel ILT: chatbot, basically, that's got AI backing it. Alright.
75
00:11:30.950 --> 00:11:39.839
Chris Penick | Axcel ILT: And to them it's just normal. There you go. But who gave it what it knows? You know, it's like... you can't trust anything from the Internet, but you're gonna trust something from that?
76
00:11:39.970 --> 00:11:52.790
Chris Penick | Axcel ILT: So, that transparency of where that data came from... because the term — we'll talk about this with generative AI in just a moment — "generative": we need to generate, we need to create new content.
77
00:11:52.880 --> 00:12:09.310
Chris Penick | Axcel ILT: Right? Generative AI has been fed lots of data and sees patterns in that data, and now can turn that into new content for you. So that raises a whole other issue. You hear of authors now suing, like against OpenAI, because, hey,
78
00:12:09.680 --> 00:12:20.710
Chris Penick | Axcel ILT: you know, your GPT read my book, and now it's spitting out parts of my book. That got thrown out in some cases, because what it's doing is generating things like the book,
79
00:12:21.840 --> 00:12:25.510
Chris Penick | Axcel ILT: all right. But the book may have been fed to it in the first place.
80
00:12:25.710 --> 00:12:32.550
Chris Penick | Axcel ILT: So, that transparency. OpenAI has not gotten up and said, "Hey, by the way, here's all the data that we used to train." They do not.
81
00:12:32.810 --> 00:12:57.130
Chris Penick | Axcel ILT: Now, "Open" may be in their name — and by the way, the views and opinions are those of this instructor right here — but "Open" may be in their name, and they're not exactly forthcoming about exactly what data was used to teach it. Google, you know, Claude, some of these others: it's the same case. Now, there are some completely open models out there where we do know exactly what data was used to train the model.
82
00:12:57.330 --> 00:12:58.230
Chris Penick | Axcel ILT: So
83
00:12:58.480 --> 00:13:18.340
Chris Penick | Axcel ILT: And who's accountable for that? Accountability. So each of these applies to the data: where do they get the data from? What's the model? You know, is it transparent about what it's doing, how it's making these decisions? And what operations can I do on that? Can I tweak it? You know, we talk about — in AI courses, and if you come visit me in machine learning or in a
84
00:13:18.610 --> 00:13:28.189
Chris Penick | Axcel ILT: generative AI course — we'll talk about how we adjust weights and biases. And the use of the word "biases" there is different, but, you know, biases towards certain
85
00:13:28.630 --> 00:13:30.060
Chris Penick | Axcel ILT: pieces of input
86
00:13:30.200 --> 00:13:43.579
Chris Penick | Axcel ILT: how we lean one way or another. So I'm gonna do some bad sketches in just a second here. And I'm going to break this down here and say, first of all, that AI... we have to understand that AI is just
87
00:13:44.650 --> 00:13:48.789
Chris Penick | Axcel ILT: machine learning grown up. How's that? No, that's that's a little bit
88
00:13:49.830 --> 00:13:51.530
Chris Penick | Axcel ILT: Now, try it this route.
89
00:13:52.170 --> 00:14:08.630
Chris Penick | Axcel ILT: People say that, but it does not give you the source... that isn't exactly right. It'll give you sources if you have an Internet plugin. ChatGPT doesn't work that way, actually; it does not browse the web actively without the plugin.
90
00:14:09.260 --> 00:14:18.509
Chris Penick | Axcel ILT: But things like Perplexity... if you go look at Perplexity, which can use multiple models on the back end... there's a question from the chat there.
91
00:14:18.700 --> 00:14:33.649
Chris Penick | Axcel ILT: You know, ask it — try it, ask it its source. And the thing is, ask it how it came to that conclusion. You'll see that it's not always forthcoming, it's not always exactly accurate. And even then, remember, it's generating
92
00:14:33.840 --> 00:14:35.010
Chris Penick | Axcel ILT: information
93
00:14:35.240 --> 00:14:47.940
Chris Penick | Axcel ILT: That's why you can get a hallucination. And — if I mispronounce anybody's name, I apologize ahead of time — but yeah, it's a really good question. And the thing to take from that is to remember that
94
00:14:48.200 --> 00:14:54.589
Chris Penick | Axcel ILT: the generative word in Gpt generative, we need to generate, to create, to make new right.
95
00:14:55.070 --> 00:14:58.269
Chris Penick | Axcel ILT: So what it will often do is tell you
96
00:14:58.490 --> 00:15:01.790
Chris Penick | Axcel ILT: what you want to hear. That's called hallucination.
97
00:15:01.930 --> 00:15:07.510
Chris Penick | Axcel ILT: Yeah, it won't do. There you go. I'll be. Yeah. Wait's got a question, too. Wait
98
00:15:07.770 --> 00:15:15.630
Chris Penick | Axcel ILT: Hold on one sec... or if you can put it in the chat, let me grab... but so yeah, if you can put it in the chat, we'll get it from there
99
00:15:15.920 --> 00:15:28.000
Chris Penick | Axcel ILT: in just a second, and I'll try it. I'm going to make sure, too, that there's some time that we can answer a bunch of these questions all at once. So, on the left side... if anybody's a programmer — and I know I've probably got some programmers in this group, got some AI and data scientist
100
00:15:28.170 --> 00:15:33.330
Chris Penick | Axcel ILT: experts. I've got some brand new beginners. I've got everybody in between.
101
00:15:33.430 --> 00:15:44.860
Chris Penick | Axcel ILT: Right. So on the left side is what we do without machine learning. Let's just call this programming. On the left side is programming, right? We just go in and we say, okay, I want to write a program.
102
00:15:45.950 --> 00:15:46.800
Chris Penick | Axcel ILT: alright.
103
00:15:48.130 --> 00:15:55.529
Chris Penick | Axcel ILT: And we just write that. There we go. Okay... turns out my pen's misaligned, actually. Give me two secs, I will fix this real quick.
104
00:15:59.620 --> 00:16:02.739
Chris Penick | Axcel ILT: that I set it in. Let
105
00:16:04.730 --> 00:16:05.710
Chris Penick | Axcel ILT: here we go.
106
00:16:11.740 --> 00:16:12.520
Chris Penick | Axcel ILT: Here we go.
107
00:16:13.050 --> 00:16:22.500
Chris Penick | Axcel ILT: It was mapped to the wrong area. Now my pen looks better. Yay, okay, good. Alright. There we go. So left side. We call this programming right?
108
00:16:23.020 --> 00:16:25.730
Chris Penick | Axcel ILT: We'll just go over here, and we'll just say...
109
00:16:25.760 --> 00:16:32.880
Chris Penick | Axcel ILT: then we got this little line over here. So with programming, you have to give very, very specific instructions. You have to say: this is what I want you to do.
110
00:16:32.900 --> 00:16:54.759
Chris Penick | Axcel ILT: So you often have very specific instructions. You also often have very specific data, not mentioned in this lovely drawing. And I have resources at the end of this of where some of these drawings come from; these are not mine. I went to geek school, not art school, so these are even better than what I could have done for you. So you get to survive my drawings right here. But on the other side of this, alright, when we talk about machine learning,
111
00:16:54.850 --> 00:16:58.020
Chris Penick | Axcel ILT: which is, yeah, the basis of our AI,
112
00:16:58.260 --> 00:17:03.119
Chris Penick | Axcel ILT: what we're doing is we're giving it lots of data, right? And we're saying, okay, take this data.
113
00:17:03.730 --> 00:17:18.530
Chris Penick | Axcel ILT: And we can do it a couple of different ways. We have supervised and unsupervised — we can get into more detail, but basically: do we have the answers or not? Do we have the predictions? So with that data, we collect a lot of these features, and we know that X leads to Y.
114
00:17:18.990 --> 00:17:25.279
Chris Penick | Axcel ILT: Alright. So we say, you know, we know this... if we have a bunch of examples, we say, okay:
115
00:17:25.410 --> 00:17:30.329
Chris Penick | Axcel ILT: if you wanted to teach math to a machine learning model.
116
00:17:30.680 --> 00:17:38.629
Chris Penick | Axcel ILT: you could just give it examples of okay, if you see 2 plus 2, then you give me a 4. If you see 2 plus 3, then you give me a 5,
117
00:17:38.890 --> 00:17:50.280
Chris Penick | Axcel ILT: and so on down the line. Alright. And this is actually why they're not so great at it — ChatGPT was not trained this way, alright. It can be tuned this way, but it was not trained this way.
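The "show it examples" idea can be sketched as a tiny supervised learner. To be clear, this is a toy, not how ChatGPT was trained: plain gradient descent on invented (X, Y) pairs, where X is two numbers and Y is their sum, so the model discovers the rule "add them" instead of being programmed with it.

```python
# Supervised learning in miniature: instead of coding "add the two numbers,"
# we show the model examples (X leads to Y) and let gradient descent find
# weights that reproduce the answers.
examples = [((2, 2), 4), ((2, 3), 5), ((1, 4), 5), ((3, 3), 6), ((0, 5), 5)]

w1, w2 = 0.0, 0.0     # start knowing nothing
lr = 0.01             # learning rate
for _ in range(2000): # repeatedly nudge the weights toward the answers
    for (a, b), y in examples:
        pred = w1 * a + w2 * b
        err = pred - y
        w1 -= lr * err * a
        w2 -= lr * err * b

# After training, both weights land near 1.0 — i.e., it "learned" addition
print(round(w1, 2), round(w2, 2))
# And it generalizes to an input it never saw:
print(round(w1 * 7 + w2 * 5))
```

The point of the contrast: in programming the rule is explicit and inspectable; here the "rule" lives in learned numbers, which is exactly where explainability starts to get hard.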
118
00:17:50.330 --> 00:17:58.970
Chris Penick | Axcel ILT: It was trained by basically saying: oh, here's a bunch of interesting data, let's start finding patterns within that. Let's start finding words — what comes next?
119
00:17:59.140 --> 00:18:01.079
Chris Penick | Axcel ILT: If "the quick"
120
00:18:02.060 --> 00:18:08.670
Chris Penick | Axcel ILT: is the phrase that you have, what's the next word that comes out there? Somebody's gonna say "brown," 'cause they know "the quick brown fox jumped over the lazy dog."
121
00:18:08.780 --> 00:18:11.929
Chris Penick | Axcel ILT: Right? So there's a likelihood there's a percentage
122
00:18:12.490 --> 00:18:17.390
Chris Penick | Axcel ILT: a probability that after you see the word quick you will probably see the word brown
123
00:18:17.780 --> 00:18:21.520
Chris Penick | Axcel ILT: in the Corpus, the collection of all the data
124
00:18:21.990 --> 00:18:32.550
Chris Penick | Axcel ILT: that they fed to the original GPT. And now we're up to, what, 4? And 5 is coming sometime — probably not right away, but it's coming.
125
00:18:32.860 --> 00:18:46.909
Chris Penick | Axcel ILT: Alright. So "the quick brown"... then again, there's a certain percentage of likelihood of getting "fox," you know, next. Or "my favorite food is..." Now, part of this relies on the fact that when we put things into things like GPT, they get turned into numbers.
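A toy version of that next-word idea, using simple bigram counts over a made-up three-sentence corpus. Real GPTs use transformers trained on enormous corpora, but the underlying notion — a probability for each candidate next word — is the same:

```python
# Count, in a tiny corpus, which word most often follows each word,
# then turn the counts into next-word probabilities.
from collections import Counter, defaultdict

corpus = (
    "the quick brown fox jumped over the lazy dog "
    "the quick brown fox ran away "
    "the slow brown dog slept"
).split()

follows = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    follows[word][nxt] += 1

def predict_next(word):
    counts = follows[word]
    total = sum(counts.values())
    # Each candidate with its probability, most likely first
    return [(w, c / total) for w, c in counts.most_common()]

print(predict_next("quick"))  # "brown" every time in this corpus
print(predict_next("brown"))  # "fox" more likely than "dog"
```

Notice the probabilities fall straight out of what the corpus happened to contain — which is also exactly how bias in the training data becomes bias in the predictions.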
126
00:18:47.120 --> 00:19:10.119
Chris Penick | Axcel ILT: Alright, there is a classic algorithm for this, a recipe for doing this: something called word2vec. There are all kinds of ways to turn this into numbers, into data that the model can understand. And so we call that "embedding," right? We take these words and we go, oh, okay, I'm gonna make these words into something that, basically... it's a big math problem. So this learner,
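Embedding in miniature. The 3-number vectors below are invented by hand purely for the sketch (word2vec learns hundreds of dimensions from real text), but they show the point: once words are numbers, "related" becomes a math problem — here, cosine similarity.

```python
# Hand-made toy "embeddings": each word is just a small vector of numbers.
# Related words are given nearby vectors on purpose for the illustration.
import math

embedding = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "pizza": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    # Cosine similarity: 1.0 = same direction, near 0 = unrelated
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(embedding["king"], embedding["queen"]))  # high: related words
print(cosine(embedding["king"], embedding["pizza"]))  # low: unrelated words
```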
127
00:19:10.510 --> 00:19:20.129
Chris Penick | Axcel ILT: that's what it's doing. The model is learning: oh, from all the data that I've seen, this is the likelihood that this is the next word. So if I say,
128
00:19:20.320 --> 00:19:23.120
Chris Penick | Axcel ILT: "Little Bo..."
129
00:19:23.310 --> 00:19:27.969
Chris Penick | Axcel ILT: if you're American, you might know "Little Bo Peep." Or if "Jack and Jill went up the..."
130
00:19:28.580 --> 00:19:40.329
Chris Penick | Axcel ILT: the next word might be "hill," right? Because you've seen it in a nursery rhyme, or heard it in a nursery rhyme. Or "my country, 'tis of..." — somebody knows the rest, right? Whatever; again, because you've seen it somewhere.
131
00:19:40.710 --> 00:19:47.010
Chris Penick | Axcel ILT: Now, that doesn't mean that it couldn't get it differently. There is some randomness added to when it
132
00:19:47.390 --> 00:19:57.790
Chris Penick | Axcel ILT: starts to predict these. So the model that's built from this, right, is saying, oh, okay, here's the likelihood. And we could even adjust that. Probably many of you have a ChatGPT account.
133
00:19:57.960 --> 00:20:05.910
Chris Penick | Axcel ILT: You may have the pro. If you have the pro, you can play with things like adjusting the temperature. The temperature is the randomness, so to speak.
134
00:20:05.960 --> 00:20:11.759
Chris Penick | Axcel ILT: If I asked everybody here, just real quick, okay: "my favorite food is..."
135
00:20:12.190 --> 00:20:21.890
Chris Penick | Axcel ILT: I bet you there's a high percentage that a lot of them would say pizza. There's maybe a lot of tacos, maybe a lot of burritos. And that's the same thing for any word choice within
136
00:20:21.910 --> 00:20:35.519
Chris Penick | Axcel ILT: these GPT models: some words have higher percentages, and you can tell it, "you know what, don't always pick the highest one; give me a little bit more randomness." And part of that is those... we call those parameters... that we can pass to it.
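What that randomness knob does, in miniature: temperature reshapes the next-word probabilities before the model samples one. The words and scores below are made up for the example; the softmax-with-temperature formula is the standard one used for this kind of sampling.

```python
# Temperature demo on invented next-word scores for "my favorite food is..."
import math

scores = {"pizza": 2.0, "tacos": 1.0, "burritos": 0.5}

def next_word_probs(scores, temperature):
    # Softmax with temperature: low T sharpens toward the top choice,
    # high T flattens toward uniform randomness.
    exps = {w: math.exp(s / temperature) for w, s in scores.items()}
    total = sum(exps.values())
    return {w: e / total for w, e in exps.items()}

cold = next_word_probs(scores, 0.2)  # nearly always "pizza"
hot = next_word_probs(scores, 2.0)   # much more even spread
print(round(cold["pizza"], 3), round(hot["pizza"], 3))
```

This is also why the same prompt can give different answers at different times: with any temperature above the minimum, the model is sampling from a distribution, not looking up one fixed reply.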
137
00:20:35.830 --> 00:20:47.919
Chris Penick | Axcel ILT: So this is our nutshell, all right. You want more about this? Come take an ML class, and I will gladly go into gory detail, and we will build stuff. We will write code. We will make things happen. Okay,
138
00:20:48.420 --> 00:20:50.370
Chris Penick | Axcel ILT: alright.
139
00:20:50.740 --> 00:20:51.870
Chris Penick | Axcel ILT: So
140
00:20:52.220 --> 00:20:56.080
Chris Penick | Axcel ILT: What that means, then, is that model... let's go back to that picture.
141
00:20:56.250 --> 00:20:59.799
Chris Penick | Axcel ILT: some models. We don't know how they work inside
142
00:21:00.090 --> 00:21:05.360
Chris Penick | Axcel ILT: Alright. Not only do we not know how they work... even though we
143
00:21:05.590 --> 00:21:16.569
Chris Penick | Axcel ILT: have a rough idea of what's going on, we don't know the exact things. I'm going to give you a little picture in just a second here. But the other way to think about this is
144
00:21:16.630 --> 00:21:25.080
Chris Penick | Axcel ILT: that we pass parameters to a model. That's the input, along with the data. So we pass the data, and think of it like configuration settings, if you will.
145
00:21:26.260 --> 00:21:28.830
Chris Penick | Axcel ILT: alright, we pass that to the model.
146
00:21:30.400 --> 00:21:33.089
Chris Penick | Axcel ILT: But the better way... would we say "input"
147
00:21:34.750 --> 00:21:43.330
Chris Penick | Axcel ILT: and config settings? You know, that temperature I told you about... there are others there. When I add that to the model, alright,
148
00:21:43.850 --> 00:21:51.990
Chris Penick | Axcel ILT: changing this... let's say I say, you know, change the temperature from maybe 0.7 to
149
00:21:52.460 --> 00:22:04.440
Chris Penick | Axcel ILT: 0.9. Maybe I want it to give me the most common thing all the time... it's a 0-to-1 scale, basically. Or maybe I put it way down here: be as random as you want, be crazy, just give me weird stuff.
150
00:22:04.990 --> 00:22:10.220
Chris Penick | Axcel ILT: If that's the parameter, then how can I... can I explain exactly
151
00:22:11.240 --> 00:22:20.229
Chris Penick | Axcel ILT: why I get the results I do? And with most models now, they're black box: Claude, you know, Bard, Gemini,
152
00:22:20.630 --> 00:22:22.690
Chris Penick | Axcel ILT: Now, Gemma,
153
00:22:22.830 --> 00:22:33.220
Chris Penick | Axcel ILT: GPT, whatever version you want to play with. Sora for the video... I'm amazed that Sora, from OpenAI, creates live video from a text prompt, basically.
154
00:22:33.450 --> 00:22:37.359
Chris Penick | Axcel ILT: But again, why, why did you pick that image? Why did you pick those colors?
155
00:22:37.470 --> 00:22:45.429
Chris Penick | Axcel ILT: Where did you pick that from, you know? If I asked it for people: why did you pick those people? Why did you pick that race? Why did you pick that
156
00:22:45.440 --> 00:22:47.810
Chris Penick | Axcel ILT: height, that gender? It
157
00:22:47.870 --> 00:22:51.500
Chris Penick | Axcel ILT: can't tell me. There is no way to look at that.
158
00:22:51.590 --> 00:23:03.769
Chris Penick | Axcel ILT: Now, the other side of that... and I see there's a question in Q&A, too. I'll get to those; I'm trying to rearrange my windows with my queues. Oh, "configuration set"... so I would say, it's just, when you...
159
00:23:03.820 --> 00:23:05.549
Chris Penick | Axcel ILT: configuration. Okay? So
160
00:23:05.690 --> 00:23:07.760
Chris Penick | Axcel ILT: on the question here, it just
161
00:23:08.000 --> 00:23:09.680
Chris Penick | Axcel ILT: settings that I can give
162
00:23:09.870 --> 00:23:13.940
Chris Penick | Axcel ILT: right? Yeah, the higher, exactly. So one would be higher. There you go!
163
00:23:14.440 --> 00:23:18.880
Chris Penick | Axcel ILT: Oh, man, there's a bunch going on in there!
164
00:23:20.000 --> 00:23:23.749
Chris Penick | Axcel ILT: "How does math return an answer when the user asks a question?" Right?
165
00:23:24.150 --> 00:23:33.650
Chris Penick | Axcel ILT: Sure, it's predicted, right? It seems so. Okay, so there's a couple of questions here. The configuration set, that's just my word for the
166
00:23:33.910 --> 00:23:37.120
Chris Penick | Axcel ILT: parameters are just information
167
00:23:38.170 --> 00:23:41.340
Chris Penick | Axcel ILT: changing settings on how you want the model to behave
168
00:23:42.340 --> 00:24:06.530
Chris Penick | Axcel ILT: right? So think of it this way: in your car, you have settings for your seat, where your steering wheel is, the radio, whether you've got heated seats, something like that. On my car, I've got a little button I can push that sets it to econo or sport mode, or the 4-wheel drive is on all the time, but you can
169
00:24:06.640 --> 00:24:17.390
Chris Penick | Axcel ILT: change it so that supposedly it gets better fuel economy. So you pass it a parameter to change the way the engine is driving. The net result is the same: I'm going forward,
170
00:24:17.480 --> 00:24:25.639
Chris Penick | Axcel ILT: I'm getting some transportation out of this. But the way it's doing that has changed. Exactly what magic is happening under the hood,
171
00:24:26.060 --> 00:24:29.480
Chris Penick | Axcel ILT: that I could go find out. But unlike that,
172
00:24:29.900 --> 00:24:35.599
Chris Penick | Axcel ILT: with ChatGPT, I can't go find out exactly what magic is happening underneath the hood.
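The temperature knob discussed above can be sketched in plain Python. This is not OpenAI's internal code, just the standard way temperature is usually applied: the model's raw scores (logits) are divided by the temperature before being turned into probabilities, so a low temperature concentrates probability on the most common token, while a high temperature flattens the choices toward "give me weird stuff."

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw model scores into probabilities, scaled by temperature."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Three candidate next tokens with made-up raw scores:
logits = [2.0, 1.0, 0.5]
cold = softmax_with_temperature(logits, 0.2)  # low temp: top token dominates
hot = softmax_with_temperature(logits, 2.0)   # high temp: choices flatten out
```

At temperature 0.2 the top-scoring token takes nearly all the probability mass; at 2.0 the three candidates are much closer together, which is why high temperature reads as random.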
173
00:24:35.760 --> 00:24:50.359
Chris Penick | Axcel ILT: So that gets at the idea of math predicting the answer to a question, right? If you're asking a question, a question is just text, right? Okay? It's a question to you because you put a question mark on the end of it.
174
00:24:50.510 --> 00:24:57.400
Chris Penick | Axcel ILT: Alright, but to ChatGPT, that question mark is just a token, a token meaning that you asked a question, meaning that you want
175
00:24:57.450 --> 00:25:08.230
Chris Penick | Axcel ILT: an answer. So a token, as in a word or a piece of a word. And, Kate, yeah, I flipped the numbers when I was talking about the temperature there a little bit, right?
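A toy illustration of what "just a token" means. Real GPT models use byte-pair encoding (available via the tiktoken library), not this simple regex split; the sketch below only shows the idea that punctuation like the question mark becomes its own unit in the sequence the model sees.

```python
import re

def toy_tokenize(text):
    """Split text into word and punctuation tokens (illustrative only)."""
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("What is a token?")
# The '?' is just another element in the sequence:
# ['What', 'is', 'a', 'token', '?']
```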
176
00:25:08.450 --> 00:25:16.430
Chris Penick | Axcel ILT: Yeah, for a second there. But the thing is, you know, it is predicting text. Well, it's not really predicting text from nothing; it's rather
177
00:25:16.900 --> 00:25:20.789
Chris Penick | Axcel ILT: that it's already been trained, Steven. It's been trained on
178
00:25:22.120 --> 00:25:35.779
Chris Penick | Axcel ILT: gigabytes and gigabytes of data. So ChatGPT has already been given mounds and mounds of data. It's not just getting data from you, it's not just your input, and it's not searching the Internet either,
179
00:25:36.080 --> 00:25:47.329
Chris Penick | Axcel ILT: by default. That's not what ChatGPT does. It's going through its built-in model of the way all that data relates. So we give it a bunch of text.
180
00:25:47.940 --> 00:25:48.740
Chris Penick | Axcel ILT: Yeah.
181
00:25:49.910 --> 00:25:51.260
Chris Penick | Axcel ILT: a little bit of both.
182
00:25:51.300 --> 00:26:09.109
Chris Penick | Axcel ILT: So there's a question about how it's been given tons of information: by people or by machines themselves? Both. So OpenAI actually went through a stage when they were building their model where they did something called InstructGPT, which is where they essentially had humans correcting the responses given,
183
00:26:09.430 --> 00:26:17.690
Chris Penick | Axcel ILT: and they paid Nigerian workers very little for that work, yes, to basically
184
00:26:17.900 --> 00:26:20.529
Chris Penick | Axcel ILT: review responses and correct those responses
185
00:26:20.670 --> 00:26:32.769
Chris Penick | Axcel ILT: in there. But how it's given dumps of information, it's just fed; that's a little bit beyond the scope of here. Suffice to say that you could go find... here, actually, let's do this.
186
00:26:33.550 --> 00:26:34.970
Chris Penick | Axcel ILT: You could start with this.
187
00:26:35.090 --> 00:26:39.289
Chris Penick | Axcel ILT: There. You wanna build your own? Let's do that.
188
00:26:39.630 --> 00:26:46.250
Chris Penick | Axcel ILT: Viviana, discard that. Let's go to The Pile. This is not the one,
189
00:26:46.530 --> 00:26:47.769
Chris Penick | Axcel ILT: but it's a start.
190
00:26:48.700 --> 00:26:59.199
Chris Penick | Axcel ILT: So The Pile is a collection of data: 825 GiB, gibibytes actually, to be exact. And there is a difference between a gibibyte and a gigabyte; don't worry about it. But it's in
191
00:26:59.910 --> 00:27:03.029
Chris Penick | Axcel ILT: what we call JSON Lines format. Basically, it's
192
00:27:03.430 --> 00:27:09.630
Chris Penick | Axcel ILT: a bunch of data with sort of question, response, question, response, question, response.
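JSON Lines is simply one JSON object per line of text. Here is a hypothetical two-record sample in the question/response shape described above, and how you'd read it in Python. (Records in the actual Pile are shaped a bit differently, roughly raw text plus metadata, but the one-object-per-line format is the same idea.)

```python
import json

# Two made-up records, one JSON object per line (the JSONL format):
jsonl_data = '''{"prompt": "What color is the sky?", "response": "Blue."}
{"prompt": "What is 2 + 2?", "response": "4."}'''

# Each line parses independently, so huge files can be streamed line by line:
records = [json.loads(line) for line in jsonl_data.splitlines()]
```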
193
00:27:10.230 --> 00:27:11.500
Chris Penick | Axcel ILT: It's been given there.
194
00:27:11.510 --> 00:27:24.179
Chris Penick | Axcel ILT: And that data is loaded up into, you know... by the way, AI models would not be possible without having large clusters of computers with GPUs
195
00:27:24.260 --> 00:27:34.749
Chris Penick | Axcel ILT: or TPUs, basically fancy computer chips that are capable of dealing with large collections of numbers. We'll get to that name. So we're gonna change this a little bit here.
196
00:27:34.810 --> 00:27:42.799
Chris Penick | Axcel ILT: So we take all this data, alright. And you can go look at it, by the way, the data, where it came from, if you look at the paper on it. Actually, there you go,
197
00:27:42.930 --> 00:27:46.920
Chris Penick | Axcel ILT: Bring that up for a second. It's right about here. Yeah.
198
00:27:49.080 --> 00:27:54.210
Chris Penick | Axcel ILT: there you go. So the original Pile that is used is kind of like a
199
00:27:54.350 --> 00:28:04.770
Chris Penick | Axcel ILT: sort of a report card for various models, because they can test how well they work against The Pile. Part of it is what we call the Common Crawl. Yeah. So
200
00:28:05.170 --> 00:28:06.210
Chris Penick | Axcel ILT: they
201
00:28:06.400 --> 00:28:16.759
Chris Penick | Axcel ILT: The Pile here, for example: Wikipedia, OpenWebText, the National Institutes of Health, the US Patent and Trademark Office, FreeLaw. All these places
202
00:28:16.860 --> 00:28:18.439
Chris Penick | Axcel ILT: were gathered together.
203
00:28:18.670 --> 00:28:27.589
Chris Penick | Axcel ILT: Alright. By the way, there's more detail in the paper about exactly what those Pile sets were. There you go: Gutenberg, which is a bunch of books,
204
00:28:27.730 --> 00:28:29.789
Chris Penick | Axcel ILT: Ubuntu IRC chat.
205
00:28:29.880 --> 00:28:37.910
Chris Penick | Axcel ILT: So if you want help on Linux, there you go. YouTube subtitles, which means it didn't actually watch the YouTube shows and videos,
206
00:28:37.940 --> 00:28:46.660
Chris Penick | Axcel ILT: but it could read all the subtitles from them. So that's where this information comes from. So how do you think it knows that a YouTube video exists?
207
00:28:46.790 --> 00:28:48.580
Chris Penick | Axcel ILT: So this is called the pile
208
00:28:48.780 --> 00:29:03.919
Chris Penick | Axcel ILT: ArXiv, there you go. What program? Lots; basically OpenAI's back-end software. Or if you could write that... you don't. You ask for a trillion dollars' worth
209
00:29:04.130 --> 00:29:07.470
Chris Penick | Axcel ILT: of equipment, and
210
00:29:07.710 --> 00:29:11.870
Chris Penick | Axcel ILT: then you can build it. You need a cluster of computers. The paper itself is here.
211
00:29:12.710 --> 00:29:16.509
Chris Penick | Axcel ILT: This is just one of many collections.
212
00:29:16.770 --> 00:29:25.100
Chris Penick | Axcel ILT: Alright, so let's back this up for a second here. Because again, I'll try to keep it focused on explainability,
213
00:29:25.280 --> 00:29:29.690
Chris Penick | Axcel ILT: but the big thing to understand with most transformer models:
214
00:29:29.730 --> 00:29:42.409
Chris Penick | Axcel ILT: they were given a fixed set of data, very large. That fixed set of data was used to train them, to teach them how words relate together.
215
00:29:43.310 --> 00:29:51.939
Chris Penick | Axcel ILT: Then they can generate content based on what they've seen before. They're not spitting it out word for word; they're making new
216
00:29:52.090 --> 00:29:57.090
Chris Penick | Axcel ILT: content based on that. You know, the word "new" there is kind of a throwaway word,
217
00:29:57.140 --> 00:30:02.010
Chris Penick | Axcel ILT: right? But that data uses a very specific format, something called JSON Lines.
218
00:30:02.080 --> 00:30:10.840
Chris Penick | Axcel ILT: You can get a corporate account with OpenAI, and you could add your own data
219
00:30:10.940 --> 00:30:13.790
Chris Penick | Axcel ILT: and use it to do what we call fine tuning
220
00:30:14.110 --> 00:30:25.079
Chris Penick | Axcel ILT: the general model. So it's still got the big generic set of data out there. There are places that do have Internet access for
221
00:30:25.270 --> 00:30:31.289
Chris Penick | Axcel ILT: their models, that are incorporating both pre-trained data (see, that's the word: pre-trained)
222
00:30:31.490 --> 00:30:39.389
Chris Penick | Axcel ILT: and live access. OpenAI, by default ChatGPT, doesn't do that; you have to use a plugin for that.
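For the fine-tuning mentioned above, OpenAI's service expects training examples as JSON Lines. At the time of writing, the chat models take a messages array per line, in the shape sketched below; check the current docs, since (as noted in this session) the API keeps changing. The company name and conversation here are invented for illustration.

```python
import json

# One hypothetical fine-tuning example: a conversation the model should learn to imitate.
example = {
    "messages": [
        {"role": "system", "content": "You are a support bot for AcmeCo."},  # AcmeCo is made up
        {"role": "user", "content": "How do I reset my password?"},
        {"role": "assistant", "content": "Go to Settings, then Security, then Reset Password."},
    ]
}

# Each example becomes one line of the .jsonl training file you upload:
line = json.dumps(example)
```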
223
00:30:39.450 --> 00:30:46.640
Chris Penick | Axcel ILT: In other words, it has a fixed point in time. Ask it what its knowledge cutoff is. If you have ChatGPT open, ask ChatGPT,
224
00:30:46.710 --> 00:30:47.870
Chris Penick | Axcel ILT: For your
225
00:30:47.940 --> 00:30:52.620
Chris Penick | Axcel ILT: version, your account, what its knowledge cutoff is,
226
00:30:52.750 --> 00:31:04.169
Chris Penick | Axcel ILT: and, you know, they just upped it for the Pro subscribers, yeah, because they're constantly adding more training. They're gathering more data, and they're feeding it to this cluster of computers and saying: find patterns in there.
227
00:31:04.310 --> 00:31:06.060
Chris Penick | Axcel ILT: Alright. So that's kind of the
228
00:31:06.090 --> 00:31:13.850
Chris Penick | Axcel ILT: that's a whole other thing. We do a whole course on it: you want to build GenAI, you want to learn about the components that make up GenAI, as in generative AI?
229
00:31:13.870 --> 00:31:15.900
Chris Penick | Axcel ILT: We could do that alright.
230
00:31:16.570 --> 00:31:21.739
Chris Penick | Axcel ILT: So let's think... there are lots of different ways to do this, but essentially
231
00:31:21.940 --> 00:31:28.810
Chris Penick | Axcel ILT: they come up with choices. So I'm gonna give you a sort of a bad sketch to kind of flesh this out a little bit here. Right?
232
00:31:28.920 --> 00:31:41.969
Chris Penick | Axcel ILT: Let's say we wanted to build a model that was good at recognizing shapes, for image recognition. So we could do something like this. Each of these, we'll call this just a node, real quick.
233
00:31:43.630 --> 00:31:51.299
Chris Penick | Axcel ILT: Okay.
234
00:31:51.500 --> 00:32:03.149
Chris Penick | Axcel ILT: Alright, I was looking at the question, sorry. So I'm gonna put some lines in between here. We're gonna call this side over here input; we're gonna call this over here output. And this is just a
235
00:32:03.670 --> 00:32:11.770
Chris Penick | Axcel ILT: deep learning network here. You could call these sort of a neuron if you wanted to; typically I'll just use the term node
236
00:32:11.930 --> 00:32:19.490
Chris Penick | Axcel ILT: right here. And in a cluster of computers, so, for example, like OpenAI's model,
237
00:32:19.720 --> 00:32:27.109
Chris Penick | Axcel ILT: this is essentially a function. If anybody's worked with something called TensorFlow or Keras,
238
00:32:27.190 --> 00:32:38.390
Chris Penick | Axcel ILT: you can essentially define what a node is in code. It's a piece of software, and so this is a piece of software running in a cloud of computers. Don't try to wrap your head around it;
239
00:32:38.680 --> 00:32:40.329
Chris Penick | Axcel ILT: it's gonna hurt, right?
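The "a node is a piece of software" idea can be sketched in a few lines of plain Python. This is not the actual TensorFlow/Keras internals, just the concept: a node holds weights and a bias, takes inputs, computes a weighted sum, and fires (or doesn't) based on a threshold.

```python
def node(inputs, weights, bias):
    """One 'neuron': weighted sum of inputs plus bias, then a threshold."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0  # fire, or don't fire

# Hand-picked weights so the node fires only when both inputs are on:
both_on = node([1, 1], [0.6, 0.6], -1.0)   # 1.2 - 1.0 > 0, fires
one_on = node([1, 0], [0.6, 0.6], -1.0)    # 0.6 - 1.0 < 0, stays quiet
```

Training is the process of finding weights like these automatically rather than picking them by hand.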
240
00:32:40.530 --> 00:32:45.160
Chris Penick | Axcel ILT: Yeah, hold off on that one. Now we take that node,
241
00:32:45.810 --> 00:32:50.440
Chris Penick | Axcel ILT: and let's say, for example, we wanted to teach this thing how to
242
00:32:50.820 --> 00:32:57.399
Chris Penick | Axcel ILT: recognize shapes and colors. So the input could be, I could say: alright, I'm going to give you a
243
00:32:57.840 --> 00:32:59.290
Chris Penick | Axcel ILT: green triangle.
244
00:32:59.770 --> 00:33:10.779
Chris Penick | Axcel ILT: So let's put a green triangle in here. I show it a green triangle, and I convert that to digital, whether that's JPEG or TIFF or whatever. I convert it to some format, a bitmap,
245
00:33:11.000 --> 00:33:14.740
Chris Penick | Axcel ILT: some way that I can feed it into the computer so that it can recognize it.
246
00:33:14.760 --> 00:33:16.979
Chris Penick | Axcel ILT: Now, what it will do
247
00:33:17.090 --> 00:33:19.240
Chris Penick | Axcel ILT: part of the training is, you know.
248
00:33:19.460 --> 00:33:33.290
Chris Penick | Axcel ILT: right, exactly, the threshold. The node threshold is what sort of happens here. And this isn't exact; I'm giving you an idea of what may be happening. This gets back to our explainability. Let's say that the first level is just figuring out
249
00:33:33.630 --> 00:33:35.600
Chris Penick | Axcel ILT: what the edges are like.
250
00:33:35.770 --> 00:33:46.350
Chris Penick | Axcel ILT: what the edges are within the node. So it goes: oh, you've got an edge like that, or, within the input, sorry, do you have an edge like that? Or maybe you have no edges;
251
00:33:46.520 --> 00:33:47.419
Chris Penick | Axcel ILT: I'll put 0
252
00:33:49.350 --> 00:33:50.470
Chris Penick | Axcel ILT: something like that.
253
00:33:50.630 --> 00:33:59.260
Chris Penick | Axcel ILT: That's the first thing it does, maybe. So this first layer... The reason we call this deep learning is because it's got more than one of what we're going to call hidden layers.
254
00:34:00.030 --> 00:34:00.740
Chris Penick | Axcel ILT: alright.
255
00:34:03.700 --> 00:34:11.920
Chris Penick | Axcel ILT: See, they're "hidden" because... well, I know they exist. "Hidden" is a bad word; I think "opaque" would be a better word.
256
00:34:12.199 --> 00:34:33.520
Chris Penick | Axcel ILT: I know they exist. They're not hidden from me; I know those layers are there, I can see them all. What I don't know is how they're making the decisions they're making. And Matthew mentioned something about weights and nodes and the node threshold, right? Basically, there's a function where we can say: okay, decide whether or not it's true that it has an edge like this, yes or no.
257
00:34:33.630 --> 00:34:36.120
Chris Penick | Axcel ILT: Is it true that it has an edge like this? Yes or no
258
00:34:36.920 --> 00:34:46.789
Chris Penick | Axcel ILT: Sort of. And we call that an activation function. And there are all different ways, sigmoid, ReLU, all different ways to do activation functions. All you need to know is that this is basically just a decision point,
259
00:34:46.989 --> 00:34:57.849
Chris Penick | Axcel ILT: and it's making a choice based on the input, saying: yes, you have an edge like this; yes, you have an edge like that. Okay, so let me clean this up for a second. So let's go through what you might end up with. Erased the wrong thing.
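The two activation functions named above, written out in plain Python. Sigmoid squashes the weighted sum into a 0-to-1 "how strongly yes" value; ReLU zeroes out negatives and passes positives through. This is just the decision-point math itself, nothing framework-specific.

```python
import math

def sigmoid(x):
    """Squash any number into (0, 1): a soft yes/no."""
    return 1 / (1 + math.exp(-x))

def relu(x):
    """Rectified Linear Unit: zero out negatives, pass positives through."""
    return max(0.0, x)

# A strongly positive weighted sum reads as "yes, that feature is present";
# a negative one is shut off entirely by ReLU:
strong_yes = sigmoid(4.0)
shut_off = relu(-2.0)
```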
260
00:34:58.490 --> 00:35:02.400
Chris Penick | Axcel ILT: Let's put that back. Let's get this out of here. What you might end up with then.
261
00:35:03.330 --> 00:35:04.290
Chris Penick | Axcel ILT: you know.
262
00:35:05.200 --> 00:35:12.830
Chris Penick | Axcel ILT: Now, unfortunately, Mary, no. So Mary asked: is there a rule of thumb for knowing how many hidden layers are needed?
263
00:35:13.210 --> 00:35:24.099
Chris Penick | Axcel ILT: No, but I would highly recommend, you can see the effect of changing the number of layers. I'll show you a link in just a little bit called TensorFlow Playground; some of you, some of my other ML people, may have
264
00:35:24.130 --> 00:35:28.430
Chris Penick | Axcel ILT: seen that before. It's a good way to kind of guess. Yeah, other than experimentation,
265
00:35:28.560 --> 00:35:41.899
Chris Penick | Axcel ILT: no, there is not, typically. Because again, I don't even know for sure that this is why it's deciding what it is. But I'm going to label this layer.
266
00:35:42.280 --> 00:35:53.009
Chris Penick | Axcel ILT: Right, it's been trained in this example. So I'm going to label this layer "edge." So this layer figured out what the edges were,
267
00:35:53.470 --> 00:36:01.430
Chris Penick | Axcel ILT: and then this next layer says, you know, how about if it's got corners, and what those corners are like, if there are corners.
268
00:36:02.000 --> 00:36:08.819
Chris Penick | Axcel ILT: So we'll draw some corners here. Maybe we have sort of an acute angle, right? Maybe we have a very
269
00:36:09.050 --> 00:36:11.730
Chris Penick | Axcel ILT: wide angle like that. No, no, no.
270
00:36:11.810 --> 00:36:12.850
Chris Penick | Axcel ILT: still a line.
271
00:36:13.020 --> 00:36:14.080
It's still a line.
272
00:36:14.400 --> 00:36:18.149
Chris Penick | Axcel ILT: Maybe it doesn't have that... there you go, it just has the,
273
00:36:20.350 --> 00:36:21.950
Chris Penick | Axcel ILT: maybe
274
00:36:22.590 --> 00:36:25.230
Chris Penick | Axcel ILT: a right angle, just the line.
275
00:36:25.430 --> 00:36:36.170
Chris Penick | Axcel ILT: Or curved, if the corner of it is curved. And then maybe the last little bit might be something like... and again, I don't have control over this
276
00:36:36.350 --> 00:36:40.850
Chris Penick | Axcel ILT: entirely. There are lots of ways that I can influence its learning,
277
00:36:41.240 --> 00:36:51.809
Chris Penick | Axcel ILT: and that's what parameters are, going back to those config settings, as I call them. Parameters are some ways I can influence it. But it's coming from the data; it's looking at this data.
278
00:36:51.870 --> 00:37:03.099
Chris Penick | Axcel ILT: Yeah, we do a class where we'll teach it how to recognize digits. It's a classic one that we use in machine learning all the time to show how to do image recognition: how to recognize handwritten digits.
279
00:37:03.160 --> 00:37:16.319
Chris Penick | Axcel ILT: And we do other examples. I've done one with cats and dogs, trying to get it to recognize cats and dogs. And the more data, the more accurate, in general; that's kind of your rule of thumb. More layers
280
00:37:16.970 --> 00:37:19.990
Chris Penick | Axcel ILT: tends to give you better results, Mary.
281
00:37:20.190 --> 00:37:28.389
Chris Penick | Axcel ILT: because you can have different decision thresholds there. But exactly, this is just my guess at what's happening right now,
282
00:37:28.560 --> 00:37:41.740
Chris Penick | Axcel ILT: to illustrate the concept of what could be happening here. How about the last little bit, then, would be color. So I said it was a green triangle. Let me make it really green, fill it in green. There you go. That's great, can't you tell?
283
00:37:42.080 --> 00:37:50.369
Chris Penick | Axcel ILT: Yeah, I told you: geek school, not art school. Sorry. So maybe the first color node checks to see if it's yellow,
284
00:37:50.930 --> 00:38:00.930
Chris Penick | Axcel ILT: the next one checks to see if it's green, right? The next one, maybe, how about some blue? Put some blue in here, right?
285
00:38:04.580 --> 00:38:06.770
Chris Penick | Axcel ILT: It's not much different, Tim.
286
00:38:06.990 --> 00:38:18.299
Chris Penick | Axcel ILT: So Tim's got this question about how this is different from probabilistic models. It's not; AI is built upon them. The difference is size and depth. The operative word is deep, right?
287
00:38:18.440 --> 00:38:19.680
Chris Penick | Axcel ILT: As in deep learning.
288
00:38:19.710 --> 00:38:22.950
Chris Penick | Axcel ILT: So those hidden layers
289
00:38:22.960 --> 00:38:26.160
Chris Penick | Axcel ILT: give it more chances to make
290
00:38:26.210 --> 00:38:28.509
Chris Penick | Axcel ILT: those probabilistic choices there.
291
00:38:28.520 --> 00:38:30.840
Chris Penick | Axcel ILT: Alright. And last little bit.
292
00:38:31.860 --> 00:38:36.950
Chris Penick | Axcel ILT: Yeah. So, Victor, if you've got a question, you're gonna have to put it in chat or
293
00:38:37.180 --> 00:38:40.589
Chris Penick | Axcel ILT: somewhere, since we're all muted at the moment.
294
00:38:40.670 --> 00:38:49.950
Chris Penick | Axcel ILT: And the last little bit, oh, I don't know, let's just send some red or something. There we go. Okay, so this is color. So let's look at this example, then. We'll follow along. We'll use
295
00:38:50.200 --> 00:38:52.290
Chris Penick | Axcel ILT: we'll use
296
00:38:52.640 --> 00:38:57.029
Chris Penick | Axcel ILT: this to represent it. So first the input goes in, and they all see it,
297
00:38:57.170 --> 00:39:05.559
Chris Penick | Axcel ILT: right? See, so all these nodes of the edge layer check and see if there's an edge. So, yeah, right?
298
00:39:06.470 --> 00:39:10.000
Chris Penick | Axcel ILT: Right, how reliable? See, you're not building your own,
299
00:39:10.070 --> 00:39:18.060
Chris Penick | Axcel ILT: unless you have millions of dollars, you are not building your own, sir. Kosovo, there was a question about building your own: it's lots of data, lots of resources.
300
00:39:18.860 --> 00:39:22.619
Chris Penick | Axcel ILT: Alright, so as we go in here right.
301
00:39:23.160 --> 00:39:26.429
Chris Penick | Axcel ILT: and we come in, that one gets triggered because it's got an edge like that;
302
00:39:26.450 --> 00:39:30.520
Chris Penick | Axcel ILT: That one gets triggered. This one doesn't get triggered because there is no
303
00:39:30.690 --> 00:39:39.110
Chris Penick | Axcel ILT: just a sort of weird straight one. I guess I could do another straight up-and-down one, too, maybe there's one like that. And then, it definitely has edges, so this one doesn't get triggered.
304
00:39:39.440 --> 00:39:44.279
Chris Penick | Axcel ILT: Alright. So far, the pathway has been through here, to here. Okay, then what's next?
305
00:39:44.680 --> 00:39:47.860
Chris Penick | Axcel ILT: Well, it has a corner like that. Okay?
306
00:39:48.630 --> 00:39:53.170
Chris Penick | Axcel ILT: And it has just a line. So I guess it will take those 2.
307
00:39:53.970 --> 00:39:55.200
Chris Penick | Axcel ILT: But then.
308
00:39:55.520 --> 00:40:02.939
Chris Penick | Axcel ILT: you know, we can't go there from here. Alright, so we've moved from here to here. Can we move down here? No, it doesn't have a right angle, so it's not gonna go down here.
309
00:40:03.370 --> 00:40:13.889
Chris Penick | Axcel ILT: You see, it's basically a path. What about this one now? And what about this one? No, it's not gonna come down to here either. So these are out of the picture. So far we've gone here, to here,
310
00:40:13.970 --> 00:40:21.590
Chris Penick | Axcel ILT: and then we get to the color layer. It's like: well, no, we don't have this color. No, we don't have this. No, we don't have this. But yes, we do have
311
00:40:21.740 --> 00:40:29.260
Chris Penick | Axcel ILT: green. So this is sort of the pathway it takes to come out and say: okay, I see green, it's a green triangle.
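The pathway just traced, edges, then corners, then color, can be mimicked with a tiny hand-wired sketch. A real network learns continuous weights rather than these hand-written checks, so every name and rule below is invented for illustration; the point is only how a decision path through successive layers produces the final label.

```python
def classify(shape):
    """Toy 'forward pass': each stage checks features, like the layers above."""
    # Layer 1: edges present at all?
    has_edges = shape["edges"] > 0
    # Layer 2: corners (three corners reads as a triangle)
    is_triangle = has_edges and shape["corners"] == 3
    # Layer 3: attach the color to the final answer
    label = "triangle" if is_triangle else "unknown shape"
    return f'{shape["color"]} {label}'

green_triangle = {"edges": 3, "corners": 3, "color": "green"}
answer = classify(green_triangle)  # "green triangle"
```

The explainability problem is that in a trained network there is no such readable path: the "checks" are spread across thousands of weighted connections.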
312
00:40:32.070 --> 00:40:32.830
Chris Penick | Axcel ILT: right?
313
00:40:34.360 --> 00:40:35.370
Chris Penick | Axcel ILT: So
314
00:40:36.260 --> 00:40:40.260
Chris Penick | Axcel ILT: doing all that, right, going through that pathway. Okay, so here's the thing:
315
00:40:40.950 --> 00:40:46.880
Chris Penick | Axcel ILT: I don't know that this path actually exists. All I can do is, kind of from the outside,
316
00:40:47.120 --> 00:40:58.050
Chris Penick | Axcel ILT: look at this and say: well, here's what I think has happened. And there are all sorts of ways to probe this, and that's what I want to get back into with the slides here in a little bit. Alright, so,
317
00:40:58.210 --> 00:40:59.170
Chris Penick | Axcel ILT: you know.
318
00:40:59.530 --> 00:41:04.930
Chris Penick | Axcel ILT: we'll talk about, so, Vaudrey, we'll talk about practical applications in healthcare,
319
00:41:05.190 --> 00:41:11.039
Chris Penick | Axcel ILT: but that's just one of many, many topics in here. There you go. There was a question from Kate:
320
00:41:12.200 --> 00:41:22.750
Chris Penick | Axcel ILT: you can train it on your data to fine-tune it, but you're all using, right, the original data that would be there. PyTorch, or others like this? Yeah, PyTorch would be more like this, exactly,
321
00:41:22.820 --> 00:41:26.179
Chris Penick | Axcel ILT: so, right, we were saying before that you could definitely add to it.
322
00:41:27.560 --> 00:41:32.829
Chris Penick | Axcel ILT: Then, right, so Tim's question is about how to go back and train it. Well, that's just it:
323
00:41:32.980 --> 00:41:37.560
Chris Penick | Axcel ILT: what we can do is, we have various functions that we could adjust to say: no, that's wrong.
324
00:41:37.680 --> 00:41:55.990
Chris Penick | Axcel ILT: Right? And that's a whole different thing of going back and saying: okay, adjust the weighting of these factors. And again, that's something that it does on its own. And all of these things are... yeah, right, see, this is exactly my prediction. I said, you know, this is called explainable AI,
325
00:41:56.110 --> 00:41:58.069
Chris Penick | Axcel ILT: not how to build AI.
326
00:41:58.470 --> 00:42:11.169
Chris Penick | Axcel ILT: But if you want those, we do offer those courses, come on in. There you go. So basically what's happening, yeah, exactly: we'll talk about something called gradient descent sometime, too. What we calculate is how far off it was,
327
00:42:11.290 --> 00:42:13.880
Chris Penick | Axcel ILT: If we're really we want to train it.
328
00:42:14.260 --> 00:42:22.400
Chris Penick | Axcel ILT: What if it said something along the lines of, how about, instead of green triangle, it said red triangle? Somehow it came down here
329
00:42:22.680 --> 00:42:30.879
Chris Penick | Axcel ILT: and said red triangle, even though we know it's green. Alright, well then the color needs to have the weight adjusted, basically,
330
00:42:31.180 --> 00:42:32.830
Chris Penick | Axcel ILT: is that it's not using that.
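The "adjust the weight" step described above is the heart of training. Here is a minimal sketch of a gradient-descent-style update on a single weight: compute how far off the prediction was, then nudge the weight against the error. Real training does this simultaneously across millions of weights via backpropagation; this toy fits one weight to made-up data generated by y = 2x.

```python
def train_weight(weight, samples, learning_rate=0.1, epochs=50):
    """Fit y = weight * x by repeatedly nudging the weight toward less error."""
    for _ in range(epochs):
        for x, target in samples:
            prediction = weight * x
            error = prediction - target          # how far off were we?
            weight -= learning_rate * error * x  # nudge against the error
    return weight

# Data generated by y = 2x, so the weight should converge near 2.0:
learned = train_weight(0.0, [(1, 2), (2, 4), (3, 6)])
```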
331
00:42:32.870 --> 00:42:39.129
Chris Penick | Axcel ILT: The number of features, right? To experiment, we can use the number of features, we can use the number of
332
00:42:39.150 --> 00:42:51.989
Chris Penick | Axcel ILT: layers, we can use weights and biases. And by the way, again, when I use that word "bias" here, it's different than when we're talking about bias in general, you know, bias in making decisions.
333
00:42:52.190 --> 00:42:58.649
Chris Penick | Axcel ILT: There you go. We are sharing. Yes, we are sharing, you are.
334
00:42:58.840 --> 00:43:00.390
Chris Penick | Axcel ILT: Don't know why you can't see it.
335
00:43:01.730 --> 00:43:06.099
Chris Penick | Axcel ILT: Alright, there. Alright, so let's jump back. We can come back to this in a bit.
336
00:43:06.570 --> 00:43:16.129
Chris Penick | Axcel ILT: We can play with this example again. There's lots, like I said; we could talk about these. And this is just a generic network for a second here.
337
00:43:18.090 --> 00:43:21.649
Chris Penick | Axcel ILT: So, the black box, like I said, the problem is that
338
00:43:21.760 --> 00:43:29.450
Chris Penick | Axcel ILT: you can't just look at the parameters that you passed it. You can't, you know,
339
00:43:30.080 --> 00:43:33.289
Chris Penick | Axcel ILT: look... well, there you go, Steven, that's a good example.
340
00:43:34.350 --> 00:43:54.289
Chris Penick | Axcel ILT: Right? You cannot just look at it. You give an image of an X-ray, you give an image of certain cells; cancer cells have certain shapes, sickle cell anemia has a certain shape, right? Trying to detect the shape of the cell, all of those things, and figuring those out, there are various classic algorithms to deal with each of those. But
341
00:43:55.030 --> 00:43:58.550
"white box" is what we will call some of these models;
342
00:43:58.900 --> 00:44:09.229
Chris Penick | Axcel ILT: these tend to be the classic ones, what we call an interpretable model. Or you might hear the term "glass box," and I like to use that phrase more often. So black versus glass: glass I can see through, and I know what's happening.
343
00:44:09.490 --> 00:44:12.359
Chris Penick | Axcel ILT: So when we talk about what we're
344
00:44:12.370 --> 00:44:24.330
Chris Penick | Axcel ILT: trying to interpret, alright, let's kind of agree on some terms for a second here; we'll make sure we're all in agreement. Here is the scope of this interpretability: is the algorithm transparent? How does the algorithm create the model?
345
00:44:24.570 --> 00:44:33.020
I can tell you OpenAI is not going to give you exact details. OpenAI is misnamed; they should be called Opaque AI,
346
00:44:33.070 --> 00:44:41.489
Chris Penick | Axcel ILT: because they don't entirely give you the details of how that works, and they will change their API, in other words, how I write a program to use
347
00:44:41.840 --> 00:44:50.399
Chris Penick | Axcel ILT: their model on my data. So it's been trained on a big collection of data. And now I'm using it specifically on mine.
348
00:44:50.720 --> 00:45:00.550
Chris Penick | Axcel ILT: I just had to rewrite a course with another instructor last week because OpenAI changed the API, the way we write our Python program to do this.
349
00:45:00.790 --> 00:45:13.599
Chris Penick | Axcel ILT: And so did Llama and a couple of others. So they're constantly rewriting things. So you know how to interact with it, you know what to give it, but you don't know exactly how it works. Matthew, if you raise your hand, I can't see it, because I can't turn on... got it.
350
00:45:13.640 --> 00:45:22.700
Chris Penick | Axcel ILT: But somewhere, there you go. Global: so the other thing we look at, then, is, how does the trained model make those predictions?
351
00:45:22.870 --> 00:45:25.529
Chris Penick | Axcel ILT: And then, yeah, that's a good one, Stephen.
352
00:45:25.660 --> 00:45:26.819
Chris Penick | Axcel ILT: Yeah, no, worries Matt.
353
00:45:26.840 --> 00:45:33.749
Chris Penick | Axcel ILT: It's not a problem, really. It's just dealing with large numbers here. It makes it fun. The
354
00:45:33.910 --> 00:45:42.390
Chris Penick | Axcel ILT: what we want to look at. So when I was giving my example of the green triangle, the red triangle, blue triangle, one right?
355
00:45:42.660 --> 00:45:45.100
Chris Penick | Axcel ILT: So one model
356
00:45:45.700 --> 00:45:57.560
Chris Penick | Axcel ILT: sort of — Perplexity, for example. Perplexity gives resources; it cites them, because it's going out and searching. It's trained to a certain level. And then
357
00:45:59.950 --> 00:46:05.540
Chris Penick | Axcel ILT: what's that? I don't know what's going on. Okay? Oh.
358
00:46:05.860 --> 00:46:08.300
Chris Penick | Axcel ILT: this. We've okay. I got you
359
00:46:08.820 --> 00:46:22.400
Chris Penick | Axcel ILT: alright, there you go. Irving brought up that one model gives citations, so Perplexity is one example. What Perplexity is doing is using a pre-trained model and, in addition, doing an Internet search on related topics.
360
00:46:22.450 --> 00:46:26.489
Chris Penick | Axcel ILT: And it's giving you resources for that. I love perplexity. I use it quite a bit, actually
361
00:46:26.620 --> 00:46:36.749
Chris Penick | Axcel ILT: because it's very useful in finding both. But here's the thing — let's break this down for a second. First of all, the transparency of the algorithm. That's one thing.
362
00:46:36.870 --> 00:46:45.970
Chris Penick | Axcel ILT: Then the next thing I want to know is, how does the model itself make predictions once it's trained? What is its process, its procedure, for giving me a prediction?
363
00:46:46.130 --> 00:46:57.269
Chris Penick | Axcel ILT: Right? Then the next thing is, what about it at a modular level? How do the parts of the model affect those predictions? In other words, is there a parameter that I can change to give it
364
00:46:57.320 --> 00:47:04.440
Chris Penick | Axcel ILT: better or worse predictions? Not all models are interpretable at that parameter level.
365
00:47:04.640 --> 00:47:07.500
Chris Penick | Axcel ILT: you know some linear models. For example.
366
00:47:08.080 --> 00:47:14.669
Chris Penick | Axcel ILT: we'll talk about that — it's the weights. And for a decision tree, it's where it makes the splits,
367
00:47:15.000 --> 00:47:19.409
Chris Penick | Axcel ILT: right, the cut points. And so —
368
00:47:19.520 --> 00:47:27.710
Chris Penick | Axcel ILT: bing, bing is, yeah, doing the same sort of thing as perplexity. Bing, co-pilot. Yeah, it's basically. yeah. Oh, there you go. Bonus.
369
00:47:29.210 --> 00:47:34.620
Chris Penick | Axcel ILT: yeah, exactly — Llama. There are open models. Yeah, the scope of this — and like I said,
370
00:47:34.970 --> 00:47:43.849
Chris Penick | Axcel ILT: yeah, come to the Gen AI course and we'll look at many of them. I was just going through labs for that yesterday,
371
00:47:43.870 --> 00:47:46.870
Chris Penick | Axcel ILT: updating those labs to make sure that they still work just fine.
372
00:47:46.970 --> 00:47:50.739
Chris Penick | Axcel ILT: And then we get down to talking about local.
373
00:47:51.130 --> 00:47:56.850
Chris Penick | Axcel ILT: what local interpretability means is, why did the model make a prediction for a particular instance?
374
00:47:56.970 --> 00:48:02.220
Chris Penick | Axcel ILT: In other words, why did it predict that my green triangle was a red triangle?
375
00:48:02.240 --> 00:48:15.109
Chris Penick | Axcel ILT: Or, for a bunch of triangles — that would be a group of predictions, right, for a group of instances — why do triangles come out weird in general? Why do triangles keep coming out with the wrong color?
376
00:48:15.300 --> 00:48:24.910
Chris Penick | Axcel ILT: Right? Those are things that, in my particular model — my purely hypothetical, bad sketch — I can't explain. My model there doesn't tell me why it would work.
377
00:48:25.140 --> 00:48:28.509
Chris Penick | Axcel ILT: So, some techniques to get around this, to figure this out:
378
00:48:28.720 --> 00:48:31.869
Chris Penick | Axcel ILT: alright. We have, you know, techniques to sort of
379
00:48:32.490 --> 00:48:42.620
Chris Penick | Axcel ILT: jump in there. We'll talk about feature importance, we can use some model-specific techniques, and we can talk about model-agnostic techniques. Right? And
380
00:48:43.570 --> 00:48:51.250
Chris Penick | Axcel ILT: so for feature importance, the idea is that we rank all the inputs, and we base it on
381
00:48:51.470 --> 00:48:55.859
Chris Penick | Axcel ILT: what we call its importance to the predictive modeling. I'll give you an example, all right.
382
00:48:55.980 --> 00:49:13.609
Chris Penick | Axcel ILT: for example, Shapley — Shapley additive explanations, SHAP. It's a horrible acronym, because you take the SH and the A — it doesn't matter anyway. But Shapley additive explanations, here's the way to think about it: I bet those of you who went to college had to do a group project at some point or another.
383
00:49:13.960 --> 00:49:18.680
Chris Penick | Axcel ILT: And my daughter in high school, my college-age kids — they all did
384
00:49:18.730 --> 00:49:30.660
Chris Penick | Axcel ILT: the group projects, and there's always, right, the one person that seems to do like 80% of the work. And then maybe 2 or 3 others do 5 percent apiece. And then one does practically nothing.
385
00:49:30.850 --> 00:49:31.660
Chris Penick | Axcel ILT: Right?
386
00:49:31.920 --> 00:49:36.219
Chris Penick | Axcel ILT: Okay, it's the same thing with these AI models. What
387
00:49:37.410 --> 00:49:43.130
Chris Penick | Axcel ILT: pathways through those nodes, what decisions contributed to the results.
388
00:49:43.340 --> 00:49:56.760
Chris Penick | Axcel ILT: And so the way Shapley works is based on game theory. It's essentially saying: what if you and your friends decide to go into business together, and you want to figure out how much each person's individual contribution
389
00:49:56.980 --> 00:50:00.119
Chris Penick | Axcel ILT: contributed to the overall profit of the organization?
390
00:50:00.400 --> 00:50:17.330
Chris Penick | Axcel ILT: That's essentially what Shapley is doing. Gini importance and permutation importance are other methods similar to that. So like I said, when I'm trying to figure out which features most affect the prediction, these are tools that we can use. And Shapley, for example, has some good examples in Python that we can even go look at and
391
00:50:17.490 --> 00:50:19.059
Chris Penick | Axcel ILT: run right online.
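The group-project and profit-sharing analogy above is exactly the game-theory definition of a Shapley value: a player's payout is their average marginal contribution over every order in which the team could have assembled. A minimal sketch in plain Python — the workers and coalition payoffs here are invented for illustration:

```python
from itertools import permutations

# Invented "group project" game: v(S) is the score a coalition S of
# workers earns on its own. Worker "a" is the one doing 80% of the work.
v = {
    frozenset(): 0,
    frozenset({"a"}): 80, frozenset({"b"}): 10, frozenset({"c"}): 5,
    frozenset({"a", "b"}): 90, frozenset({"a", "c"}): 85,
    frozenset({"b", "c"}): 15, frozenset({"a", "b", "c"}): 100,
}
players = ["a", "b", "c"]

def shapley(player):
    """Average marginal contribution of `player` over all join orders."""
    perms = list(permutations(players))
    total = 0.0
    for order in perms:
        before = frozenset(order[:order.index(player)])
        total += v[before | {player}] - v[before]
    return total / len(perms)

phi = {p: shapley(p) for p in players}
# phi["a"] dominates, and the values sum to the full-team payoff of 100 --
# "additive" is the A in SHAP.
```

SHAP applies this same averaging to model features: each feature is a "player" and the prediction is the payoff being divided up.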
392
00:50:19.490 --> 00:50:24.039
Chris Penick | Axcel ILT: Next little bit: model-specific techniques would be things like — see, the thing is,
393
00:50:24.760 --> 00:50:28.429
Chris Penick | Axcel ILT: things like linear regression, logistic regression, decision trees,
394
00:50:28.580 --> 00:50:32.279
Chris Penick | Axcel ILT: sort of in order here. Some of these are
395
00:50:32.510 --> 00:50:37.599
Chris Penick | Axcel ILT: very like, you know, linear regression is very glass box. You know exactly what it's doing.
396
00:50:37.700 --> 00:50:45.629
Chris Penick | Axcel ILT: It's saying, oh, look at these data points, and let's see if we can fit a line that predicts the next value — plot a line
397
00:50:45.700 --> 00:50:54.840
Chris Penick | Axcel ILT: to that data so that we can say, okay, the next value: if I give it this X, I should get a Y right about here on that line.
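That "give it an X, read the Y off the line" description is just a least-squares fit; here is a tiny sketch with made-up, noise-free points:

```python
import numpy as np

# Made-up data lying exactly on y = 2x + 1, so the fit comes out exact.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

w, b = np.polyfit(x, y, 1)   # least-squares line: slope and intercept
y_next = w * 5.0 + b         # "give it this X, get a Y on that line"
# The whole model is the pair (w, b) -- that's the glass box: the weights
# ARE the explanation.
```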
398
00:50:55.300 --> 00:51:08.180
Chris Penick | Axcel ILT: Alright. So direct interpretation is pretty easy with these sorts of things. But decision trees — we can see where decision trees make their choices. You know,
399
00:51:08.300 --> 00:51:30.929
Chris Penick | Axcel ILT: the classic one in machine learning classes: we talk about who survived the Titanic, and we go down things like what passenger class they were, whether they were male or female, how old they were, and so on down the line. You get down to the bottom, and it gives you the likelihood that they survived or not, and we compare that with the actual survivors list.
400
00:51:30.970 --> 00:51:40.190
Chris Penick | Axcel ILT: And it's actually pretty good: by figuring out that tree, you can get a good, almost 90 percent "yeah, you're right" — that that person would survive
401
00:51:40.550 --> 00:51:58.919
Chris Penick | Axcel ILT: or not survive the disaster. Deep learning, on the other hand — for these we can do what we call layer-wise relevance propagation. Essentially, the idea is that we start to go in and look at individual layers to visualize what's going on there. So you have to kind of open up and
402
00:51:58.980 --> 00:52:08.179
Chris Penick | Axcel ILT: break open the black box. But these are all tailored explanations for particular models. So the model you use and try to build
403
00:52:08.580 --> 00:52:09.840
Chris Penick | Axcel ILT: gives me
404
00:52:10.000 --> 00:52:15.109
Chris Penick | Axcel ILT: sort of limits the choices I have to build
405
00:52:15.390 --> 00:52:22.559
Chris Penick | Axcel ILT: explainability into it. And the thing is, there are always new ideas on how we can make these more transparent.
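The Titanic-style decision tree described above can be sketched with scikit-learn on an invented toy dataset (these rows are not real Titanic data); the printed split rules are the explanation:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Invented stand-in for the Titanic exercise:
# columns are [passenger_class, is_female, age]; label is survived (1) or not.
X = [
    [1, 1, 30], [1, 0, 40], [2, 1, 25], [2, 0, 35],
    [3, 1, 20], [3, 0, 28], [3, 0, 19], [1, 1, 50],
]
y = [1, 1, 1, 0, 1, 0, 0, 1]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The splits themselves are the interpretation: trace any passenger down
# the printed rules and you see exactly why they land in a given leaf.
rules = export_text(tree, feature_names=["pclass", "is_female", "age"])
print(rules)
```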
406
00:52:23.320 --> 00:52:28.629
Chris Penick | Axcel ILT: There are some model-agnostic techniques — for example, something called LIME,
407
00:52:28.680 --> 00:52:32.350
Chris Penick | Axcel ILT: which is local interpretable model-agnostic explanations — it's even in the name.
408
00:52:32.390 --> 00:52:34.349
Chris Penick | Axcel ILT: So basically it
409
00:52:34.590 --> 00:52:41.780
Chris Penick | Axcel ILT: think of it like looking at a complex model and making a bunch of mini simpler ones that we can interpret.
410
00:52:42.130 --> 00:52:44.489
Chris Penick | Axcel ILT: So we, the theory being
411
00:52:44.730 --> 00:52:57.140
Chris Penick | Axcel ILT: that a complex model is just a bunch of smaller simple ones. Like, understanding one node is easy — that's why I gave you my little sketch. Understanding one node that says, is this a triangle,
412
00:52:57.150 --> 00:53:00.180
Chris Penick | Axcel ILT: or is this green? That's easy.
413
00:53:00.380 --> 00:53:07.729
Chris Penick | Axcel ILT: But starting to put those together — see, they get more complicated as you start putting those patterns together. But what LIME tries to do is build
414
00:53:08.000 --> 00:53:09.730
Chris Penick | Axcel ILT: mini models, so to speak.
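Those "mini models" can be hand-rolled to show the core LIME move — this is a sketch of the idea, not the actual `lime` package, and the black-box model and data are invented: sample around one instance, ask the complex model to label the samples, then fit a distance-weighted linear surrogate that is only valid locally.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# A "complex" black box trained on invented data: nonlinear in feature 0,
# linear in feature 1, and feature 2 has no effect at all.
X = rng.normal(size=(300, 3))
y = X[:, 0] ** 2 + 3.0 * X[:, 1]
black_box = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# LIME's core move: perturb one instance, label the perturbations with the
# black box, and fit a small weighted linear model as the local explanation.
x0 = np.array([1.0, 0.5, -0.2])
samples = x0 + rng.normal(scale=0.3, size=(200, 3))
preds = black_box.predict(samples)
weights = np.exp(-np.sum((samples - x0) ** 2, axis=1))   # closer = heavier
surrogate = LinearRegression().fit(samples, preds, sample_weight=weights)
local_slopes = surrogate.coef_   # interpretable, but only near x0
```

The surrogate's slopes are the "mini model" explanation: near `x0`, feature 1 should show a strong positive slope and the irrelevant feature 2 a near-zero one.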
415
00:53:10.030 --> 00:53:20.210
Chris Penick | Axcel ILT: Partial dependency: we look at a single feature — we want to see what the effect of a single feature is. It's kind of what I teach people in
416
00:53:20.320 --> 00:53:25.010
Chris Penick | Axcel ILT: programming classes and troubleshooting. I say, change one thing, just one thing at a time
417
00:53:25.280 --> 00:53:34.760
Chris Penick | Axcel ILT: and see what difference you get. Change one thing again. Change one thing, put the other thing back — one thing at a time. That's sort of how PDP works.
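The change-one-thing-at-a-time recipe is literally how a partial dependence curve is computed; a hand-rolled sketch with an invented stand-in model:

```python
import numpy as np

# Invented stand-in for a trained model: any features -> prediction function.
def model(X):
    return 2.0 * X[:, 0] + np.sin(X[:, 1])

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))     # made-up background data

# Partial dependence of feature 0: sweep it over a grid, leave everything
# else at its observed values, and average the predictions.
grid = np.linspace(-2.0, 2.0, 21)
pdp = []
for value in grid:
    X_mod = X.copy()
    X_mod[:, 0] = value           # the "one thing" we change
    pdp.append(model(X_mod).mean())
pdp = np.array(pdp)
# The curve comes out as a straight line of slope 2 -- the model's true
# dependence on feature 0, with feature 1's effect averaged away.
```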
418
00:53:34.810 --> 00:53:46.439
Chris Penick | Axcel ILT: Counterfactual explanations: what we do is keep describing — a classic example is, let's say I was going to teach an
419
00:53:46.550 --> 00:53:49.579
Chris Penick | Axcel ILT: AI to recognize the Mona Lisa. That's one —
420
00:53:49.770 --> 00:54:00.520
Chris Penick | Axcel ILT: I've seen a couple of professors use this — or art in general, or The Scream in particular. Whatever it is, we're trying to teach it to recognize a certain piece of art.
421
00:54:00.590 --> 00:54:03.870
Chris Penick | Axcel ILT: What we can do is start saying, you know, describing.
422
00:54:04.260 --> 00:54:12.420
Chris Penick | Axcel ILT: you know, features. What if I change her eyes? What if I change her smile. Does it still recognize that image as
423
00:54:12.710 --> 00:54:14.640
Chris Penick | Axcel ILT: the Mona Lisa?
424
00:54:14.780 --> 00:54:23.639
Chris Penick | Axcel ILT: At what threshold does that change? And all that's doing is telling me how it figured out whether that feature was important or not. And so these are flexible models.
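A counterfactual probe in miniature: starting from an instance the model rejects, nudge one feature until the prediction flips and report where that happened. The classifier and data here are invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented 1-feature problem: the label is 1 once the feature passes 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = (X[:, 0] > 0).astype(int)
clf = LogisticRegression().fit(X, y)

# "What would have to change?" -- walk the feature upward until the
# model's answer flips from 0 to 1.
x = np.array([[-1.0]])
step = 0.05
while clf.predict(x)[0] == 0:
    x[0, 0] += step
flip_point = x[0, 0]   # first tried value where the prediction changes
# The explanation is the delta: how far this instance sat from a "yes".
```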
425
00:54:23.910 --> 00:54:33.549
Chris Penick | Axcel ILT: Each of these could be an hour-long lesson in and of itself, just working through it. So my suggestion to you, if you're going to really get into this, is, of course: go through,
426
00:54:33.660 --> 00:54:37.460
Chris Penick | Axcel ILT: Make sure you've got a good handle on classic machine learning.
427
00:54:37.500 --> 00:54:39.399
Chris Penick | Axcel ILT: go through some sort of Gen AI course,
428
00:54:39.630 --> 00:54:42.959
Chris Penick | Axcel ILT: and then you can go and learn to build
429
00:54:44.630 --> 00:54:59.549
Chris Penick | Axcel ILT: explainable AI — XAI — from there. But interpretable models, like all of these: if my data scientists look at these, they go, yeah, I know all these. If I asked you, you'd really go, yeah — you know how they work, and you know what they're doing.
430
00:55:00.180 --> 00:55:02.729
Chris Penick | Axcel ILT: They're inherently explainable
431
00:55:02.920 --> 00:55:13.020
Chris Penick | Axcel ILT: because they're basically just math problems, right? We take the data points to fit the linear or logistic regression. Naive Bayes is based on the probability of A given B.
432
00:55:13.690 --> 00:55:18.380
Chris Penick | Axcel ILT: K-nearest neighbors: we just look at the data points, and we say, oh, let's look at
433
00:55:18.490 --> 00:55:24.090
Chris Penick | Axcel ILT: 3, 5, 7, whatever of your nearest neighboring data points, and we'll decide that you belong with them.
434
00:55:24.890 --> 00:55:27.900
Chris Penick | Axcel ILT: That's easy to interpret, easy to understand how it came to that.
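That neighbor-vote logic in a few lines, on two invented clusters — the explanation is literally the list of neighbors that voted and how far away they were:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Invented data: class 0 clustered near (0, 0), class 1 near (5, 5).
X = np.array([[0, 0], [1, 0], [0, 1], [5, 5], [6, 5], [5, 6]])
y = np.array([0, 0, 0, 1, 1, 1])

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

query = np.array([[4.5, 5.2]])
label = knn.predict(query)[0]
# "You belong with your neighbors": which 3 points voted, and from how far.
distances, neighbor_idx = knn.kneighbors(query)
```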
435
00:55:28.290 --> 00:55:37.790
Chris Penick | Axcel ILT: But there are trade-offs as you keep going through this. You want model accuracy, so deep learning models are very accurate,
436
00:55:37.880 --> 00:55:42.140
Chris Penick | Axcel ILT: but interpretability tends toward nothing,
437
00:55:42.480 --> 00:56:06.370
Chris Penick | Axcel ILT: right? And so kind of work your way up. I don't entirely agree that it's a nice smooth curve like this, or that it's exactly this difference here, but it's illustrating the point that somewhere along there, as you go from classic machine learning — which is what we have down this way — up to these fancy deep learning models, the interpretability just disappears.
438
00:56:06.540 --> 00:56:09.429
Chris Penick | Axcel ILT: just starts to go away, becomes okay. Right?
439
00:56:10.220 --> 00:56:16.490
Chris Penick | Axcel ILT: So there's another way you could look at this: not just the interpretability, but the bias.
440
00:56:16.530 --> 00:56:29.520
Chris Penick | Axcel ILT: What we could do, alright, is try to add to that normal process — the kind of nutshell process I gave you earlier. I know I'm getting close on time; I'm going to lose you all soon, but I'm almost done.
441
00:56:29.680 --> 00:56:33.069
Chris Penick | Axcel ILT: Alright. There is that we could just add, you know.
442
00:56:33.120 --> 00:56:44.879
Chris Penick | Axcel ILT: document the bias, gather more data, and rinse and repeat. We go through this process and put in specific places to check for bias, to check for explainability. But this is still outside the box.
443
00:56:45.090 --> 00:57:08.190
Chris Penick | Axcel ILT: Right? A manual or automated process to assess and monitor production, unseen data. So this answers the bias side of it, and that's an ethical issue more than an explainability issue; it doesn't really explain how the models make their choices. So, I mentioned some algorithms — I'm going to buzz through this quickly. I can make the slide set available, and the recording will be there. I know we're getting near the end. There's so much to cover; I knew it was not going to fit in
444
00:57:08.400 --> 00:57:10.100
Chris Penick | Axcel ILT: 45 min to an hour, but
445
00:57:10.210 --> 00:57:28.459
Chris Penick | Axcel ILT: The local one that I mentioned, the locally interpretable model-agnostic explanation — this is LIME. Basically, we make mini models, as I said. Shapley I mentioned. A few others, though: Morris sensitivity analysis. Basically, the way it works, we sometimes describe it as one step —
446
00:57:28.700 --> 00:57:34.439
Chris Penick | Axcel ILT: all right, so we do one step at a time. Only one input is adjusted every time we run through the model —
447
00:57:34.510 --> 00:57:36.600
Chris Penick | Axcel ILT: one input and one input only.
448
00:57:37.020 --> 00:57:41.829
Chris Penick | Axcel ILT: So that takes a lot of work too, but you almost need another run, yeah, to do the same thing.
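The one-input-at-a-time procedure can be sketched as Morris-style elementary effects. The function under study is invented, with one strong input, one weak input, and one that does nothing:

```python
import numpy as np

# Invented model under study: input 0 matters a lot, input 2 not at all.
def model(x):
    return 10.0 * x[0] + 2.0 * x[1] + 0.0 * x[2]

rng = np.random.default_rng(0)
delta = 0.1
n_repeats = 50
effects = np.zeros(3)

# From each random base point, nudge exactly ONE input by delta, re-run,
# and record the change -- one input at a time, every time.
for _ in range(n_repeats):
    base = rng.uniform(size=3)
    for i in range(3):
        bumped = base.copy()
        bumped[i] += delta
        effects[i] += abs(model(bumped) - model(base)) / delta
mean_effect = effects / n_repeats   # mean |elementary effect| per input
```

The per-input averages rank the inputs' influence — a cheap sensitivity screen before reaching for heavier explanation tools.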
449
00:57:42.410 --> 00:58:06.769
Chris Penick | Axcel ILT: Contrastive explanation method — I'm going to buzz through these quickly; I'll come back to them and move to something else here. I do have links to some of these tools in the slide set. There's a nice little collection in Alibi, and Awesome Production Machine Learning has a good collection about ethical AI — there's a toolbox for machine learning there at EthicalML.
450
00:58:06.780 --> 00:58:14.089
Chris Penick | Axcel ILT: So XAI — like I said, I'll have these available for you to look at: SHAP, Lucid.
451
00:58:14.300 --> 00:58:21.290
Chris Penick | Axcel ILT: So let's talk about healthcare then, because there was a question earlier about it. Let's mention it real quick, all right. And the thing is that
452
00:58:21.420 --> 00:58:29.150
Chris Penick | Axcel ILT: you know how it's gonna matter here. And there's you know that. was it. I have the example here. See if I can grab it.
453
00:58:29.600 --> 00:58:33.130
Chris Penick | Axcel ILT: unfortunately, yeah.
454
00:58:33.640 --> 00:58:34.929
Chris Penick | Axcel ILT: let me grab.
455
00:58:35.260 --> 00:58:42.450
Chris Penick | Axcel ILT: you know, that was it: UnitedHealthcare was getting sued over the use of AI here. Let me see if I have this link.
456
00:58:43.080 --> 00:58:45.870
Chris Penick | Axcel ILT: there we go.
457
00:58:46.010 --> 00:58:49.240
Chris Penick | Axcel ILT: like, I said, I have too much to fit in an hour. There's just too much to fit.
458
00:58:49.460 --> 00:58:56.249
Chris Penick | Axcel ILT: There you go. Yeah, the ethics lawsuit — US District Court, UnitedHealthcare, for utilizing the nH Predict
459
00:58:56.390 --> 00:58:58.829
Chris Penick | Axcel ILT: algorithm to make healthcare determinations.
460
00:58:58.900 --> 00:59:08.349
Chris Penick | Axcel ILT: And so the plaintiffs are suing, representing deceased people who were forced to pay out of pocket for things that were necessary — the AI said they weren't necessary.
461
00:59:08.640 --> 00:59:11.840
Chris Penick | Axcel ILT: they say, well, it should have been. So
462
00:59:12.120 --> 00:59:13.510
Chris Penick | Axcel ILT: I
463
00:59:13.560 --> 00:59:24.100
Chris Penick | Axcel ILT: Anyway, this is where we're heading; this is what we're going to start seeing. What we need is to put in that explainability: how did that nH Predict
464
00:59:24.220 --> 00:59:32.370
Chris Penick | Axcel ILT: make that prediction? So that we can understand it, give that to the end users, and then we can have data-driven decisions made on this, right?
465
00:59:32.570 --> 00:59:36.330
Chris Penick | Axcel ILT: And now, great, we can decide: do you really need that care?
466
00:59:36.370 --> 00:59:46.519
Chris Penick | Axcel ILT: Alright, yes or no. And if you do, and even better. Yeah, hopefully, it finds that. Yes, you do need that care. It's similar to other cases. And we end up with hopefully, better quality care.
467
00:59:46.700 --> 00:59:59.090
Chris Penick | Axcel ILT: Finance, same thing: we've got onboarding, credit decisions. I have a lot of resources in the slides here — like I said, there's so much I'm trying to fit into one little hour. But the thing with finance is just:
468
00:59:59.170 --> 01:00:08.970
Chris Penick | Axcel ILT: consequences are often radical, right? If you use an AI to predict whether or not someone is going to be approved for a loan, those are life-changing decisions, right?
469
01:00:09.010 --> 01:00:20.679
Chris Penick | Axcel ILT: So I need to know why. With my credit card — if you're turned down for credit or a home loan right now, the US government has a law, their credit reporting act, and you can basically ask:
470
01:00:20.820 --> 01:00:22.920
Chris Penick | Axcel ILT: here's why. And they have to give you a reason.
471
01:00:23.640 --> 01:00:31.440
Chris Penick | Axcel ILT: With these AI models, giving a specific reason is hard to do. Now, Europe also looks at this as what we call a right to explanation,
472
01:00:31.610 --> 01:00:33.050
Chris Penick | Axcel ILT: and that's the big one on there.
473
01:00:33.190 --> 01:00:44.760
Chris Penick | Axcel ILT: So there are some examples in there — I've got a bunch of them to look at; you can go check them out. Image-based fraud detection on checks — and yes, people still use checks. A general one is Eno, which is the AI there.
474
01:00:45.130 --> 01:00:48.939
Chris Penick | Axcel ILT: I have so much more, but like so come visit us. We do have a course.
475
01:00:49.210 --> 01:00:56.600
Chris Penick | Axcel ILT: I know there have been questions as we go, but come to the class for this one. You can contact your person there — the link is in the chat.
476
01:00:56.770 --> 01:01:08.029
Chris Penick | Axcel ILT: That's just one to start. It's a hands-on 2-day course. However, it's definitely good for beginners. It's not designed to — you know, we're not going to learn how to build ChatGPT,
477
01:01:08.060 --> 01:01:11.820
Chris Penick | Axcel ILT: but we are going to start getting a lot of that together.
478
01:01:11.840 --> 01:01:18.049
Chris Penick | Axcel ILT: I could stick around for like about. you know, 5 or so minutes here. And then I have to go through this.
479
01:01:18.070 --> 01:01:30.440
Chris Penick | Axcel ILT: Yeah, exactly, Kate. From the moment we decided we were going to do this, I said there is no way I'm getting all this done in an hour. It's never going to happen, because it's also still changing, which we didn't even talk about yet.
480
01:01:30.590 --> 01:01:37.680
Chris Penick | Axcel ILT: But if anybody's got any other little questions there, or if you need to, this is this either proves I'm crazy or stupid.
481
01:01:37.850 --> 01:01:45.269
Chris Penick | Axcel ILT: You're you're welcome to contact me. I you know, as we mentioned earlier, we're actually part of this.
482
01:01:45.760 --> 01:01:53.150
Chris Penick | Axcel ILT: A copy of the slides — will those be posted with the recording? I know the recording will be up there, where you're —
483
01:01:53.700 --> 01:02:00.509
Axcel ILT: the recording will show the slides themselves. So you'll be able to go through the slides
484
01:02:00.930 --> 01:02:02.179
Axcel ILT: in the recording.
485
01:02:02.720 --> 01:02:07.920
Chris Penick | Axcel ILT: And we'll be sending a follow up email to all the attendees with that recording in the email.
486
01:02:08.070 --> 01:02:21.770
Chris Penick | Axcel ILT: I can give you the slide set if you wanted to send that along; that's fine. It's just got my notes, some of the links that are referred to — feel free. It's good to see you all. And there are all kinds of
487
01:02:21.910 --> 01:02:33.260
Chris Penick | Axcel ILT: courses that we do on AI. Like I said, the general pathway — I love making road maps with organizations. I work mostly with the Web Age side of this, but I have taught for Accelebrate and ExitCertified.
488
01:02:33.450 --> 01:02:44.339
Chris Penick | Axcel ILT: One of the things that I love to do is build programs for organizations on: how do we get the people in your organization to make the best use of AI, machine learning, data science?
489
01:02:44.370 --> 01:02:48.879
Chris Penick | Axcel ILT: So first thing is, start with, if you don't have the the data science.
490
01:02:49.020 --> 01:02:57.429
Chris Penick | Axcel ILT: or the machine learning — start with basic data science, then go learn from there. So that's, you know, Python, NumPy, pandas.
491
01:02:58.000 --> 01:03:00.320
Chris Penick | Axcel ILT: Send me, Kate. Send me. Yeah, sure
492
01:03:01.350 --> 01:03:07.889
Chris Penick | Axcel ILT: I'm bad about LinkedIn. I have to confess, on the recording right here: I don't update things on LinkedIn. It's the only —
493
01:03:08.070 --> 01:03:14.040
Chris Penick | Axcel ILT: well that and I have a Twitter account. Those are like the only social media accounts I have, and otherwise I don't do social media. So
494
01:03:14.300 --> 01:03:27.609
Chris Penick | Axcel ILT: Yeah, David, it's a good question. So David's got a question about tools that detect the use of AI and how effective they are. I think part of ethical AI is fessing up to when we use it.
495
01:03:27.680 --> 01:03:35.140
Chris Penick | Axcel ILT: At work, I will tell my coworkers: look, I started with this outline — I had ChatGPT help me put this together,
496
01:03:35.390 --> 01:03:45.950
Chris Penick | Axcel ILT: because I just wanted to organize my thoughts. And I will tell them I got it from ChatGPT, or I got it from Perplexity, or hey, look at these resources I found from there. As far as schoolwork,
497
01:03:45.980 --> 01:03:50.890
Chris Penick | Axcel ILT: You know, there are tools. The the funny thing is, most of them aren't very good.
498
01:03:51.260 --> 01:03:52.689
Chris Penick | Axcel ILT: you know, because
499
01:03:53.340 --> 01:03:56.530
Chris Penick | Axcel ILT: of what they're trying to do. I have
500
01:03:56.910 --> 01:04:07.700
Chris Penick | Axcel ILT: one of our instructors teaches with Berkeley. and he and I often discuss those tools, David, of of how to detect AI. You know, if somebody's actually cheating.
501
01:04:07.770 --> 01:04:12.360
Chris Penick | Axcel ILT: and every week he comes back and says, yeah, I tried this one — nope.
502
01:04:12.500 --> 01:04:13.389
Chris Penick | Axcel ILT: it's like.
503
01:04:13.550 --> 01:04:25.390
Chris Penick | Axcel ILT: Either a lot of false positives or the other way around. You know — somebody got this, it sounds ChatGPT-y. That's my new word; I'm going to coin that word: ChatGPT-y,
504
01:04:25.540 --> 01:04:40.850
Chris Penick | Axcel ILT: with a Y on the end. You've seen it like that. Occasionally somebody really slips up, and it actually says something like, "I'm only an AI, but I can provide information."
505
01:04:40.940 --> 01:04:44.269
Chris Penick | Axcel ILT: I think it's a tough road.
506
01:04:44.520 --> 01:04:52.379
Chris Penick | Axcel ILT: Yeah, right — Stephen makes a good point. I kind of agree with that, especially in organizations.
507
01:04:53.230 --> 01:04:56.649
Chris Penick | Axcel ILT: You know. I use it a lot.
508
01:04:56.750 --> 01:05:01.869
Chris Penick | Axcel ILT: I it has changed the way I do a lot of things. It's changed the way I work with the kids.
509
01:05:02.220 --> 01:05:14.400
Chris Penick | Axcel ILT: you know, on their homework and things like that that you know, trying to find information. I yeah, like, I said, I tend, I like perplexity a lot because it's also current. It does go out there and find things.
510
01:05:14.440 --> 01:05:26.200
Chris Penick | Axcel ILT: things that are new. ChatGPT is really sort of a fixed point in time, unless you use the pro account with a plugin. Yes, Rob, we do offer training
511
01:05:26.600 --> 01:05:31.800
Chris Penick | Axcel ILT: for individuals. We do have open enrollment sort of classes that would.
512
01:05:32.490 --> 01:05:35.619
Chris Penick | Axcel ILT: you'd have to check the schedule at one of the sites there.
513
01:05:35.820 --> 01:05:40.700
Chris Penick | Axcel ILT: Typically, most of our classes, like with Web Age, we do as a program.
514
01:05:41.040 --> 01:05:44.210
Chris Penick | Axcel ILT: yeah.
515
01:05:46.810 --> 01:05:49.210
Chris Penick | Axcel ILT: Yeah. Well, David, I agree, okay, so
516
01:05:50.260 --> 01:05:58.169
Chris Penick | Axcel ILT: Yeah, a calculator takes understanding. But then how do we get around that? With a calculator on my math test, there was always the professor that said, show your work.
517
01:05:58.460 --> 01:05:59.550
Chris Penick | Axcel ILT: Say so.
518
01:06:00.740 --> 01:06:02.230
Chris Penick | Axcel ILT: You can't show your work.
519
01:06:02.680 --> 01:06:05.690
Chris Penick | Axcel ILT: the
520
01:06:06.100 --> 01:06:07.120
Chris Penick | Axcel ILT: the
521
01:06:07.440 --> 01:06:15.289
Chris Penick | Axcel ILT: the recording will have everything, but I'll make the slides available. I'll give the slides to
522
01:06:16.360 --> 01:06:25.210
Chris Penick | Axcel ILT: you. You'll be able to get the slides. I went through everything except some of the detail and some of the methods there,
523
01:06:25.430 --> 01:06:27.659
Chris Penick | Axcel ILT: but it'll be there. We'll make it available there.
524
01:06:29.140 --> 01:06:33.060
Chris Penick | Axcel ILT: Yeah. there you go. Hit again.
525
01:06:36.550 --> 01:06:40.670
Chris Penick | Axcel ILT: Yeah, a lot of schools mark it as dishonesty there.
526
01:06:40.790 --> 01:06:49.339
Chris Penick | Axcel ILT: And again, I and I. And again, I guess it really depends on the class, too, you know. I would think like if you're in a
527
01:06:49.970 --> 01:06:51.600
Chris Penick | Axcel ILT: if you're in a course
528
01:06:52.430 --> 01:07:05.170
Chris Penick | Axcel ILT: where you are learning Spanish, and you, you know it's not your native language, and you use AI to translate. Yeah, that's major cheating and ridiculous. But if you're in a course where you're learning
529
01:07:05.420 --> 01:07:06.940
Chris Penick | Axcel ILT: pathology.
530
01:07:07.240 --> 01:07:16.749
Chris Penick | Axcel ILT: And you're looking at trying to find, you know, these are the symptoms. And you know, recommender systems that say, Okay, here are my symptoms for my patient.
531
01:07:16.870 --> 01:07:24.590
Chris Penick | Axcel ILT: What are some conditions? And then you further study that and go, you know, explore those resources of oh, it might be this disease or this
532
01:07:24.670 --> 01:07:39.990
Chris Penick | Axcel ILT: syndrome or this condition — see, that's a different thing. It's got its tools; it goes back to the healthcare question we asked about earlier. Yeah, I look at it this way: it's a tool, it's here, and we're not putting the genie back in the bottle. We just have to figure out how to use it correctly.
533
01:07:40.270 --> 01:07:44.250
Chris Penick | Axcel ILT: And I think that's going to be a case-by-case basis on a lot of things. Yeah.
534
01:07:44.730 --> 01:08:08.750
Chris Penick | Axcel ILT: The flip side, to support that. There you go, so, what Shakhon said. Yeah, there's a case study of a lawyer. You know, when you're a lawyer and you go to trial, whether that's civil or criminal... I was a legal specialist with the Army, so we do a lot of looking for precedent, right? We go look for other casework, other cases that have been decided.
535
01:08:08.820 --> 01:08:10.900
Chris Penick | Axcel ILT: and a lawyer for his client
536
01:08:11.090 --> 01:08:22.860
Chris Penick | Axcel ILT: went and used ChatGPT, saying, hey, help me find precedent. ChatGPT told him what he wanted to hear. All the example cases that ChatGPT gave him were fake.
537
01:08:23.210 --> 01:08:24.750
Chris Penick | Axcel ILT: They didn't exist.
538
01:08:24.859 --> 01:08:31.089
Chris Penick | Axcel ILT: It generates content. It generates what sounds like what you want.
539
01:08:31.350 --> 01:08:35.490
Chris Penick | Axcel ILT: It doesn't find fact. It creates
540
01:08:35.500 --> 01:08:38.169
Chris Penick | Axcel ILT: pleasing content. That's the difference.
541
01:08:38.439 --> 01:08:43.319
Chris Penick | Axcel ILT: So yeah, that lawyer just got kind of laughed at in court, because he basically
542
01:08:43.550 --> 01:08:44.479
Chris Penick | Axcel ILT: grabbed.
543
01:08:44.850 --> 01:08:58.699
Chris Penick | Axcel ILT: He made up his whole case. I mean, if that was my lawyer, I'd definitely go talk to the Bar Association; I'd be suing. So yeah, the flip side of this: you have to find an actual case precedent to support it.
544
01:08:58.910 --> 01:09:03.689
Chris Penick | Axcel ILT: Anyway, it's good to have you all here. Wow!
545
01:09:03.790 --> 01:09:08.899
Chris Penick | Axcel ILT: Like I said, I put it in the chat before: one way or another, we can make the slides available.
546
01:09:09.770 --> 01:09:14.110
Chris Penick | Axcel ILT: Yeah, exactly right. You can't. I never take anything from,
547
01:09:14.220 --> 01:09:21.330
Chris Penick | Axcel ILT: especially ChatGPT, but even Perplexity. I'll put my email address in there again; if you'd like, feel free.
548
01:09:21.569 --> 01:09:24.260
Chris Penick | Axcel ILT: you know, I tend to.
549
01:09:24.479 --> 01:09:25.200
Chris Penick | Axcel ILT: Wow.
550
01:09:28.130 --> 01:09:30.929
Chris Penick | Axcel ILT: yeah, actually, one of the things. So.
551
01:09:32.960 --> 01:09:45.509
Chris Penick | Axcel ILT: Tim, that's a really good question. So Tim has a question about, like, a memo pattern to help with explainability, sort of logging what they do. Some of the algorithms I was mentioning for explainability, that's what they do.
552
01:09:45.569 --> 01:09:47.209
Chris Penick | Axcel ILT: So they.
553
01:09:47.430 --> 01:09:59.820
Chris Penick | Axcel ILT: Yeah, LIME, for example, will give you sort of a layer-by-layer view, telling you what decisions were made. It's trying to add that into the whole processing. Shapley actually has a good
554
01:09:59.890 --> 01:10:08.679
Chris Penick | Axcel ILT: demo online that, if you've got access to something like Google Colab, or you've got the Anaconda installers and the like, you can just download and run on your own.
555
01:10:08.830 --> 01:10:19.810
Chris Penick | Axcel ILT: Right, you end up making a model to examine your model. But you use a model that you can interpret.
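The surrogate-model idea described here can be sketched in a few lines: sample points near one input, ask the opaque model for its predictions there, and fit a simple model you can actually read. This is a minimal, illustrative sketch of the LIME-style approach, not the actual LIME library; the `black_box` function and every number in it are made up for the demo.

```python
import random

def black_box(x):
    # Stand-in for an opaque model's prediction on a single feature.
    return x * x + 1.0

def local_surrogate(f, x0, width=0.1, n=500, seed=0):
    """Fit a plain least-squares line to the opaque model's outputs on
    points sampled near x0; the slope is the local explanation."""
    rng = random.Random(seed)
    xs = [x0 + rng.uniform(-width, width) for _ in range(n)]
    ys = [f(x) for x in xs]
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    intercept = my - slope * mx
    return slope, intercept

slope, intercept = local_surrogate(black_box, x0=2.0)
print(slope)  # close to the model's true local sensitivity, 2 * x0 = 4.0
```

The interpretable line only holds near `x0`; move to a different input and you refit, which is exactly why these explanations are called "local."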
556
01:10:19.970 --> 01:10:22.909
Chris Penick | Axcel ILT: Yeah. So Shapley uses known
557
01:10:23.030 --> 01:10:29.409
Chris Penick | Axcel ILT: game theory algorithms to explain how the opaque model made its choices.
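The game-theory quantity behind Shapley values can be computed directly: a feature's attribution is its average marginal contribution over every order in which the features could be added. This is a toy brute-force sketch; the `model_score` set function and the features "A" and "B" are invented for illustration, and the real SHAP library approximates this same quantity rather than enumerating orderings.

```python
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values: each player's average marginal
    contribution across all orderings (feasible only for a few players)."""
    perms = list(permutations(players))
    phi = {p: 0.0 for p in players}
    for order in perms:
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            phi[p] += value(frozenset(coalition)) - before
    return {p: total / len(perms) for p, total in phi.items()}

def model_score(features):
    # Toy stand-in for an opaque model's output on a feature set:
    # "A" alone adds 3, "B" alone adds 1, together they add 2 more.
    score = 0.0
    if "A" in features:
        score += 3.0
    if "B" in features:
        score += 1.0
    if {"A", "B"} <= features:
        score += 2.0
    return score

print(shapley_values(["A", "B"], model_score))
```

The attributions always sum to the full model output, and the interaction bonus gets split evenly between the features that produced it; that fairness property is what makes Shapley values attractive for explaining opaque models.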
558
01:10:29.530 --> 01:10:40.990
Chris Penick | Axcel ILT: Corey was asking about what areas to make sure you have a base knowledge in. If you want to go deep into just AI in general, like I said, I would go... yeah:
559
01:10:41.260 --> 01:10:42.290
Chris Penick | Axcel ILT: Math.
560
01:10:42.350 --> 01:10:50.990
Chris Penick | Axcel ILT: including, you know, some brush-up: refresh your calculus, refresh your linear algebra, if you did take it.
561
01:10:51.260 --> 01:11:03.890
Chris Penick | Axcel ILT: The second one. Yeah, actually, Tim, that's what it is: sort of use a model to reverse engineer. Exactly, that's a good way to put it. I like that; I'm gonna borrow that, and I'll give you credit. That's a good way to explain it. Yeah, use a different model to reverse engineer another model.
562
01:11:04.430 --> 01:11:10.219
Chris Penick | Axcel ILT: There you go, credit to Tim. But Corey, to answer your question:
563
01:11:10.350 --> 01:11:25.549
Chris Penick | Axcel ILT: yeah, there's a roadmap. If you send me an email, I'll point you to a data science roadmap. Start with that: the math, the data science. Then move to machine learning, what we sometimes call classic machine learning. And then you can move to
564
01:11:25.650 --> 01:11:34.789
Chris Penick | Axcel ILT: AI, whether that's, you know, explainable AI or not. Explainable AI is just a tool set, really. It's a way for me to
565
01:11:36.020 --> 01:11:40.230
put trust into the choices that we all make
566
01:11:40.270 --> 01:11:41.630
Chris Penick | Axcel ILT: based on.
567
01:11:41.910 --> 01:11:45.500
Chris Penick | Axcel ILT: you know, these models. Yeah, Josephine, great to have you.
568
01:11:45.650 --> 01:11:52.519
Chris Penick | Axcel ILT: There you go. So yeah, like I said, I put that in there; you're welcome to contact me.
569
01:11:52.870 --> 01:11:59.739
Chris Penick | Axcel ILT: Oh, Corey, you work in cybersecurity? That's actually more of my background, where I started years ago. So,
570
01:12:00.110 --> 01:12:03.290
Chris Penick | Axcel ILT: I used to work with Symantec way back in the day.
571
01:12:03.480 --> 01:12:19.720
Chris Penick | Axcel ILT: Whatever they are now. This was back when Norton Antivirus and all that stuff existed. So yeah, again, we look for patterns, right? We would use this thing called Bloodhound to look for malware misbehaving on your system. And so we'd love to have,
572
01:12:19.960 --> 01:12:27.599
Chris Penick | Axcel ILT: you know, a big AI model to watch and say: this is how your computer should behave, but that program is being bad. So,
573
01:12:27.830 --> 01:12:31.500
Chris Penick | Axcel ILT: yeah, send it to me. There you go.
574
01:12:31.770 --> 01:12:34.090
Chris Penick | Axcel ILT: That might take a while.
575
01:12:35.160 --> 01:12:47.130
Chris Penick | Axcel ILT: All right, let's see. Wow, told you we'd get an extra 15 in there, and I know you all have lives to get back to. I do as well, and I've got a hard stop in just a few minutes.
576
01:12:47.220 --> 01:12:51.130
Chris Penick | Axcel ILT: But feel free to contact me; you know how to get through there.
577
01:12:51.270 --> 01:12:58.559
Chris Penick | Axcel ILT: and we have tons of this. And if you liked what you heard here, well,
578
01:12:58.780 --> 01:13:09.139
Chris Penick | Axcel ILT: I'm a mediocre instructor compared to some of the people that we bring in. We like to get experts in the field. All right, so hopefully I'll see you all again sometime soon.
579
01:13:09.160 --> 01:13:10.309
Chris Penick | Axcel ILT: I'm going to
580
01:13:10.470 --> 01:13:22.679
Axcel ILT: Do you want me to push the big button, or do you? I'll push it. Yeah, everyone just be on the lookout for that follow-up email; we'll include the slides and the recording of the webinar. And thank you, Chris, for hosting, and thank you all for joining today. Have a good day.
581
01:13:23.260 --> 01:13:27.219
Axcel ILT: All right, I'll send you the slides right after this. Great, bye, everyone.