Episode Transcript
[00:00:02] Speaker A: Hey, everybody. Welcome to the ZORA Talks Podcast. I'm your host, Yawande Odusamwo, and we're all about innovation with purpose. And on this season of the ZORA Talks Podcast, we'll be covering everything around how AI is being used for good and speaking with entrepreneurs and businesses who are building things that matter.
On this first episode, I speak with Eric Hamilton, someone I've known for almost 15 years.
He's currently serving as the vice chair on the board of AI Moonbeam, a company that is leveraging AI to give time back to teachers so that they have more time to focus on students and put people first.
So, without further ado, let's get started.
[00:00:55] Speaker B: Yeah. So we met at a Black in technology event. Mike Street had an event. You remember Mike Street?
[00:01:03] Speaker A: I remember the name.
[00:01:05] Speaker B: So he had an event at Yodel. At Yotel. Yodel.
[00:01:09] Speaker A: Okay.
[00:01:10] Speaker B: I think it was a Yodel event and that's where we met.
[00:01:14] Speaker A: Or was it.
[00:01:15] Speaker B: No, I've got two things then. We met at.
Remember the CNN Soledad O'Brien, the CNN Black in America event? Were you at that event?
[00:01:24] Speaker A: Yeah, yeah. I forget how I found out about that, but yeah. Soledad.
[00:01:29] Speaker B: So I've got. I've got us meeting at that event.
I know. I was there.
[00:01:34] Speaker A: Like, that is super impressive. You got all that detail because.
[00:01:37] Speaker B: Because I'll be forgetting where I meet people. And when you meet impressive people, like, I knew, okay, I'm probably not going to interact with you every day, but years later I'm not going to forget. And I wrote it down. Yeah, it was November of 2011, so I wrote that down in my notes, too. So that was almost 14 years ago.
[00:01:58] Speaker A: Dang.
That is super impressive.
Wow. That says a lot about you, that you have that much detail in your notes.
[00:02:07] Speaker B: Well, I think it basically means that I can't remember anything. And like I said, you meet people that you know are important.
[00:02:14] Speaker C: Right.
[00:02:15] Speaker B: And I'm like, okay, this woman is important.
But I know I'm not going to remember, so it's important for me to write it down. I put it in my notes, so.
[00:02:24] Speaker A: Wow.
[00:02:24] Speaker B: Yep.
[00:02:25] Speaker A: Yeah. My dad actually keeps a diary. He's kept it for years. And I really want to read it, because he has, you know, when he met my mom, probably, and all of that stuff.
[00:02:36] Speaker B: Wow, impressive.
[00:02:37] Speaker A: I said, you should write a book. And he was saying, yeah, they keep telling me that so we shall see. But yeah, I'm just going to kick off. It's great, Eric, you being here.
I was very impressed when I saw you on LinkedIn. I saw you're connected with Moonbeam. AI.
Or is it AI Moonbeam? I just want to make sure.
[00:02:56] Speaker B: AI Moonbeam.
[00:02:58] Speaker A: Okay, so I saw you were connected with AI Moonbeam, and I just wanted to start out by having you introduce yourself and telling us a little bit about how you got involved with AI Moonbeam.
[00:03:09] Speaker B: Well, thank you very much. Thank you much, much, much. So, of course, Eric Hamilton. My background is in technology. I studied computer science as an undergrad. I was a freshman in college back in fall of '87, a long time ago.
So, bachelor's degree in computer science. I have an MBA. I've worked tech jobs at Yahoo, Google, and, if you remember, Netscape back in the day, Fidelity, all these different types of tech jobs over the years. And how I found out about and got into Moonbeam: something very similar to what's happening today with us. I was on a podcast called All Things Education.
One of my frat brothers, Alpha Phi Alpha 06, one of my frat brothers who was the host of that show, Dr. Moussa Abdullah, one of my bros. I've known him for like 35 years.
And we were talking about education, but particularly the episode was about artificial intelligence.
And his show focuses on people of color and how education affects them. So he's asking all these pointed questions about AI and people of color. How's it going to affect them? How are we involved in the technology? So I was on that podcast, and one of his colleagues saw it and said, hey, I want to meet that guy, because I'm starting a startup and I'm looking for someone to head the board of advisors and be an AI advisor. That was the CEO, Jim Beaks. And that's how we met. Dr. Abdullah connected us. I hit it off with Jim, and next thing you know, I'm the vice chair of his board for his company.
[00:04:49] Speaker A: Awesome. Yeah, I know a lot of people are trying to get on board, and so it's a great stepping stone from your career to the board. And I actually was really impressed when I saw your background. I knew you were smart. I knew you had a lot going on. I even remember you founding your own company. But when I actually dug deep into your background, I was like, wow, this guy is super credentialed. And so I can definitely see why they wanted you on the board.
I did also want to ask you, now that we've got the backstory out of the way: a little bit more about AI Moonbeam. Could you explain in simple terms what you guys do?
[00:05:29] Speaker B: Absolutely. So, AI Moonbeam, we're an AI integrator.
So one might think like, okay, what do you build? How do you compete with OpenAI and Anthropic and Gemini and all these other things? We don't compete. We basically go into schools and we help them adopt AI.
[00:05:49] Speaker C: Right.
[00:05:50] Speaker B: Particularly, our founder, Jim Beaks, wants to focus on inner city schools, because that's his area of expertise. A lot of folks who are part of the company are former administrators and principals of inner city schools.
So Jim feels, and he's correct, there is a need, because the more affluent schools, the schools with the money, are already on board, and they're already integrating AI into curriculum and tools that allow teachers to automate lesson plans and things like that. And inner city schools are not adopting it as fast because, unfortunately, they have bigger fish to fry as far as priorities.
[00:06:30] Speaker C: Right.
[00:06:30] Speaker B: You may have a situation where you've got a child who's homeless.
[00:06:34] Speaker C: Right.
[00:06:35] Speaker B: And trying to deal with maybe behavioral issues related to a homeless child supersedes adopting AI.
[00:06:43] Speaker C: Right.
[00:06:43] Speaker B: It's some of these bigger issues. Right. So what Jim wanted to do is form a company that, again, focuses on inner city schools, helping them to adopt AI.
And instead of these administrators and teachers and principals having to be experts in generative AI, understanding the difference between Khan Academy, SchoolAI, what Google offers, how it can be integrated with Gemini and Chrome and some of these other tools, they don't need to know that. They talk to us. We do a free assessment and then we map out a strategy for them. So the strategy could pertain to: how do we use AI to automate lesson plans, how do we use AI to reach certain students who are behind in math but excelling in English, and vice versa. We map out a holistic plan, and we're technology agnostic, so we are not tied to one technology. We have partnerships with SchoolAI, with Google, with Khan Academy, and some of these other organizations.
And we pick and choose which technology to deploy based on the needs of the client. So in summary, we are a company that helps schools understand and adopt AI, but also we help them deploy it and we do training also. That's what we do.
[00:08:07] Speaker A: Okay, yeah. And that's simple enough that I understood. So.
But also, yeah, it sounds like a lot of the technology is in the background, so you are using those big players, but the schools just don't see, like, what's in the black box, almost.
[00:08:24] Speaker B: For the most part, yes. So again, once they adopt a specific platform, of course they know whether they're using SchoolAI or Khan Academy or whatever technology we recommend. They understand it, because we're going to train them on it.
But they don't need to have all this prior knowledge of what the capabilities are on each of these platforms. That's our job. And my role is to keep the CEO up to speed on everything that's new. So I'm constantly having meetings with him, updating him on changes to these large language models. How do they affect our business? How do they affect the schools where we are going to be deploying this technology?
What is new that's coming out, what's current, and what's on the horizon that can help one of these schools? Case in point, there's a new tool. You may be familiar with it, maybe not, but it's something called Gemini in Chrome, right? I hate that they used two branded products in the name, because everybody knows what Chrome, the web browser, is, and people think of Gemini as the large language model, the generative AI I can talk to. But a different product is called Gemini in Chrome. And it's sweet. What it does is, I don't want to say it's magic, but basically, once you turn it on, it can see everything on your desktop, everything on your screen.
So if you are in, you know, I don't know, Microsoft Word, but you can't figure out how to do a spell check, you can turn on the audio piece and say, you know what, Gemini in Chrome, how do you do a spell check? And it will say, hey, Eric, I see your mouse is at the bottom right. What you do is click on File, then click Edit, then click Spell Check, whatever, right? It will see your screen and tell you what to click.
And so what we're thinking about doing, potentially, and this is something that the CEO hasn't signed off on, is integrating that into the stack of things that we deliver to our clients.
So imagine I'm a teacher, and we're using SchoolAI as the platform to develop lesson plans. And the teacher gets stuck, because they don't do this stuff.
No need to Google it. Just turn it on and say, hey, I'm trying to get this lesson plan sent and it's not doing it. What's wrong? And it would say, hey, teacher, you're clicking on the left, but you need to click on the right. And then boom, they're done, right?
So we constantly want to stay ahead of the curve. Again, technology agnostic. We want to go with the best technology off the shelf stuff for our clients to make their lives easier.
[00:11:12] Speaker A: Man. Yeah, so many questions come from that.
I guess the first one is that I see you're very excited about this technology, but, you know, a lot of people aren't. They're a little bit afraid that it's going to take either, like their job.
There's also a little bit of controversy in the education space around AI.
Some people think it's, you know, good in terms of that customization approach that they can offer their students. But then some people think that it's not good, because studies show that maybe students are interacting too much with technology these days, and there's not that opportunity for them to make a mistake and learn from the mistake. So what do you think people should be thinking about this technology overall, and then also in the education space?
[00:11:59] Speaker B: Know, it's like any new technology, it can be used for good and bad. Like, who thought that there would be some term called cyber bullying, right? You think about when technology is like, oh, we're going to make this thing called Facebook, but be careful because you're going to get bullied. Bullied by a computer. What? It just seemed not possible in year 1998 to have the concept of cyberbullying. So there's always going to be upsides and always going to be downsides, but there's definitely going to be more upsides and than downsides. You know, some of the downsides. You know, I talk to teachers and, you know, especially some of the inner city teachers, and they see AI, particularly ChatGPT as the cheating tool. Because their first introduction to generative AI was the situation. I was talking to this teacher in Detroit and she was telling me that she had a, you know, C minus D student in English by which she's an English teacher.
And then all of a sudden, overnight, he produced this brilliant essay, right?
And it's like, wait a minute. No, no. This was back in 2023, when ChatGPT had first dropped in November 2022, around that time period. Wait a minute, this does not make sense. So in the minds of teachers, this is a cheating tool. And now they've got tools that allow you to check to see if generative AI made it. She knew off the bat. So that's the downside. But the upside is you've got a PhD-level tutor on your screen and in your pocket that you can take with you wherever you go. A PhD-level tutor in every subject, right?
Everything.
Every subject that you could humanly possibly want to learn, you've got a PhD-level tutor for. That's number one. Number two, it's not very difficult to train AI to teach me in my style. And if you allow me, I'd like to give you an example, right? So I'm gonna go off, but it's coming full circle, this example. There was an episode of a 1981 TV show. Do you remember WKRP in Cincinnati? That show from a long time ago.
[00:14:12] Speaker A: I don't think I watched it. Yeah, I remember it.
[00:14:14] Speaker B: So there was a DJ, DJ Venus Flytrap. And in a particular episode, there was this high school dropout named Arnold, right? He was a gang member, he was about to drop out of school, and he was talking to DJ Venus Flytrap. Venus Flytrap said, listen, brother, I want to keep you in school.
I'll make a bet. He's like, what's the bet? You know how you're doing in chemistry? I hate chemistry. I'm failing chemistry. So how about this, Arnold: I can teach you everything that you need to know about an atom in two minutes. And if I can do that, will you stay in school? He said, bet. So what DJ Venus Flytrap does: he said, okay, forget about the atom for a moment. I'm gonna tell you about some street gangs in a round neighborhood. So picture this round neighborhood, and you got these three gangs.
First gang called the Pros. Bunch of good looking, positive happy go lucky guys.
They hung out in the center of this round neighborhood. Also in this neighborhood you had another gang called the New Boys. And they had the New Boy hangout in the center of this round neighborhood. The New Boys just stayed out of trouble.
They were neutral. They didn't have no friends, but they weren't causing no problems. Then you had the last gang, the Elected Ones. They were some gangsters, right?
They hung around the outside of the neighborhood, hated the Pros, and always wanted to start problems. So the Pros figured that out and said, you know what? We're gonna make sure that we have equal numbers. So if there's three Elected Ones circling the neighborhood, we're gonna have three Pros. If there were five Pros, then there were five Elected Ones, right? So then after two minutes, Arnold says, man, your time is up, and all you told me about was some street gangs in a round neighborhood. He's like, that's the atom. What do you mean? He said, the Pros are the protons, the New Boys are the neutrons, that New Boy hangout is the nucleus, and the Elected Ones are the electrons. So he quizzed him. He said, so if I got 16 Elected Ones, I mean, 16 electrons, how many protons has it got? It's got 16.
Are protons positive or negative? Positive. What are the negative ones? Electrons. You got an A, bruh.
[00:16:32] Speaker C: Right?
[00:16:33] Speaker B: So at that point, science is not an abstract concept. It's something that can be relatable.
So, coming full circle, what the heck does that have to do with AI? Because AI can be easily trained, easily tweaked, to craft messaging and education in terms that the receiver, the student, can understand.
So now you've got, not only do you have this PhD level tutor in your pocket, you got one that understands your flavor.
That's right now, right? That's right now. That can be done right now. And it's scalable.
It's scalable. So that's what we're trying to convey to people in education, to policymakers, everybody.
This technology is not expensive, and it can be catered and personalized to reach the student. So I'm going to be quiet now.
[00:17:32] Speaker A: So, yeah, that's great. And I love the storytelling because I think a huge part of it is to make it more accessible. And what you just said, it really helps kind of boil down the benefit of your company.
So I love that.
I guess the counter to that, though, is there's been talks about there being bias in the data.
[00:17:53] Speaker B: Yes.
[00:17:54] Speaker A: And the bias, for example, where a model isn't trained to recognize black and brown faces. So the context that you were just providing, does that exist in the training model? Or is that something that you would have to create in order to provide that level of, you know, tangibility for the student?
[00:18:13] Speaker B: We as humans are biased, and AI is being trained on data that is being provided by humans.
So it's not like the AI researchers are saying, okay, don't recognize black faces. We're talking about AI being able to look at and recognize humans, but you happen to gravitate to things that are familiar.
So if you're training it on a million faces and 950,000 happen to be Caucasian males, right, it's something that's not intentionally biased, but it's life, right? So that is something to be concerned with.
[00:18:59] Speaker C: Right.
[00:19:00] Speaker B: And one of the things our founder, Jim Beaks, talks about is the ethics behind this, ethically deploying AI. Now, I don't deal with the ethics part. He's got some ethics experts. I think Dr. Abdullah is part of that.
I stay in my lane, strictly the technology, but the ethics play is big, right? So when we have these conversations with schools: the technology that we deploy has been vetted from an ethics perspective by Dr. Abdullah and some of the other PhD folks who are on the team, not myself, I stay in my lane, to make sure that we can put our seal of approval on anything that we deploy. So yes, that's part of it. Now, does AI Moonbeam have the magic sauce that can eliminate bias in AI? Absolutely not. But we are a buffer against it to a certain extent, though it's always going to exist. Like, case in point: I dabble with creating AI music videos, right?
And I would create these videos. I do a lot of urban, well, not urban, but Afrobeats videos.
And then I also did a few '80s rock videos, right?
And so one of my buddies was asking me which one is more fun to do. I said, I like to do the Afrobeats videos more, but AI can't dance worth a damn, right? I mean, dancing like how we get down.
He was like, well, what about the rock videos? Oh, those are easy, right? Because all you're doing is jamming on the guitar. And my buddy was like, that's biased. That's racist. I'm like, well, you might see it that way, but I don't have evidence for or against. I don't necessarily see it as outwardly biased, but it is biased, right? He's like, they're racist. They're inherently making it so AI can't dance like black folks but can play a guitar like a white guy. I'm like, I don't necessarily believe that, because one of my Afrobeats music videos took place on a golf course, because I like golf. There's one part where all the ladies are dancing on the golf course in nice little cute golf outfits, and for part of it, I just wanted to have a guy do a golf swing.
[00:21:28] Speaker C: Right.
[00:21:29] Speaker B: I'm thinking like, oh, it's going to be able to do a golf swing pretty easily because of who trained it.
[00:21:34] Speaker C: Right.
[00:21:35] Speaker B: Oh, my God.
The dancing is way better than the golf, absolutely. I probably spent two hours trying to get it to just do a simple golf swing. It could not do it to save its life. Now, golf is dominated mostly by older white men.
[00:21:53] Speaker C: Right.
[00:21:53] Speaker B: And it couldn't do a golf swing to save its life. So is it biased against, you know, a sport dominated by white folks? I think you and I both would say no. So a lot of times, you know, we as African Americans, we think sometimes that because things are not necessarily working in our favor, it's, oh, it's the man against us. And sometimes it's just messed up because it's messed up, and there's no Illuminati plot to try to get us. You know what I'm saying?
[00:22:19] Speaker A: So.
[00:22:20] Speaker B: So, coming full circle back to the bias piece: there's definitely bias.
From my understanding and the research that I've done, the bias is a product of humans being humans, not a plot by the man to keep us down. And again, that's one man's opinion, but that's what I see. But bias does exist. And, you know, the company that I'm on the board for is taking steps to mitigate it and ethically deploy AI. We have certain steps. Again, is it perfect? No. But we want to take accountability for that piece of our business, working with schools.
[00:23:01] Speaker A: Yeah, you know, I definitely appreciate that, and I'm glad that there's some forethought in terms of the ethics for your company. So I definitely appreciate that.
And I agree with you in terms of, maybe it's not intentionally biased. Right. Because when you think about it, it's the people that are training the models. And like you were saying, you're around people like yourself.
[00:23:26] Speaker B: Exactly.
[00:23:26] Speaker A: You know, you can only kind of relate to things that you know about. Right. And so I think that's probably why some of the underlying data is not able to recognize black and brown faces.
But, you know, I appreciate also that you're targeting the inner city schools, because I think that your company could be a conduit in terms of, like, that approach that you were mentioning where you're saying, hey, we're going to make it more relatable because I think there's still opportunities to tweak the models. Right. And even maybe it's a prompt to say, you know, here's, you know, like, I know there's like an ability with the prompts to provide references.
So you could even provide that reference of that movie or that episode you were mentioning, right, and say, I want you to build more content like this. Right. And you might just kind of have to keep tweaking it and providing more references, but at a certain point it's going to be able to provide some relatable training material for the students, and then it could be replicated. And teachers might be doing it differently in this school versus that school, but they can kind of use the same model when they find out that it works. Right. So I love that.
But yeah, I appreciate it. And so in terms of, like, a real-life practical approach, have you guys gotten into any schools yet, or are you still kind of in the seed stage?
[00:24:49] Speaker B: Yeah, no, so we have, so we formed a partnership with a school, it's New Horizons Charter in New Jersey.
So the exact stage, because again, I'm not in the day-to-day, I'm on the board, I advise the CEO, so I'm not in the weeds on
exactly what stage we are in. But we are in one school, and we've got a lot of headway in the state of New Jersey. The Newark area is where we're making a lot of headway, because that's kind of where our main hub is located. But we've partnered and we're doing things with the New Horizons Charter School.
[00:25:25] Speaker A: Okay. Yeah. And I know you must feel really good, because I always love it when I can give back and do something that I know is going to make an impact on the community. And you'll definitely have those testimonials coming out, and you'll be able to really show those real-life results on your website, and that'll just help more people want to come on board.
So, yeah, I look forward to seeing your, your traction.
[00:25:50] Speaker B: Thank you. Absolutely.
[00:25:52] Speaker A: Yeah. And so have you had any trouble, though, kind of getting that traction, getting people to know about your platform? And if so, how are you working to right that?
[00:26:01] Speaker B: Yeah. So, as you'd imagine with any startup, there's going to be a bit of growing pains as we try to, you know, understand our market a little bit better. One thing that's interesting: I'm going to be doing some speaking. I'm originally from Detroit, Michigan, and I have the possibility of doing some speaking at some schools in Wayne County, some of the community colleges in Wayne County, and potentially doing some things there. And what's different when you start talking about, you know, community colleges, junior colleges, four-year universities: they have budget.
[00:26:33] Speaker C: Right.
[00:26:34] Speaker B: And they're potentially talking, you know, about, hey, well, let's talk after you come in and speak, let's do some things, blah, blah, blah. One of the challenges, as you can imagine, is a lot of these inner city schools don't have the budget.
[00:26:47] Speaker C: Right.
[00:26:49] Speaker B: We could easily pivot to the more affluent school districts and probably make a killing overnight.
[00:26:56] Speaker C: Right.
[00:26:57] Speaker B: But we feel that the mission is more important at this point than, you know, hitting a lick and making a lot of dollars because we can.
[00:27:06] Speaker C: Right.
[00:27:08] Speaker B: So one of the challenges is budget.
[00:27:11] Speaker C: Right.
[00:27:11] Speaker B: That's number one. Another challenge, as I mentioned, is a lot of educators feel that this is the cheating tool that they've been fighting against for the last few years, because these young people are further ahead on the technology than some of these teachers who are my age. I'm in my 50s, and a lot of folks who are teachers in their 50s, you know, they're not really on the cutting edge.
[00:27:31] Speaker C: Right.
[00:27:32] Speaker B: So, you know, those are some of the challenges, and we're coming up with plans. Again, the upside is AI is in the media. It's everywhere. Even if you want to not think about it and shut it off, if you turn on the news, you're going to be hearing about AI, how it's affecting everything, how there's going to be potential massive job loss. So you've got to pay attention.
So when we have these conversations, it's not like, what's AI? I think people understand it, at least the basics of it.
But again, to come back full circle: the challenges are, you know, a lot of these schools don't have budget, and there are a lot of misconceptions that this is the cheating tool. Why should I want to deploy the cheating tool so there can be more cheating? So there's an education factor for us, to educate some of these administrators and teachers about the benefits. So that's a little tough sometimes.
[00:28:26] Speaker A: Yeah, yeah. I've been pivoting my business into the AI space as well, and the approach I've been trying to take is helping people identify areas where they are spending maybe too much time and where they can use AI to improve their productivity. So that could be, in terms of the students, I think I heard, I saw on your website, that you're going to help them with lesson plans and things like that, things that are more repetitive, that don't necessarily need a hands-on approach, so they can spend more time with the students with that hands-on approach and don't have to spend as much time on the repetitive stuff.
But I know also, like, I actually went back to Detroit a few months ago and I attended this Code2040 conference, and there was a Michigan State professor there, and he was anti-AI.
[00:29:19] Speaker B: Really.
[00:29:19] Speaker A: He was like, I stopped using it. And I said, I think the train has left the station. So I think now we need to figure out how to use it, you know, the best we can, so that we're not left behind.
But one of the reasons why he was against it is he thought, especially in the education space, that eventually AI will replace teachers. And I know we probably shouldn't be saying this on the podcast, but that it would replace teachers, especially in the inner city schools, where they don't have the budgets. And, I'm sorry.
[00:29:48] Speaker B: Go ahead. I'm sorry.
[00:29:49] Speaker A: And so I actually just wanted to hear your thoughts on that.
[00:29:52] Speaker B: Yeah. So I'm trying to figure out what crack this brother smoking right now. I'm just playing.
Okay, so a few thoughts, a few things to unpack there. On the collegiate level. Yes. There's going to be some massive layoffs because young adults don't need to be in a $3 million auditorium in order to learn.
[00:30:15] Speaker C: Right.
[00:30:16] Speaker B: And you don't need a human to necessarily learn.
[00:30:21] Speaker C: Right.
[00:30:22] Speaker B: So from a college level, yeah, you don't really need professors. Now let's come back down, back to the beginning, to kindergarten teachers, where we are in 2025. There's no way in H-E-L-L that AI can replace a kindergarten teacher. Cannot happen. Will not happen for the foreseeable future.
[00:30:41] Speaker C: Right.
[00:30:41] Speaker B: So you think about it: from kindergarten to probably around the fifth or sixth grade, you need humans, right?
AI for a kindergarten teacher is very different than AI for say a fifth grade teacher.
[00:30:59] Speaker C: Right.
[00:30:59] Speaker B: We still need the fifth grade teacher, but AI is going to be more relevant for the fifth grade teacher. So back to the kindergarten teacher: really, it's just for lesson plans only. AI is not made to interact with a five-year-old. Period.
Humans, especially the human brain at that age, need to have interaction with other humans, meaning adult humans, to nurture them. Again, this falls outside of my area of expertise; Dr. Abdullah and some of these other smart folks at AI Moonbeam can write a thesis on this stuff. But we're not trying to replace the kindergarten teacher. We're giving her tools to make it easier for her to get down on the carpet and interact with those five-year-olds.
[00:31:45] Speaker C: Right.
[00:31:46] Speaker B: So from K through 5, there's really not going to be any disruption. As you start to get to sixth, seventh and eighth grade, and especially into high school, there may be a little bit. But it's a situation where there is currently a teacher shortage because of classroom sizes. And as I talk to teachers, I was talking to some teachers and they were telling me how they may have a class of, you know, 25 students.
[00:32:14] Speaker C: Right.
[00:32:14] Speaker B: Five that are just brilliant, that really shouldn't be in the class, five that need a lot of help and then the 15 in between that the teacher should be focused on. But what she spends most of her time focusing on is those five that really don't get it.
[00:32:33] Speaker C: Right.
[00:32:34] Speaker B: So what AI could do is meet the child where they are. Very similar to that WKRP analogy that I use, with the gang references for that neighborhood.
That analogy relates well to some people, and some people were like, what are you talking about?
[00:32:55] Speaker C: Right.
[00:32:55] Speaker B: And the gangs, the guns, I'm not hearing anything you say.
[00:32:58] Speaker C: Right.
[00:32:59] Speaker B: It just doesn't resonate.
[00:33:00] Speaker A: Right.
[00:33:00] Speaker B: AI would understand that I shouldn't bring up gang references to Becky, but your mom might be good with it. Right. So you know what I'm saying?
[00:33:11] Speaker A: So, yeah.
[00:33:13] Speaker B: So, all that to say, to answer your question more directly: K through 5, AI is not going to be a mass disruptor in regard to massive layoffs. It's not going to happen.
As children and students get older, we'll see some disruption, especially when you hit the collegiate level.
You know, you're not going to need to build these $3 million auditoriums to teach people. You're not going to have to build $10 million dormitories to house and feed them.
[00:33:49] Speaker C: Right.
[00:33:50] Speaker B: So it's going to be a different model. My kids are a seven-year-old and a nine-year-old, and by the time they get to college, it's going to look very different.
[00:33:59] Speaker A: Yeah, I mean, I believe you.
Because this technology is accelerating so, so fast.
[00:34:06] Speaker B: Yes.
[00:34:06] Speaker A: And actually, you know, education is so expensive. Like, college education is so expensive.
So I think, just like any traditional industry, we've seen a lot of paradigm shifts and reinvention of traditional business models, like Uber, you know, with the taxis. So maybe it's time to disrupt the education space, and this technology is helping to do that. And I love it in a way, because I think it democratizes education.
[00:34:35] Speaker B: Absolutely.
[00:34:36] Speaker A: Somebody in Africa, right, that might not have access to, you know, all the top-tier education, they could leverage some of these technologies that you're building. I appreciate it. Yeah, agreed.
[00:34:47] Speaker B: Agreed.
[00:34:49] Speaker A: Okay. So I also would love to hear about any unexpected lessons you've learned along the way.
[00:34:56] Speaker B: Yeah. You know, in regards to unexpected lessons, I think one of the surprises, and I mentioned this a little bit earlier, was that a lot of these teachers see generative AI and these large language models as the cheating tool.
And that is the beginning and end of it, period. It's used to cheat. That's it. How can this cheating tool be used to help students when it gives them all the answers and writes all their papers?
That's all. That's it.
[00:35:26] Speaker C: Right.
[00:35:26] Speaker B: So I was a little bit surprised that that was the mentality, right? That was a big eye-opener for me. That was probably the biggest.
And I didn't expect it. The preconceived notion of, hey, I don't know what this technology can do, but it's a cheating tool that I don't want to have to deal with because I've been fighting it for two years, that was the biggest lesson I had to learn to deal with.
[00:35:55] Speaker A: So, yeah, I know you're just starting out, but are there any new features that you all are planning to roll out soon?
[00:36:02] Speaker B: Yeah, just one of the features, I think I mentioned it, in regards to the Gemini and Chrome feature that will allow a teacher to communicate directly with AI that tells them where to click when they're creating a lesson plan. So if they're creating a lesson plan, or, hey, you know what, they click a button: hey, I can't find Johnny's lesson plan, can you help me? Sure, click here, click there. Or having it integrated with Gemini, which can actually search your email and all these other work streams.
[00:36:37] Speaker C: Right.
[00:36:38] Speaker B: So between, you know, School AI and Khan Academy and some of these other things, and Google, Gemini and Chrome, having this all kind of interconnected to look like one seamless application that we tie together and consult on.
And then also, in addition, building custom GPTs and agents. Again, for those in your audience who may not know what a custom GPT is: inside of ChatGPT, you can sort of train the large language model on certain tasks you want it to accomplish and certain data that you want to feed it to do certain things.
[00:37:24] Speaker C: Right.
[00:37:25] Speaker B: And then also AI agents. For those of your audience members who may not know what an AI agent is, think of it as an entity, an AI entity, that can go out and do things on your behalf. An example would be, you know, you might say to your AI agent: you have my credit card, find the cheapest trip to Atlanta the weekend of December 12th, and book two airline tickets if you find something under $500.
[00:37:51] Speaker C: Right.
[00:37:52] Speaker B: And it would go out and do its thing.
So we're talking about potentially building agents for these schools, depending on the needs of the school. So all that customization is something that we're talking about. We haven't really officially rolled out anything, but it's something that we're willing to work with schools on. Again, we're here to help them accept and integrate AI technology.
We'll educate them on it. And another leg of that, the customization of creating custom GPTs and custom AI agents, is something we could potentially do.
[00:38:30] Speaker A: Yeah, I mean, I think that's so smart. And it seems like, for all the tech folks you talk to, the next big thing in AI is the agents, so it makes sense. And they're also becoming more accessible, like those custom GPTs. But you really do need to know how to use it.
You can't just kind of pick it up and start using it, like most technologies. You know, I can go all the way back to VCRs, and I didn't read any instructions. I just plugged it in and started working with it.
To get the best, though, you do need to have that expertise. And I'm sure a lot of schools are going to start tapping into AI Moonbeam to get the right approach for their schools. So I look forward to.
[00:39:13] Speaker B: I hope so.
[00:39:14] Speaker A: Yeah, I think so. I think so.
You know what? I'll end with a couple more questions. One of the things I would love to understand is how you prevent hallucinations, because I know a lot of times the AI agents can also kind of go rogue a bit.
So how do you prevent that from happening? Are you planning on them mostly being copilots to the teachers, versus them autonomously operating on things?
[00:39:43] Speaker B: Once we do the assessment, deploy the technology, and do the training, it's the teachers. They get a comfort level and they're on their own.
[00:39:51] Speaker C: Right.
[00:39:53] Speaker B: So we do the training and we make sure that they're good before we kind of cut the umbilical cord and let them do their own thing.
In regards to hallucinations, again, that kind of goes back to the platforms between Khan Academy and school AI and things like that.
We vetted those companies because they're. Because of their track record and because of the rigorous testing that they go through.
So from my understanding, from the tests we've done and the conversations we've had with these partners, that is not a showstopper that we're concerned about, because it has not been a problem.
[00:40:31] Speaker C: Right.
[00:40:31] Speaker B: Based on the use case.
[00:40:33] Speaker C: Right.
[00:40:34] Speaker B: Because again, these companies are partners, and remember, we're technology agnostic. We go with the best player. You know, we're like a football team. We put the best player in the game.
[00:40:42] Speaker C: Right.
[00:40:43] Speaker B: And roll with them.
[00:40:44] Speaker C: Right.
[00:40:46] Speaker B: So. But that has not been a problem.
And, you know, of course, these technology platforms want to eliminate hallucinations.
[00:40:55] Speaker C: Right.
[00:40:55] Speaker B: Giving false answers confidently. And some of these platforms give the most confident wrong answer. You know, I've been on ChatGPT and Gemini, and it says, yeah, this, that and the other, yeah, the sky is green. I'm like, bro, the sky isn't green, bruh. Oh, you're right, the sky isn't green. But you said it so confidently. Of course, after you correct them, they're like, oh, yeah, you're right. Like, hell yeah, I'm right, bro.
[00:41:16] Speaker C: Right.
[00:41:17] Speaker B: But in regards to hallucinations, again, it's an overarching industry issue, but in our narrow use case of dealing with schools, it's something that we have not seen be a big issue yet. So hopefully never.
[00:41:33] Speaker A: Okay. Yeah. And I appreciate that you're using vetted technology. And the use cases, I think, are important too, because you probably put parameters around it, and there's only so much that it can pull, even in the underlying data and the source material. So that helps it to eliminate those hallucinations.
[00:41:56] Speaker B: Exactly.
[00:41:57] Speaker A: Yeah. So I really appreciate you being here, Eric, and thanks for the note that we met in 2011.
[00:42:06] Speaker B: Yes, we did.
[00:42:07] Speaker A: Okay. Yeah, thanks for reminding me.
But if someone wants to try AI Moonbeam, what's the best way to get started?
[00:42:15] Speaker B: Definitely our website, aimoonbeam.com. Come check us out. There's a place where you can fill out a form and contact us, and we'll get right back to you and figure out how we can help you deploy AI in your school.
[00:42:33] Speaker A: Okay. And then I'm assuming that they can follow you on social media from the website. Are you on LinkedIn?
[00:42:41] Speaker B: Yeah, I'm definitely on LinkedIn. If they were to do a search for Eric Hamilton, you know, it's linkedin.com/in/eham06, that's my LinkedIn name. So they can do a search for Eric Hamilton, or /in/eham06 is how they can follow me or connect with me on LinkedIn.
[00:43:05] Speaker A: Okay. And I also will put that in the show notes to make it a little bit easier for them to find you. But yeah, definitely appreciate you being here. And one final question.
If you could leave our listeners with one bold prediction or call to action around AI, what would it be?
[00:43:22] Speaker B: I think the bold prediction is that AI is going to be very similar to the Internet in 1999, when it allowed small companies to reach global markets.
So if you think about, say, 1985, and I'm selling sneakers out of my basement and I wanted to sell sneakers in China or Pakistan or South Africa, it's not really possible. That's expensive.
But fast forward into the 2000s, and you do it all day, every day. It's really easy for you to have customers in South Africa and China and Pakistan. Right.
That's the Internet. So the next big wave is AI.
AI is going to. And this is where it kind of gets crazy, so bear with me while I go through this explanation.
[00:44:29] Speaker A: Yeah, take your time.
[00:44:32] Speaker B: I'm going to take a step back for a moment to digital.
[00:44:35] Speaker C: Right.
[00:44:36] Speaker B: If you think about digital technology, the cost of storage, the cost of bits and bytes, has gone down to virtually zero. So what do I mean? Back in the day, you remember those floppy disks, either the three-and-a-half or the five-and-a-quarter-inch floppy disk? I know some of your young people are like, what the heck is he talking about? For those who don't know, just Google five-and-a-quarter or three-and-a-half-inch floppy disk. They could hold a million bytes.
And when I got my first computer in 1983, in the ninth grade, that was huge. You could put one megabyte of data on these disks. My mom was telling me about how, back in the day, that was unheard of, right?
When she was in junior high school, a five-megabyte hard drive was about the size of your living room. It was big. And now, in 1985 terms, 5 megs is just 5 floppy disks. When you fast forward, I've got 5 terabytes, which is, you know, not a billion, it's 5 trillion bytes, which would be 5 million floppy disks, right? Five trillion is 5 million million.
So 5 million floppy disks would definitely fill this room, and I can fit that in my hand, right? So I like to say that the cost of bits is virtually zero: trillions of bytes that you can fit in your hand, when back in the day you needed a living room to fit that type of data.
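(For show notes readers who want to check the back-of-the-envelope math here, a quick sketch using the round numbers from the conversation, one million bytes per floppy and five terabytes taken as five trillion bytes:)

```python
# Round numbers from the conversation, not exact media capacities:
# a floppy disk holds about a million bytes,
# and a 5 TB drive holds about 5 trillion bytes.
FLOPPY_BYTES = 1_000_000
DRIVE_BYTES = 5 * 10**12

# How many floppies would it take to hold the 5 TB drive?
floppies = DRIVE_BYTES // FLOPPY_BYTES
print(f"{floppies:,} floppy disks")  # prints "5,000,000 floppy disks"
```

Five trillion divided by a million is five million, so "a room full of floppies that fits in your hand" checks out.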
So here's the bold prediction. The cost of atoms, what we're made of, what this shirt is made of, that ceiling lamp, the cost of atoms is going to approach zero, right? And what do you mean, it's going to approach zero? So basically, we're going to have a future where everything is going to be virtually free, right? It's like, what are you talking about? So why are things expensive? Because you have to pay humans.
Okay, so what about when you start to automate things and build robots? Well, you can get robots to build robots.
So then, well, how do you get the metal out of the ground to make a robot? Well, a robot can do that. Well, what about the energy, the power plant? The sun. Well, you can get robots to harvest the sun and to drill for gas.
So eventually, just like bits, atoms are going to approach zero. So then what does that mean for us? You know, as far as, why are we going to be here? Why are we going to work?
[00:47:16] Speaker A: Work.
[00:47:16] Speaker B: So the bold prediction is, it's going to get real bad, then it's going to get real good. We're about to hit some crazy Mad Max dystopia, but then we're going to hit utopia, based on the fact that we as humans are not going to have to work. And people might be like, what is he talking about? So if you rewind the clock back, say, 10,000 years, our ancestors on the plains of Africa, these Homo sapiens, were just as brilliant as us. Probably more brilliant, because I know some crazy people today.
And if I were to explain to them how I live, it would not make sense, because they were hunter-gatherers, right? That's how they lived. If they wanted to eat, they had to go kill that bison, that buffalo, or that gazelle, and they had to go, you know, gathering yams. That's how you do it. Now, if I want a steak, I'm going up to Whole Foods and handing the man a piece of plastic. I show him a piece of plastic and he hands me all the steak and yams and everything else I want to eat. So my ancestor would say, well, Eric, how are you doing this? Well, I am a digital analytics consultant. I have this magic screen that I talk to people on, and they give me money, right? That makes no sense, right?
So if we fast forward 100 years from now and you talk to your descendants, your descendants are going to say: you had to work to pay for college? College wasn't free? You had to work to pay for food?
[00:49:02] Speaker A: Food.
[00:49:03] Speaker B: You had to work for a mortgage. And then if you didn't work hard enough, they put you out on the street.
What? That makes no sense. If you didn't pay taxes, you were punished. You were locked in a cell.
If you didn't pay the government. It's going to be a foreign concept, just like our ancient ancestors trying to understand how I drop that platinum Visa on my ribeye steaks, right? It doesn't make sense. So here's the big thought: things are going to go toward zero. The cost of everything is going to be zero. And we're going to get to a place in society where it's not about the acquisition of stuff, it's about the bettering of oneself.
That is going to be the challenge. It's almost going to be like the gamification of life, right? So as I play a video game, there was one game that I used to play in the early 2000s. It was called World of Warcraft. It was like Dungeons and Dragons Online, right?
And it was not impressive to gather money in the game so you could acquire all this wealth. That wasn't impressive, because it was relatively straightforward. You go and you conquer, you get gold. Big deal. What was impressive was to say, you know, I'm a level 80 warrior. Ooh, that's impressive, right? Or I'm a, you know, level 90 spellcaster or whatever, right? Those things were impressive. So that is what is going to be impressive for humans 100 years from now. Not to say, I'm a billionaire and I've got 16 yachts. It doesn't matter. The money's not going to matter later. So the big idea is, the cost of atoms is going to approach zero.
Humans are not going to work because they have to work. Humans are going to work to better themselves, the gamification of life, and to be a better person.
That is what I think is going to happen.
[00:51:02] Speaker A: Wow, that is a, that is definitely a bold prediction.
And I, I don't know how people are going to feel about it getting worse before it gets better.
[00:51:12] Speaker B: Yeah.
[00:51:12] Speaker A: But it sounds like you think about 100 years from now and it's going.
[00:51:15] Speaker B: To be completely different, probably even sooner. I, I say 100 to give us a little window, a little buffer. So like, oh, 100. I'm not gonna be around 100 years.
[00:51:26] Speaker C: Right.
[00:51:26] Speaker B: But it may be as soon as 20 or 25 years from now that we see the dystopia part, before we hit the utopia.
So.
[00:51:35] Speaker A: Wow.
[00:51:35] Speaker B: Yeah.
[00:51:36] Speaker A: And so really, it's almost sounding like you really don't. Someone actually texted me today and asked, do you think an MBA is worth it? So I gave her my whole logic, but it almost sounded like you're saying that we might not necessarily need to go to college.
[00:51:52] Speaker B: Education is still going to be important, let's say, post this dystopian period.
It's going to be about bettering yourselves.
It's not going to be about, you know. I think the institution of higher education is really going to go away. It's going to be about lifelong education. Lifelong bettering of oneself will be the norm. You know, people won't be impressed by things. If you want a Lamborghini, you can get one. Big deal.
The robots can go harvest the steel to make one and get you one. You can drive one. Big deal.
Living in a big house is not going to be a big thing. Everybody lives in a big house. Not a big deal. Being the best person you can be is going to be impressive, because the acquisition of stuff is not going to be impressive.
[00:52:40] Speaker A: Wow, that's a big change for a country that's founded on capitalism, and founded on work to live, or live to work.
So I appreciate it, and yeah, thanks again for joining us. It was great talking, you know, one techie to another. We definitely have to keep in touch. And thanks again.
[00:53:03] Speaker B: Thank you. I appreciate it and thank you to your audience.
[00:53:06] Speaker A: That's a wrap for the Zora Talks podcast. If today's convo put new ideas on your radar, be sure to follow the show and share this episode. New episodes drop on the 15th of every month.
For show notes and links, head to Zora Digital Online and connect with us on social media.
Zora Digital is a digital marketing agency based in Chicago, delivering AI-driven strategies with human-centered results.
I'm Yawande. Thanks for listening and we'll see you next time.