Can AI ever really capture the soul of human creativity, or is it destined to be forever soulless? James "I'm the King of the World" Cameron seems to think AI's got the chops, as he joins Stability AI to push the boundaries of filmmaking magic. Are we on the verge of an AI-fueled creative implosion, or is this just the reset button that art desperately needs? Plus, are we closer to the singularity than we dare to imagine, as AI begins to supercharge its own hardware tech? Will our world be consumed by AI-generated everything, or are we just getting started on a new journey of creativity? Don't worry, we'll break it all down for you on this eye-opening episode of They Might Be Self-Aware! For more info, visit our website at https://www.tmbsa.tech/
00:00:00 Intro
00:01:58 The Dominance Of Midjourney In AI Image Generation
00:11:09 AI's Quest To Capture Human Emotion
00:19:17 James Cameron And The Future Of AI In Filmmaking
00:26:29 AI's Impact On The Value And Perception Of Art
00:34:38 Are We Closer To The Singularity Than We Think?
00:36:49 Wrap Up
Hunter [00:01:58]:
Is Midjourney still your favorite image generator?
Daniel [00:02:17]:
Yeah, definitely. It's the one that I go to first and foremost. I have a couple of the offline generators. Flux, which came out not too long ago, very cool, very impressive. I've got a bunch of LoRAs for specific art styles for a couple other offline generators, but it's mostly Midjourney. 90% of the time, I just go there.
Hunter [00:02:43]:
What, didn't Stability have one?
Daniel [00:02:45]:
Yeah. So, like, Stable Diffusion is powering a lot of the different models out there, or something that's kind of Stable Diffusion based.
Hunter [00:02:54]:
I never ran Stable Diffusion locally. Did you ever?
Daniel [00:02:59]:
Yeah. Yep. I've got. Not at the moment, actually. I think I deleted them all, but I had a couple different versions of a couple of these different models. Some people made it really easy to put up a UI that the average Joe is able to use, as long as you have a beefy enough computer to run this, or don't mind waiting, you know, five minutes to get an image or something like that.
Daniel [00:03:20]:
It's pretty easy to get set up with these kinds of things, and you can make some really cool results. And there are features that Midjourney doesn't have, or didn't have for a while, like face swapping, or enhancing one specific image. Now, Midjourney lets you change one region, for example, or expand it out. But the offline generators, the ones that the kind of hacker community put together, those kinds of features existed there first. And if you really want fine-tuned control over the AI art that you're creating, depending on if you like using that phrase for an image generated by AI, that would be the way to go for the finest-tuned control. For me, who just wants to see a picture of a dog with a top hat? Midjourney.
Hunter [00:04:12]:
Yeah, I definitely use Midjourney more than any of the others. And Flux is probably the one that I use the second most often, via X's Grok interface. I've yet to run it locally. It's on the list of things to do, along with playing with the LoRAs, where you can fine-tune it on a particular style. And I do have some examples. I've worked with a number of artists in the past, and I have large libraries of their art and their permission to potentially train LoRAs, to see if we can't generate things in a similar style and genre.
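A quick sketch for curious listeners: the LoRA idea Hunter and Daniel mention is a small low-rank update learned on top of frozen model weights. This toy NumPy illustration uses made-up sizes and is not any particular generator's real architecture; it just shows why the adapter is so much cheaper to train than the full model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen base weight of one layer (e.g. a 1024x1024 projection matrix).
d = 1024
W = rng.standard_normal((d, d)).astype(np.float32)

# LoRA: instead of updating all d*d entries, train two thin matrices
# whose product is a rank-r update, with r much smaller than d.
r = 8
A = rng.standard_normal((r, d)).astype(np.float32) * 0.01  # "down" projection
B = np.zeros((d, r), dtype=np.float32)                     # "up" projection, zero-init

alpha = 16.0  # scaling hyperparameter from the LoRA paper
W_adapted = W + (alpha / r) * (B @ A)

# Because B starts at zero, the adapted layer initially matches the base model.
x = rng.standard_normal(d).astype(np.float32)
assert np.allclose(W @ x, W_adapted @ x)

full = W.size
lora = A.size + B.size
print(f"trainable params: {lora:,} vs full fine-tune {full:,} "
      f"({100 * lora / full:.2f}%)")
```

With these made-up sizes, the adapter is under 2% of the layer's parameters, which is why a hobbyist can train a style LoRA on one GPU while the base model stays untouched.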
Daniel [00:04:48]:
Yeah, that's very cool. Especially the consent part from the artist.
Hunter [00:04:52]:
I don't know if I ever told you this, but I. So I actually subscribe. Midjourney has a magazine. I put them all here.
Daniel [00:04:57]:
What?
Hunter [00:04:57]:
There's a lot of issues. I've been a day one subscriber and. Hold on.
Daniel [00:05:03]:
Not everyone's watching this podcast. Please tell everyone what you just showed me.
Hunter [00:05:06]:
I showed you. It's a good call. A large stack, I don't know, 20-25 issues of the Midjourney physical, real-world magazine. I was just thinking in my head that I bet there are some people who don't even really know what a magazine is, because magazines are definitely on their way out. It's a dying thing. You see them the most at airports and bookstores and.
Daniel [00:05:38]:
And 7-Eleven-type places. Sure, magazines are still everywhere, but I think I see what you mean.
Hunter [00:05:44]:
Do the youth read magazines? Like, when I was twelve, I wanted a magazine. I wanted Thrasher, the skateboarding magazine, and surfing magazines and computer magazines and 2600, the famous hacker zine. But do twelve-year-olds today, 13-year-olds, 14-year-olds, do they have any interest in. I don't know. Do you know? I don't know any twelve- or 13-year-olds to ask.
Daniel [00:06:08]:
I don't. I don't personally know a lot of twelve- and 13-year-olds that I could ask, but I suspect they don't, the kids these days.
Hunter [00:06:15]:
And that feels old. But anyways, back to Midjourney.
Daniel [00:06:18]:
So, for our audience: in the Midjourney magazine, is it just, here's a bunch of images that somebody made with Midjourney?
Hunter [00:06:26]:
That's like any text.
Daniel [00:06:28]:
Okay. So it looks like most of the pages.
Hunter [00:06:29]:
Most of that, yeah. And then at the bottom we do have the prompt that went into it.
Daniel [00:06:34]:
Okay, that's really interesting.
Hunter [00:06:36]:
So you can flip through them and see interesting visual styles, and it gives you inspiration for things that you might want to try. So that's one place that I find it valuable. I also just think it's interesting, watching the evolution.
Daniel [00:06:50]:
Is it just image, prompt, image, prompt? That's the whole magazine, or is there anything else in there?
Hunter [00:06:54]:
There are some light articles about one particular creator, going into a bit of detail about how they're using it. But one thing that's interesting, this isn't.
Daniel [00:07:02]:
Exactly an entirely AI-generated magazine. It's not an artifact from nothing.
Hunter [00:07:09]:
98% AI. A person came up with a.
Daniel [00:07:12]:
Prompt and said, show me a picture of, you know, whatever. And then that gets a bunch of upvotes and so on from people who are going on the site and saying, that was a cool one. And that kind of percolates up to the top. And then once a month they take some of these top images and their prompts and presumably just completely automatically lay that out. One person does an interview. Do you think it's like an audio interview? Or is it just a Discord conversation that went back and forth with one of the people?
Hunter [00:07:46]:
More likely the latter than the former. I don't know. One other interesting thing is that every issue also prints the version on the spine. So it's issue twelve, but version 5.0, 5.1, 5.3, being the version of Midjourney.
Daniel [00:08:04]:
Was used for those images. Interesting. Okay.
Hunter [00:08:07]:
It's on the spine. So it's also very interesting to see the evolution. This magazine only started like a year ago or something, but there is a huge difference between the images that were coming out a year ago and the images that are coming out today. Very noticeable.
Daniel [00:08:19]:
When this was made for the first time, presumably people were involved and said, let's make this. Okay, great. A layout artist or whatever, using some sort of Adobe software, probably put together what the template of a page looks like.
Hunter [00:08:36]:
And here's PageMaker.
Daniel [00:08:38]:
Yeah, here's how this all comes together. But after a really small amount of human effort, is this then completely hands-off, where we just get our one little five minutes of a person talking back and forth on Discord with one user?
Hunter [00:08:56]:
The layout format of the magazine has not changed in the last year. So it definitely is something that could be heavily, heavily automated. Whether or not it is, all right, that's TBD, I guess. Midjourney famously has a very small team that builds the core product. What is it, somewhere between six and twelve people work at that company? And maybe it's changed since last I investigated, but a really, really small team. Now, what I don't know is, and I wouldn't be surprised, maybe there's an external company that does the magazine, and sure, they hired them, and maybe they have 50 people working on the magazine and nothing's automated. And from Midjourney's perspective, it's just a marketing cost.
Hunter [00:09:40]:
But the Midjourney team, if they were the people actually putting together the magazine, and maybe they are, you could automate the vast majority of this.
Daniel [00:09:48]:
Last week we were talking about Dungeons & Dragons and the worry that I personally have of the human touch being taken out of creating an adventure, and even just being an editor of a book, or a layout artist, or the artist for the art in these books, and every other job associated with them. And then here's a physical magazine that you're subscribed to that has nearly no human intervention. Yeah, I'll admit that feels a little scary to me.
Hunter [00:10:20]:
Yeah. And I've been thinking about this disruption of the creative industries. And the question is sort of, what is the future of human creativity? Obviously AI has made remarkable strides in recent years. Just like with the last year of the Midjourney magazine, from the first issue to the last issue, there's no comparison in the quality of what it's producing. And now we have AI that's composing music. We've had Suno.
Daniel [00:10:51]:
Yeah.
Hunter [00:10:52]:
And there's a new version of Suno since last we talked about it. I don't know if you played with the newer version.
Daniel [00:10:57]:
I have.
Hunter [00:10:57]:
It's continuing to improve, obviously. Generating visual art, like Midjourney. Writing scripts, that's been a hot topic in the Hollywood arena, just like you were talking.
Daniel [00:11:09]:
About LLMs, where our own job here, if you want to call it a job, can be automated away.
Hunter [00:11:16]:
We pretend that one doesn't exist.
Daniel [00:11:17]:
Okay, great.
Hunter [00:11:18]:
It's just easier that way. But AI is increasingly taking on roles that were once considered the, you know, I don't know, exclusive domain of human talent. I don't think we can deny that.
Daniel [00:11:34]:
We can gnash our teeth all we want.
Hunter [00:11:36]:
We can try to stop it. We can say that you want your farm-to-table Dungeons & Dragons campaigns, that's fine. But there are dungeon masters losing their jobs because of AI. So I don't know, what does that mean for the artists and musicians, the filmmakers and all of the creators? Is AI, then, simply enhancing creativity, or is it threatening to replace it entirely? And maybe even more importantly, there's this question: can a machine truly replicate the depth of emotion and the soul that human creators bring to their work? Because arguably, that is art, right?
Daniel [00:12:28]:
And we talked a few episodes ago about having a robot, humanoid or dog or whatever, on the scene in especially dangerous situations, to be a reporter in the field. Would that help capture some of the raw emotion of the scene? I think that AI generators are better than I am at writing emotional sort of dialogue, though maybe over the long term the consistency starts to fall off. Still, if you take the average person, not a professional writer, and you were to run a writing contest on a subject between ChatGPT and them, I think we're at a point where the average person is going to be run circles around by AI writers. AI artists, too. Boy, if I were to show what some of my art looks like, it's at the stick-figure stage, or barely past that. When I try my absolute damnedest to do the best job that I possibly can at drawing, and I really sit down and take my time, it looks awful. It still looks like I was trying to be bad on purpose. Obviously, Midjourney does a better job making art than I do. I know some people object to calling that art, or making, or certainly the two words together. But if you want something that's going to be pretty to look at, that's vaguely about a prompt it had been given, a billion times out of a billion I'll take Midjourney over me.
Hunter [00:14:02]:
Yeah, I think that's the point, is that this isn't a theoretical conversation that we're having. This is happening right now. There was a story in the news last week about a famous Indian filmmaker, Ram Gopal Varma, who announced that going forward he's only going to use AI-generated music in his films. And he's got a new film coming out relatively soon. And this caused a bit of a stir, you know.
Daniel [00:14:30]:
Has to be just an attention-grabbing headline, though, right?
Hunter [00:14:33]:
It's a little bit of that.
Daniel [00:14:34]:
He's doing that as a stunt. He has to be.
Hunter [00:14:37]:
I don't know. So he throws it down. He talks about how annoying it is to work with musicians, and how they're flaky and they don't show up on time, and it's really expensive.
Daniel [00:14:51]:
All right.
Hunter [00:14:52]:
And I guess, hungover, he played with Suno over the weekend. And he's just like, yeah, you know what? That's good enough. And I think it goes to one of your points, where you were talking about how Midjourney is way better than the average person. So when you think of generating scores, we think of Hans Zimmer and Michael.
Daniel [00:15:13]:
Giacchino, I think that might be vaguely how you pronounce his last name. He did, like, the Star Trek movie. He's done a bunch of them. There are some very famous composers, but there aren't that many that are at the top of their field. And your sort of interpretation here is that maybe those people are better, at least for now. But then one or two rungs down the ladder from them, eh? Suno's good.
Hunter [00:15:39]:
Yeah. Danny Elfman was the other one I was just thinking about.
Daniel [00:15:41]:
Sure. Yeah.
Hunter [00:15:43]:
When we listen to these things, we say, well, that's no Hans Zimmer. That's no Danny Elfman. Who's the Star Wars composer?
Daniel [00:15:51]:
I should absolutely have this.
Hunter [00:15:52]:
John Williams.
Daniel [00:15:53]:
John Williams, yeah. Jurassic Park, Superman.
Hunter [00:15:57]:
Yeah. I just rewatched Jurassic Park. Excellent. I watched the full thing. It really holds up. That's an aside, but that's practical effects for you.
Daniel [00:16:05]:
None of that AI. Well, there's a little bit of CG.
Hunter [00:16:08]:
We agree that, you know, the AI doesn't compete with John Williams today, but maybe it competes with, like, the average score you hear watching CSI: Insert Random City Here today, or a low-budget.
Daniel [00:16:25]:
Commercial from a local appliance store.
Hunter [00:16:29]:
Yeah. And so we have this famous Indian filmmaker saying, all right, going forward, I'm just going to use AI-generated music. Screw it, I'm all in.
Daniel [00:16:44]:
Well, this is one person who's doing that, and I think as a stunt. Maybe there's a future where more famous directors and so on start getting involved in these kinds of things. But at least in the West. So you're talking about India. Maybe there is a different cultural acceptance of some of these things, or a lack of awareness, where there isn't as much of a hubbub yet or something. I am not aware of it. But in the West, in America, I think everyone's riled up enough that if you were to see, I don't know, John Williams say, for the next one, I'm not going to do it, I trained Suno off of my music and we're going to see what happens. People would be upset.
Hunter [00:17:26]:
Well, it's definitely sparked a lot of debate, as these stories tend to do. And again, maybe he's just an agent provocateur, or maybe it's legitimate. But I think, at a minimum, the fact that we are debating it is a testament to the incredible capabilities today of the AI. That, yeah, it is a concern.
Daniel [00:17:46]:
It wouldn't be a concern if it wasn't good enough to be a threat. That's what you're saying. Okay.
Hunter [00:17:50]:
And it raises that concern about the loss of jobs, something we talk about a ton here, and the potential devaluation of human creativity, in this instance in the music industry. But you know, on the stunt front, there was another story in the last week. James Cameron, or Jim, as his friends call him, a very famous American filmmaker. I was debating, what is he best known for? The thing I just.
Daniel [00:18:16]:
Said, americans would be upset if like american filmmakers were to be doing this.
Hunter [00:18:22]:
Yeah. The person who created the film that maybe inspired this podcast, the Terminator films, I'm referring to.
Daniel [00:18:28]:
Yeah.
Hunter [00:18:29]:
And Titanic. And I guess I think Avatar is what he's best known for these days by the masses. But anyways, James Cameron recently joined the board of Stability AI, you know, potentially still a Midjourney competitor, in order to move forward the state of the art in generating images and video for use in film. And he's previously spoken out against this, urging tons of caution about what AI's impact on humanity as a whole might be, focused in on the filmmaking industry. But, and by the way, super, super, super rich guy, I don't buy that this is a paycheck thing.
Daniel [00:19:17]:
He doesn't need.
Hunter [00:19:19]:
This isn't a publicity stunt.
Daniel [00:19:20]:
He doesn't need Stability.
Hunter [00:19:21]:
Avatar three, because he joined the board of Stability. He famously funded a giant mission to go down and actually see the wreck of the Titanic, where he just spends billions and billions of his own dollars to go on these journeys to, you know, scratch itches he has. So.
Daniel [00:19:41]:
So you're saying it's not a play for money for him. He really believes that this is the future and that's something to be embraced.
Hunter [00:19:50]:
Yeah, it appears that he's embracing the technology and that he sees its potential in filmmaking. And, of course, Stability is over the moon. They see his involvement as quite the get. Right?
Daniel [00:20:00]:
Yeah.
Hunter [00:20:01]:
And the goal here is to push the boundaries of what's possible with visual effects.
Daniel [00:20:06]:
So, for a money play, on a similar sort of note: Runway. Runway ML, where you can make little clips of video. It's pretty good. It's in the top tier of video generators that are out there right now, especially because OpenAI still hasn't given us access to, gosh, I don't even remember what it's called now. Sora. Runway is going to be offering filmmakers up to a million dollars if they use their thing to make movies. I think the idea is, so the grants are between $5,000 and a million dollars, but they're trying to fund short films and feature-length movies using this technology as much as possible, presumably up to the entirety of the movie.
Daniel [00:20:53]:
Well, we've got James Cameron working with Stability AI, makers of Stable Diffusion and so on. Runway's giving people up to a million bucks to make a movie using just AI. Is it game over for human-made movies, then? Is that what you're kind of getting at? Or are we saying that this is a tool that we're going to be using more and more, and it's a growing share of the jobs that are lost? This becomes more of the workflow that everybody uses all the time?
Hunter [00:21:27]:
Yeah, I think, yeah, pretty much exactly that. I guess the difference is that AI is progressing past the point of just being a tool for efficiency. It's definitely in the realm now of creative partner, something you've discussed, leveraging it as a creative partner. And then in some cases today, it's becoming a replacement for human creators. Those are the edge cases, but I think it's reasonable to suggest that we're going to see more edges being.
Daniel [00:21:58]:
Eaten away at pretty quickly, though.
Hunter [00:22:00]:
Yeah. But, you know, there's this fundamental question behind all of it, which is still: can AI truly capture human emotion? Art in all its forms has always been a medium for expressing the human experience.
Daniel [00:22:19]:
Sure. Emotions in literature class, you read some book, and then you have to think about what the author was thinking about that even if the author has famously said, I wasn't thinking of anything in particular, you still have to write your essay on what the person is thinking.
Hunter [00:22:33]:
On top of essay or the imagery.
Daniel [00:22:36]:
Does that become even more of a meaningless exercise, or a meaningless exercise at all, shall we say, if a person didn't write it? What was the computer really trying to evoke about the human experience when it wrote such and such?
Hunter [00:22:53]:
Well, yeah. The critics argue that while AI can mimic styles and patterns, it lacks consciousness. It is not self-aware, and therefore it cannot, by definition, infuse any genuine emotion into its creations.
Daniel [00:23:11]:
But to the point that I just made, sometimes an artist, I'm especially thinking of a writer, can write something and say, there was no deeper meaning to this. I just wrote this because I thought it'd be cool, or because it fit in the story. And then lit teachers for years and years after say, and this is the thing that they were trying to show. An AI could write something that has no deeper underlying meaning that could still stir emotions in people.
Hunter [00:23:39]:
Absolutely. It is possible, but I don't know whether it's intentional or not. And some would argue that that famous, we'll call them an artist, who created something that they said had no meaning, but you say has tons of meaning and wrote 18 essays on, perhaps that meaning was the result of their human experience up to that point. That was the combination of the nature and nurture of their human experience as they expressed it. Yeah, they didn't spend much time thinking about it, and yet you could interpret it that way. But there's this subtlety and nuance that, at least at this point, we believe is part of these human-created works.
Hunter [00:24:23]:
And it's hard to imagine the AI bringing that same subtlety and nuance. It can't bring it from the raw human experience, because it doesn't have the raw human experience. At best, it has interpretations of it.
Daniel [00:24:36]:
Which is the words of the entirety of the Internet, which includes a lot of the human experience, as people pour out their feelings onto digital pages.
Hunter [00:24:46]:
And there's also a concern that, as we move into this future where we are relying on all this AI-generated content, it's generating our movies, our music, our D&D campaigns, our podcasts. But does everything just become homogenized? Since the AI models are all trained on existing works, aren't they kind of going.
Daniel [00:25:09]:
To mean that there's no new genre, no new frontier to conceptualize?
Hunter [00:25:16]:
They're going to perpetuate the current trends and styles rather than fostering true innovation. I don't think we've really seen true innovation come out of AI yet. If you have an example, I'd love.
Daniel [00:25:29]:
To hear it. It's made some wild art, but has it really innovated in the field?
Hunter [00:25:35]:
Has it started a new movement? Right. Okay, wow, this is the new style, the new genre, the new way we're going to present things, and it originated from AI solely, not from collaborating.
Daniel [00:25:49]:
The skyscrapers haven't started being made in a new style because a new architectural wave started from AI.
Hunter [00:25:57]:
Yeah. And so this growing trend of AI in these creative fields is causing a lot of tension in various industries that see their jobs becoming less valuable at a minimum, and potentially going away. Even in our own field of software development and AI development, there are concerns. There was an article in the Wall Street Journal this week about how hard it is for CS grads to get a job right now. And these are CS grads from top schools with top GPAs, and they're just not getting offers.
Daniel [00:26:31]:
Whereas, why would you need to hire a junior software development engineer when I can use Claude and plop out a whole project that would have taken a kid a week and a half to put together, and it took me two minutes?
Hunter [00:26:43]:
There was a moment I remember, I'm going to say around 2010, 2012, when I was working at a startup where, first, we would hire anyone. You didn't have to have any experience. You just had to, like, want to learn. And then more than that, there was a $10,000 bonus if you, as a current employee of that company, could identify a person to come work there, where they were going to get a giant fat salary and you'd get $10,000, because you couldn't hire people. You couldn't convince people to come and work on these software development projects even if they didn't know anything. I may have captured a few of these referrals. But that has definitely completely flipped. Anyways, there's this growing tension in lots of different sectors. We talked previously about, is it SB 1047, the California bill that has passed their legislature, which puts a whole bunch of requirements on these large foundational models, defined primarily as costing more than $100 million to train. But now, interestingly, the actors in Hollywood, and even SAG, have come out to try to convince Governor Newsom, who's yet to actually sign this into law, that we need to make this a law. And I'm a little befuddled by why the actors want this to exist.
Hunter [00:28:05]:
I think it's just that they want the beginnings of some legislative framework that restricts how far AI can go. And I've read the bill. What it restricts is that AI can't take over the world, basically. It can't cause mass casualties and mass Armageddon. That's all it's doing. There's nothing in it that's protecting the actors. But they want the beginnings of some.
Daniel [00:28:29]:
Some structure, something on the books of maybe not everything AI all the time.
Hunter [00:28:35]:
Yeah. And so there's a fear that if AI advancement goes unchecked, it's just going to undermine these professions that rely on human creativity and expression, because we're definitely seeing the beginnings of this just now. Beyond just what I would call the economic impact of people losing their jobs, the integration of AI into the creative process raises a lot of ethical issues, which we've discussed on many episodes here. We start with the intellectual property rights, because all of this wonderful AI material, or not-wonderful AI material, was all trained on other artists' works, and by and large they were not compensated for that. That's part one. Part two: who owns the work that's actually generated by the AI? I think the US has said you.
Daniel [00:29:38]:
Can'T copyright AI generated anything.
Hunter [00:29:41]:
This is true. But if you go in and modify it, if you use it in a collaborative way, you can still copyright it. And similar to the copyright angle, you can't get a patent on anything that the AI invents. The AI wrote the patent; the AI cannot hold a patent. But there's the collaboration angle. Anyway, who owns the output? We don't.
Daniel [00:30:04]:
We don't know if an AI could get a patent. I don't mean legally is allowed to, because it's not. But if that weren't a problem, and an AI could successfully write up a patent, and it gets accepted, it gets filed, someone had to cut the check to, you know.
Hunter [00:30:22]:
Yeah.
Daniel [00:30:22]:
Go through the whole process. Doesn't that literally mean an AI is creating something new, is being creative? Unless the people at the patent office were asleep at the wheel, you're not supposed to be able to get a patent without having made a materially new thing. Even if everything is built on the shoulders of everything that has come before, and a lot of stuff is kind of just mixing up the existing ingredients, you shouldn't be able to get a patent unless you really actually made something new.
Hunter [00:30:55]:
I would agree with that. But thus far the examples have all been someone heavily prompting the AI around an area, and the AI then writing that patent and figuring out how it might accomplish something. Just like you go to Claude and say, build my Kubernetes cluster, someone went to Claude and said, I want a patent for automating the generation of Kubernetes clusters based upon forecasted workloads. And it spit out a patent.
Daniel [00:31:25]:
And the idea there being, because the AI didn't think of that prompt, it isn't creating something.
Hunter [00:31:32]:
Yeah, and with AI in the creative fields, there's also the question of how it impacts the value of art. If art can be generated instantly by machines, does it therefore diminish the value of art created by humans? Or we're just going to flood the entire market with AI generated content, making it harder for human artists to get any recognition?
Daniel [00:31:58]:
I'd say that's absolutely true in at least two different ways. One, and I have friends that have specifically complained about this, and then I went and saw exactly what they were talking about, and now I'm complaining about it too: there are a variety of things that, if you search for them on the Internet today, most of the image results you get back are going to be AI-generated stuff, and not necessarily super coherent or good AI-generated stuff. So the Internet is being polluted, and we're literally diluting it. If there is a real artist that has something in there, it's actually harder to find real people that have made real things on certain subjects already. So that's diminishing that. Also, there are a bunch of people who may not be the best artists in the world, but they're on Fiverr or Upwork or whatever, and they'll help make, whether it be something for your company, or you want to have a character drawn for your D&D campaign, or whatever it ends up being. And then you just go to Midjourney instead. And that's taking money out of people's hands, for sure, at that point.
Daniel [00:33:07]:
So I think I very much agree with you that it's actually diminishing the literal value of art created by humans. The one way I would disagree with you is that I think a lot of modern art is absolutely a scam.
Hunter [00:33:26]:
The giant white painting, the giant white.
Daniel [00:33:28]:
Painting with a dot in the center or whatever. I million percent think.
Hunter [00:33:33]:
You can't tell me AI could ever come up with that.
Daniel [00:33:35]:
I think it absolutely could. And I think this would be diminishing the value of real art in a very positive way, because if some guy puts one stripe on a canvas and, oh, that sold for $15 million, there's no way that's not money laundering. And so I want that job to be taken away by AI, because it's so infuriating. I'm sure some people really love a lot of modern art; I'm sorry if I made you upset. Most of that stuff I absolutely cannot stand. It seems like it's just money laundering. So I want that art diminished in value, but I'm sad about the rest of it being diminished.
Hunter [00:34:17]:
I do think that it could potentially lead to a giant reset, where we flood the Internet with AI-generated everything, whether it's genius or crap or whatever, just everything. If 99% of the Internet isn't real stuff, yeah, it's all fake. It's all fake.
Daniel [00:34:34]:
The dead Internet theory made truly manifest.
Hunter [00:34:38]:
It's true. And no one disagrees anymore, because you can't. Everything you search for, you just get back pure garbage. Maybe that will force us to actually innovate, to reset and re-examine what we value in these things, and find a new future for it all. I don't know. And we'll continue this discussion on our journey to self-awareness, because you've been listening to They Might Be Self-Aware. We might be, it might be, maybe not. Actually, I'll go on a slight aside there. I was going down a thread, related, unrelated, that we've already achieved the singularity.
Hunter [00:35:19]:
It's just not widely recognized, but we've already achieved it. And this started because I was reading about how Google has some of the largest context windows around, these million-token context windows, and it's really, really fast. And so people are asking, how are they doing this? How are they doing it so economically? How is it so fast? Someone replied that it's their hardware.
Daniel [00:35:40]:
Yeah. They're not using, like, Nvidia. They're using.
Hunter [00:35:42]:
All proprietary hardware that they built, their TPUs, their tensor processing units. And if you go and read about the creation of this hardware, they used a reinforcement learning model to do the chip layouts, maximizing the efficiency. And these are layouts that no human, when we talk about innovation, no human ever came up with these designs. And now the reinforcement learning model keeps generating better and better layouts that they then go and produce, which is where a lot of their.
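For the curious: chip layout is, at its core, an optimization problem over placements. Google's TPU work used deep reinforcement learning for this; the toy hill-climb below, with a made-up netlist, only illustrates the layout-as-optimization idea Hunter describes, not Google's actual method:

```python
import random

# Toy "floorplanning": place 8 blocks on a 4x4 grid so that connected
# blocks end up close together (minimize total Manhattan wirelength).
random.seed(42)

GRID = 4
BLOCKS = 8
# Netlist: pairs of blocks that are wired together (invented for the demo).
NETS = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7), (0, 7), (1, 5)]

def wirelength(placement):
    """Sum of Manhattan distances between connected blocks."""
    total = 0
    for a, b in NETS:
        (xa, ya), (xb, yb) = placement[a], placement[b]
        total += abs(xa - xb) + abs(ya - yb)
    return total

# Random initial placement on distinct grid cells.
cells = [(x, y) for x in range(GRID) for y in range(GRID)]
placement = random.sample(cells, BLOCKS)

start = best = wirelength(placement)
for _ in range(5000):
    # Propose swapping two blocks' positions; keep the swap only if it helps.
    i, j = random.sample(range(BLOCKS), 2)
    placement[i], placement[j] = placement[j], placement[i]
    cost = wirelength(placement)
    if cost <= best:
        best = cost
    else:
        placement[i], placement[j] = placement[j], placement[i]  # revert

print(f"wirelength: {start} -> {best}")
```

An RL agent replaces the blind swap proposals with a learned placement policy and a richer reward (wirelength, congestion, power), but the objective-driven search loop is the same shape.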
Daniel [00:36:15]:
The stronger hardware has more juice to think of an even better layout.
Hunter [00:36:20]:
And one of the key requirements for achieving the singularity is that the AI starts improving itself. The suggestion is that the AI has started improving itself, and here it's helping.
Daniel [00:36:34]:
Improve layouts for hardware that powers it. But it's not saying, here's the new code of my new mind that will be ten times as powerful, unless it is. Maybe it is, or at least to.
Hunter [00:36:49]:
Because that's what it's optimizing: the efficiency of the tensor processing units. Like, I want to be able to process things more efficiently, and that is the problem that it's solving. I want lower power consumption. I want to think faster, consume less, think faster. More, more, more. So maybe, maybe we already are self-aware. We don't know yet, but we're.
Hunter [00:37:06]:
We're on the case, gosh darn it, and we will find out. And if you want to find out, like and subscribe. It's the easiest way. It's free, for now. Maybe not. Maybe one day we start charging. You never know. I read that podcasts are starting to charge for access to their archives.
Hunter [00:37:21]:
Apparently, that's a new trend. I don't know. For today, we're free. One-day-only sale. Hit the subscribe button and you're locked in. But, yeah, good conversation. Thanks, Daniel.
Daniel [00:37:31]:
Yeah. Hey, thank you, Hunter.
Hunter [00:37:33]:
All right, we'll be back soon for another episode of.