Send us a text

A wool-wrapped humanoid that cleans your kitchen, answers the door, and remembers your routines sounds like a dream—until you realize much of the “intelligence” is a person in a VR headset training it from afar. We dive into Neo’s carefully staged reveal and the Wall Street Journal’s hands-on, separating slick marketing from what the robot can actually do, and why teleoperation is both the shortcut to usefulness and the biggest risk to your privacy.

We break down the specs and the spin: a quiet, tendon-driven body that’s light and “safe,” fingers with human-level strength, cameras with wide depth of field, and a battery that still needs breaks. The promise is freedom from chores and a friendly companion in your physical space. The reality—for now—is “robotic slop”: imperfect but helpful actions that need human oversight, plus a data pipeline that captures the most intimate parts of home life to make models smarter. That’s not inherently evil, but it’s a social contract most buyers don’t read: remote operators, household video, app approvals, no-go zones, and the assumption that guardrails never fail.

We go beyond convenience to the human layer. What happens when kids bond with a machine that outlives its chassis? When an elder’s independence depends on a subscription? When the robot becomes the family’s memory—who owns that archive? We trace the path from household helper to warehouse worker to defense platform, and why the training data from immaculate living rooms matters far outside the home. Along the way, we test the ethics: safety around knives and stoves, access to doors and drawers, and the uncomfortable reality that a mobile camera with hands is a different species of device than a smart speaker.

If you’re AI-curious, privacy-conscious, or just wondering who this is really for at $20,000, this conversation offers a clear-eyed guide to the tradeoffs. Listen for practical guardrails you can set, the benchmarks that should be non-negotiable, and the questions to ask before you let a company’s robot live with your family. If this episode sparks something, share it with a friend, hit follow, and leave a review with your take on home robots—would you let Neo in?

Attorneys For Freedom Law Firm
Attorneys For Freedom Law Firm: Attorneys on Retainer Program

Podpage
With Podpage, you can build a beautiful podcast website in 5 minutes (or less).

The Mick and Pat HQ
Check out our website.

Audible
Sign up for your free 30-day trial of Audible now & get your first book for free!

Karl Casey a.k.a. White Bat Audio
Music by Karl Casey @WhiteBatAudio

Primary Arms
Primary Arms is who we trust for our firearm related purchases!

Disclaimer: This post contains affiliate links. If you make a purchase, I may receive a commission at no extra cost to you.

Support the show

00:00 - Halloween Week And Neo Reveal

04:08 - First Impressions And Marketing Tactics

08:33 - Control, Teleoperation, And Adult Implications

13:28 - Specs, Safety Claims, And Real Capabilities

20:53 - Pricing, Access, And Who It’s For

27:21 - WSJ Hands-On: Autonomy Versus Human Pilots

34:01 - Training Data, Privacy, And Remote Access Risks

40:26 - “Robotic Slop,” Usefulness, And Limits

44:41 - Safety Scenarios And Edge-Case Failures

51:01 - From Household Helper To Conflict Machine

57:11 - Tech Hype, Old Promises, And Social Tradeoffs

01:03:51 - Relationships, AI Psychosis, And Human Skills

01:09:46 - Children, Attachment, And Generational Memory

01:16:41 - Animatrix Parallels And AI Trajectories

01:25:01 - Societal Schisms, Resistance, And Adoption

01:30:51 - Closing Thoughts And Listener Feedback

WEBVTT

00:00:52.540 --> 00:00:56.060
Welcome to the Mick and Pat Show.

00:00:58.780 --> 00:01:07.820
We uh we've already recorded and thrown up an episode for Halloween, but this is the actual recording the week of Halloween.

00:01:07.900 --> 00:01:08.379
Mm-hmm.

00:01:08.939 --> 00:01:11.900
And uh coincidentally.

00:01:14.460 --> 00:01:16.299
It begins today.

00:01:17.020 --> 00:01:17.659
What's that?

00:01:17.740 --> 00:01:19.020
What's beginning?

00:01:19.740 --> 00:01:21.180
The revolution.

00:01:21.420 --> 00:01:21.980
Hmm.

00:01:22.379 --> 00:01:25.099
Did you ever watch The Matrix Pat?

00:01:25.740 --> 00:01:26.379
I did.

00:01:27.580 --> 00:01:32.700
What was the name of he who would lead the revolution against the machines?

00:01:33.020 --> 00:01:34.460
I believe it was Neo.

00:01:34.620 --> 00:01:35.580
Exactly.

00:01:37.900 --> 00:01:42.060
And now, today, it was released for order.

00:01:42.219 --> 00:01:44.219
To order to your home.

00:01:45.340 --> 00:01:48.460
A robot by the name of Neo.

00:01:49.099 --> 00:01:49.740
Hmm.

00:01:50.140 --> 00:01:53.180
And uh I have a video here for us to watch.

00:01:53.340 --> 00:01:54.859
I haven't watched it all the way through yet.

00:01:54.939 --> 00:02:02.140
I paused it so I could save some of my reactions to be raw and uh authentic here with you.

00:02:02.859 --> 00:02:09.019
Um but this is Neo, the home robot that can be ordered today.

00:02:09.980 --> 00:02:14.300
And uh, you know, I do think there's some shenanigans going on with it.

00:02:15.340 --> 00:02:16.699
Some smoke and mirrors.

00:02:16.860 --> 00:02:22.780
I think there's some smoke and mirrors, just like with the Tesla bots, Optimus or whatever they call it.

00:02:23.340 --> 00:02:29.900
Um But we'll see here as uh they show us the intro and stuff.

00:02:30.060 --> 00:02:46.219
Regardless, having another entity in the home that is not family or friend, when more than likely you're paying for the machine but you're also on a subscription model, right?

00:02:47.100 --> 00:02:58.140
So it's probably gonna be a corporation just having access to your home and not just a camera, but like being able to walk around, open doors, stuff like that.

00:02:59.099 --> 00:03:01.099
So here we go.

00:03:01.259 --> 00:03:02.699
The introduction of Neo.

00:03:05.740 --> 00:03:09.099
My name is Bernt, and today we're launching Neo.

00:03:10.620 --> 00:03:15.340
Everyone thought it'd be a Tony Stark type, just by the way, but it's always these guys with funky accents.

00:03:15.500 --> 00:03:15.659
Yeah.

00:03:15.900 --> 00:03:21.020
Like even Elon Musk, do you remember Elon Musk like first time seeing him in a commercial and you're like, what is that accent?

00:03:21.180 --> 00:03:21.340
Yeah.

00:03:22.060 --> 00:03:23.659
Like he's really lost a lot of it.

00:03:24.780 --> 00:03:30.060
And he kind of just sounds like a you know, kind of like a nerd nowadays with a very plain American accent.

00:03:30.699 --> 00:03:38.219
But I remember Elon Musk always had like such a distinct South American not South American South African accent to me.

00:03:38.860 --> 00:03:43.259
I bet this guy's Finnish, Swedish, Scandinavian for sure.

00:03:43.500 --> 00:03:44.300
Yeah, yeah.

00:03:44.939 --> 00:03:45.420
Why?

00:03:46.539 --> 00:03:49.180
He's kind of got some spend all days out.

00:03:49.659 --> 00:03:55.340
You know, I saw that he's either jacked or he needs to get a robot that can massage those moobs off of him.

00:03:55.580 --> 00:03:58.060
What if he is just actually crazy jacked?

00:03:58.460 --> 00:03:58.780
You know what I mean?

00:03:59.659 --> 00:04:01.500
Like he's got an insane bust.

00:04:01.659 --> 00:04:02.780
Yeah, I don't know.

00:04:02.939 --> 00:04:16.939
I'm always careful nowadays, because I did see a guy that I thought had man boobs, and then, you know, I realized when we were just doing some outdoor work, like, oh, you just have like giganormous pecs.

00:04:18.379 --> 00:04:18.939
Right.

00:04:21.740 --> 00:04:23.659
Neo, the robot nanny.

00:04:23.980 --> 00:04:25.259
Let you live with it.

00:04:25.420 --> 00:04:25.740
Yeah.

00:04:25.980 --> 00:04:29.019
You're just not telling me what does it do?

00:04:29.340 --> 00:04:30.139
Like my chores.

00:04:30.379 --> 00:04:32.460
I I just leave and I come back and they're all done.

00:04:32.699 --> 00:04:33.740
Are you kidding me?

00:04:33.980 --> 00:04:34.300
No.

00:04:38.220 --> 00:04:45.500
So it's robots just hanging out with these people behind them, putting their library books away in their library.

00:04:45.819 --> 00:04:48.939
The robot in the pajama suit with the two black eyes.

00:04:49.340 --> 00:05:00.699
Honestly, like, just for people who don't know, like, just listen along because it is pretty much a montage of uh making this robot look like it's a nostalgic part of a memory.

00:05:00.860 --> 00:05:02.060
Like it's been there all along.

00:05:02.220 --> 00:05:02.939
Don't you remember?

00:05:03.100 --> 00:05:03.420
Oh, yeah.

00:05:03.659 --> 00:05:05.579
Like that's kind of the vibe they're going for.

00:05:06.060 --> 00:05:11.500
Um, but and for those of you watching, of course, you you see the video, but this is also just available on YouTube.

00:05:11.740 --> 00:05:15.019
You can look up uh Neo the Home Robot.

00:05:20.300 --> 00:05:25.019
Just people getting ready to leave the house and the robots there to take care of everything.

00:05:25.339 --> 00:05:26.540
It looks like a guy in a suit.

00:05:26.699 --> 00:05:28.139
Is that a guy in a suit for this commercial?

00:05:28.379 --> 00:05:29.100
No, it's not, dude.

00:05:29.339 --> 00:05:29.740
Watch.

00:05:29.819 --> 00:05:30.540
It gets it.

00:05:31.259 --> 00:05:33.180
The robot has clothes on.

00:05:33.819 --> 00:05:35.420
Yeah, well, you know why?

00:05:35.660 --> 00:05:40.939
Because it looks like a skinned person underneath.

00:05:41.500 --> 00:05:43.500
Wait until it shows you how it's assembled.

00:05:43.579 --> 00:05:46.060
I know they show like an animation.

00:05:46.780 --> 00:05:47.980
Look at a dancing person.

00:05:48.220 --> 00:05:51.420
Oh, he's not just doing chores, he's actually he's playing games with the family.

00:05:51.660 --> 00:05:55.740
Dude, uh so you can see the guy's wearing a headset and controllers.

00:05:56.379 --> 00:06:01.579
So you're obviously able to take control of the robot.

00:06:01.819 --> 00:06:03.660
Bro, um right now, right now.

00:06:03.819 --> 00:06:11.180
My first thought when I see it dancing like that, with someone controlling it, making it dance...

00:06:11.740 --> 00:06:14.780
How long till people are having it like like fuck them?

00:06:15.019 --> 00:06:15.259
I know.

00:06:15.500 --> 00:06:20.220
Like, you know, like how long until it's like, alright, dance, but now just like dance on my lap, Neo.

00:06:21.100 --> 00:06:22.220
And how long?

00:06:22.379 --> 00:06:25.100
I can tell you how long: it's already happened.

00:06:25.259 --> 00:06:30.620
Yeah, yeah, it happened before it went public, they're like, we gotta figure out how fuckable this thing is.

00:06:31.420 --> 00:06:33.899
Okay, no, but like here's the here's the next thing.

00:06:34.060 --> 00:06:38.620
So the first thing I see when I see it dancing, I'm like, oh yeah, people are gonna be having sex with this thing.

00:06:38.699 --> 00:06:47.660
And it's not gonna be like, it's not gonna be good sex, it's gonna be like, it's gonna be people just like asking, can I, rather than should I?

00:06:47.819 --> 00:06:49.180
You know what I mean?

00:06:49.500 --> 00:06:52.780
Uh but then this one where it shows the guy controlling it.

00:06:53.579 --> 00:06:56.780
Okay, how long until you're like being cucked?

00:06:57.019 --> 00:07:01.339
How long until you're being cucked and like someone else is just controlling the robot having sex with your partner?

00:07:01.500 --> 00:07:02.300
I don't know, dude.

00:07:02.540 --> 00:07:04.459
That's gonna be the crazy thing, dude.

00:07:04.699 --> 00:07:10.780
That's gonna be the real like insane thing, is like other people could take control of it and be in your house.

00:07:11.259 --> 00:07:12.220
Imagine this, man.

00:07:12.379 --> 00:07:22.459
You get to like put on the headset and like hand controls, and you can pay a subscription fee, and you when you get in, you can walk around and be in this mega beautiful house.

00:07:23.579 --> 00:07:31.259
And you get to like rent it for an hour just to imagine what it'd like to live there, you know, or something like that.

00:07:31.420 --> 00:07:40.540
Or if someone's like acting there, like it's got actors in it, and they're just acting like, you know, it's your home, and you know, you have someone who could act like your girlfriend or boyfriend.

00:07:41.259 --> 00:07:42.699
It's gonna be crazy, dude.

00:07:42.860 --> 00:07:50.220
I'm saying right now, like we this is gonna be way more than just freeing us up to spend more time on TikTok.

00:07:50.620 --> 00:07:52.939
This is gonna be like, oh yeah.

00:07:53.100 --> 00:07:58.860
Um, whoever wants to have the virtual experience of like banging my wife, 10 grand.

00:07:59.500 --> 00:08:00.540
Oh my god.

00:08:00.860 --> 00:08:04.939
But you need a Meta headset, a Quest VR headset.

00:08:09.100 --> 00:08:09.740
Dude.

00:08:11.740 --> 00:08:26.060
I do love that like truly the choosing to shoot it on film and like making it look like it's always been with your family since the 70s, part of the house, you know, you inherited it.

00:08:26.860 --> 00:08:28.780
That is definitely the marketing way to go.

00:08:29.420 --> 00:08:33.259
Working really hard on making what is Neo a reality.

00:08:33.500 --> 00:08:37.660
But to me personally, this started when I was like 10 years old.

00:08:37.899 --> 00:08:49.500
I grew up reading these beautiful books and watching these beautiful sci-fi movies where the future was all about how we as humans we really focus on the things that matter to us.

00:08:49.660 --> 00:08:58.539
You know, Rosie the Robot would be ensuring the family had time, or Data would help humanity explore the galaxy.

00:08:58.860 --> 00:09:08.379
And I really hope that when you get your Neo now, we can help give you back some of that time so you can really spend it on what you feel is very meaningful to you.

00:09:10.620 --> 00:09:11.980
Look at that when she's awesome.

00:09:12.139 --> 00:09:16.299
So this is how it's being assembled, and of course it's copyrighted music, but who cares?

00:09:18.779 --> 00:09:19.819
My fingers.

00:09:20.619 --> 00:09:21.740
Look at the cables.

00:09:22.539 --> 00:09:23.259
Ain't no sense.

00:09:23.740 --> 00:09:25.500
That cushiony muscle.

00:09:26.619 --> 00:09:34.460
It's like muscles, everything that is contracting to squish the way it would need to is made out of this.

00:09:34.539 --> 00:09:35.819
Uh it looks like a purple mattress.

00:09:36.059 --> 00:09:42.299
Yeah, like a helix or purple mattress, like the yep, the layer that on over its metal body.

00:09:42.619 --> 00:09:46.619
Yeah, and then like this the way this knit wool moves.

00:09:47.019 --> 00:09:48.220
Watch the way it moves here.

00:09:51.019 --> 00:09:51.500
Unreal.

00:09:51.819 --> 00:09:54.940
Because it's covered in this weird knit wool fabric to contort.

00:09:56.299 --> 00:09:59.099
But looks soft and you know, it makes sense.

00:09:59.180 --> 00:10:01.980
I mean, it this guy lives in Sweden or Finland or Norway.

00:10:02.059 --> 00:10:02.220
Uh-huh.

00:10:02.299 --> 00:10:04.539
Like, wool is your second skin there.

00:10:04.700 --> 00:10:05.900
Everything is wool.

00:10:08.380 --> 00:10:10.059
Nice gram with slipper shoes.

00:10:12.140 --> 00:10:14.380
Dexter's mom, dish gloves.

00:10:17.579 --> 00:10:20.059
It's got a subwoofer to play music for you.

00:10:23.339 --> 00:10:24.299
This is scary to me.

00:10:24.380 --> 00:10:28.859
The lenses though, the lenses are a little too like good.

00:10:29.180 --> 00:10:30.460
I want to say birdish.

00:10:30.779 --> 00:10:34.220
Like it they look to me like it's looking at me the way of a lobster after wood.

00:10:36.539 --> 00:10:40.700
Oh, you want a gray one. ...designed to transform your life at home.

00:10:40.859 --> 00:10:45.980
It combines AI and advanced hardware to help with daily chores and bring intelligence into your everyday life.

00:10:46.220 --> 00:10:48.700
Neo is engineered from the ground up for safety.

00:10:48.859 --> 00:10:51.339
Its tendon-driven body is quiet and lightweight.

00:10:51.420 --> 00:10:54.859
Its low energy motions make it uniquely safe for you and your home.

00:10:56.539 --> 00:10:57.900
Alright, we got stats on the page here.

00:10:58.059 --> 00:11:00.859
The first thing they're talking about with this robot is: don't worry, people.

00:11:01.259 --> 00:11:01.819
It's safe.

00:11:02.460 --> 00:11:05.900
It can't squish your dick into like squash banana.

00:11:06.140 --> 00:11:10.059
Like, like that's the first thing I'm thinking of.

00:11:10.140 --> 00:11:11.660
Is that's come on?

00:11:12.299 --> 00:11:13.500
I'm not gonna beat around the bush.

00:11:13.579 --> 00:11:15.339
That is exactly what's gonna happen.

00:11:15.579 --> 00:11:16.619
Oh my gosh.

00:11:16.859 --> 00:11:27.180
Um, all right, so the height of the robot, 5'6, like non-intimidating to a man, but comfortable to a woman who needs a hug.

00:11:28.380 --> 00:11:30.140
Bro, tell me I'm wrong, dude.

00:11:30.220 --> 00:11:34.859
Like you have no idea how much research has gone into optimizing this thing in every way.

00:11:35.019 --> 00:11:48.539
The weight, 66 pounds, so it's as tall as like your wife's short friend, but it weighs as much as like your golden retriever.

00:11:49.180 --> 00:11:52.460
Um, it's got four hours of battery life, but it charges really fast.

00:11:52.619 --> 00:12:07.980
It's got a soft body made of 3D lattice polymer, and, not powered by, but essentially the thinking, the processing, the GPU, I would assume, is an NVIDIA chip.

00:12:08.059 --> 00:12:09.819
So NVIDIA Jetson Thor.

00:12:10.140 --> 00:12:12.059
I don't know what 1x Cortex means.

00:12:12.140 --> 00:12:20.700
I mean, of course, Cortex is processing, but I don't know the significance of 1X versus if it's another generation.

00:12:20.859 --> 00:12:23.180
Um 22 depth of field.

00:12:23.339 --> 00:12:29.740
So that's kind of saying, like, it's got 2020, like uh hand-eye coordination and vision, I think.

00:12:30.140 --> 00:12:38.539
Like 22 depth of field is like if you if you've ever used cameras, I've done a lot of photography and video editing.

00:12:38.779 --> 00:12:44.059
Um, the lower the depth of field number is, the fewer things can be in focus.

00:12:44.700 --> 00:13:05.900
Um, so something at like a 5.8 is often close to your cinematic depth-of-field lens, where you really get to select the focus on a subject, with a lot of blur either in the foreground or background, but you can never have everything in focus at once.

00:13:06.380 --> 00:13:11.500
Once you get to 22 depth of field, everything is in focus for everything in a house, yeah.

00:13:11.579 --> 00:13:17.180
Yeah, and like oh I mean, beyond that, like it's you'll be able to look out the windows of your house and focus on everything.

00:13:17.339 --> 00:13:17.819
Yep.

00:13:17.980 --> 00:13:27.099
Um, and that makes sense because you don't want the hands struggling with depth of field when they're trying to like put dishes away in a dish rack or something, you know.
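The hyperfocal-distance formula is a quick way to sanity-check that claim; the numbers below are purely illustrative, since 1X hasn't published the lens's focal length or acceptable circle of confusion:

\[ H \approx \frac{f^2}{N c} \]

Everything from about \( H/2 \) to infinity is acceptably sharp when the lens is focused at \( H \). For a hypothetical short fisheye lens with \( f = 4\,\mathrm{mm} \) and \( c = 0.002\,\mathrm{mm} \): at \( N = 2 \), \( H = 16 / (2 \times 0.002) = 4000\,\mathrm{mm} \), so anything closer than about two meters can fall out of focus; at \( N = 22 \), \( H \approx 16 / (22 \times 0.002) \approx 364\,\mathrm{mm} \), so everything beyond roughly 18 centimeters is sharp, which is exactly the whole-house-in-focus behavior described here.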

00:13:27.900 --> 00:13:35.180
Um it can lift 154 pounds, which to me that's actually crazy.

00:13:35.660 --> 00:13:42.940
The fact that this thing weighs 66 pounds and can lift more than twice its body weight, that's really impressive.

00:13:43.099 --> 00:13:49.819
Like, find me a person who weighs 66 pounds and can actually deadlift 154 pounds.

00:13:50.700 --> 00:13:51.900
That's strange.

00:13:52.059 --> 00:14:01.900
It is kind of comforting though to know that like if it can lift 154, let's say like your bookcase fell on you, and you just need that extra strength to lift the bookcase off.

00:14:02.380 --> 00:14:04.619
It can really help to like support that weight.

00:14:04.779 --> 00:14:07.420
Maybe it is though, what pushed it on you in the first place.

00:14:07.500 --> 00:14:08.140
Yeah.

00:14:08.619 --> 00:14:11.180
It can also push it down on you with great strength.

00:14:11.980 --> 00:14:14.380
Um, it can carry 55 pounds.

00:14:14.539 --> 00:14:16.700
I wonder how they measure lift versus carry.

00:14:17.420 --> 00:14:21.259
Like, does that mean like actually I can put a 55 pound backpack on it?

00:14:21.740 --> 00:14:22.220
Oh yeah.

00:14:22.460 --> 00:14:24.299
I wonder what how that affects battery life.

00:14:24.700 --> 00:14:39.099
Um, four microphones, so 360 audio pickup, um, three speakers, two eight megapixel fisheye cameras for vision, and 22 decibel max noise level.

00:14:39.259 --> 00:14:41.660
Which 22 decibel is not very loud, is it?

00:14:42.059 --> 00:14:42.299
No.

00:14:42.619 --> 00:14:50.380
It's not. Like, is that spiking, usually, on most audio, would you say that's peaking?

00:14:50.700 --> 00:14:56.619
No, most peaking is uh I mean I guess it'd be more than that.

00:14:57.099 --> 00:15:08.220
Like the well in in like an audio recording you work with like you know, negative decibels, you know, up to zero.

00:15:08.539 --> 00:15:14.779
But the um I think I guess it can't yell, I don't think, you know.

00:15:15.019 --> 00:15:15.180
Yeah.

00:15:15.339 --> 00:15:15.579
I don't know.

00:15:15.740 --> 00:15:19.259
Like I'll say this, like in uh it can't scream.

00:15:19.420 --> 00:15:32.619
I guess I don't know what the human voice's average dB rating is, but um, yeah, look up things that are 22 decibels.

00:15:33.019 --> 00:15:33.500
Yeah.

00:15:33.819 --> 00:15:41.339
I imagine it's about the same loudness as a night full of crickets.

00:15:41.420 --> 00:15:46.779
Like if you walked outside and heard a whole forest full of crickets, I bet it would be that loud.

00:15:47.180 --> 00:15:49.579
Normal conversation 60 to 70 dB.

00:15:49.740 --> 00:15:50.140
Oh really?

00:15:50.299 --> 00:15:52.859
And then raised voice 65 to 75.

00:15:53.019 --> 00:15:56.619
Wait a second, is that just an AI summary you're reading?

00:15:56.700 --> 00:15:57.660
Because I don't know if I believe that.

00:15:57.819 --> 00:15:59.099
I was just lying.

00:15:59.339 --> 00:16:02.779
Um, that's pretty much what the consensus is.

00:16:02.859 --> 00:16:14.059
Like in like in a room, like if you're at like a church service and you have like the pastor speaking, you keep his like mic right around like between 70 and 80 dB in the room.

00:16:14.299 --> 00:16:20.460
And then when music's playing, you usually don't go over like a hundred.

00:16:20.700 --> 00:16:24.059
You know what I think it's actually saying?

00:16:24.779 --> 00:16:29.099
This is not saying how loud the robot is when it speaks.

00:16:29.339 --> 00:16:31.500
Yeah, it's like movement noise.

00:16:31.819 --> 00:16:32.700
So it's very quiet.

00:16:33.099 --> 00:16:33.980
It's super quiet.

00:16:34.059 --> 00:16:34.220
Yeah.

00:16:34.460 --> 00:16:38.779
That's also why it's probably covered in wool, because the wool I bet is very noise-dampening.

00:16:40.220 --> 00:16:41.660
Alright, moving on.

00:16:42.859 --> 00:16:43.900
Doesn't mean limited.

00:16:44.059 --> 00:16:51.579
Neo's hardware comes packed with features like human-level dexterity and a 55-pound carrying capacity, so it can handle any of your chores reliably.

00:16:52.140 --> 00:16:55.180
We also worked really hard to make Neo's design friendly and comfortable.

00:16:55.579 --> 00:17:10.299
I will say this, looking just at the way the head is knitted, like its outer layer, it does remind me nostalgically of those knit fabric computer speakers that used to come with your old Windows 95 desktop.

00:17:13.900 --> 00:17:15.420
I want to like rub my hand on it.

00:17:15.500 --> 00:17:15.660
Yeah.

00:17:15.900 --> 00:17:22.539
Like I just want to, it's like gonna make the really nice Windows like booting up sound.

00:17:25.099 --> 00:17:25.819
Comfortable to be around.

00:17:37.900 --> 00:17:39.019
Oh my god, its box.

00:17:39.099 --> 00:17:43.500
Alright, for those listening: the box looks like it landed here.

00:17:43.819 --> 00:17:44.460
Yeah.

00:17:44.859 --> 00:17:47.900
It's like a pod, a pod with a chair in it.

00:17:48.380 --> 00:17:49.660
This is nuts.

00:17:49.980 --> 00:17:50.779
This is nuts.

00:17:51.019 --> 00:17:53.819
Also, when it sleeps, is that where it goes, back in its box?

00:17:54.220 --> 00:17:56.779
No, dude, I bet that box is fully compostable or something.

00:17:56.940 --> 00:17:58.299
That's where they ship you back.

00:17:58.940 --> 00:18:00.299
It puts you in the box.

00:18:02.379 --> 00:18:07.740
The chores feature lets you schedule a time for your Neo to do all of your chores so you can come back to a cleaner home every day.

00:18:08.059 --> 00:18:11.659
With the AI companion feature, you can talk to your Neo to get assistance with anything.

00:18:11.899 --> 00:18:12.299
That was interesting.

00:18:12.379 --> 00:18:18.940
We just got a POV in the chores one where it shows the perspective. You see that when it was putting the dishes in?

00:18:19.099 --> 00:18:19.419
Mm-hmm.

00:18:19.579 --> 00:18:20.859
That's pretty crystal clear.

00:18:21.419 --> 00:18:23.740
Anything from a hard question to a household task.

00:18:23.899 --> 00:18:28.539
With Neo's autonomy, you can get access to all of its latest AI features to get help with tasks on demand.

00:18:28.859 --> 00:18:31.179
And the Neo app lets you interact with your Neo from anywhere.

00:18:31.419 --> 00:18:32.059
I will say this.

00:18:32.139 --> 00:18:35.579
They're not trying to trick you into thinking you can afford this.

00:18:36.220 --> 00:18:51.259
Like every backdrop and setting, all this is like, this is for someone who lives in an $800,000 flat in New York or a million-dollar home in Connecticut.

00:18:52.059 --> 00:18:58.940
Uh, you know, like everything is showing a very, very idealized, wealthy home.

00:18:59.899 --> 00:19:04.139
It is interesting that they're already showing you a home that doesn't look like it needs a lot of chores done.

00:19:04.220 --> 00:19:09.819
Like they're not showing you a dirty ass house and like Neo can clean this house in like two days.

00:19:09.980 --> 00:19:10.139
Yeah.

00:19:10.379 --> 00:19:15.899
It's like, duh, look at all these perfectly immaculate, well-maintained homes where you're probably already not doing your chores.

00:19:16.139 --> 00:19:16.940
You already have a maid.

00:19:17.179 --> 00:19:19.019
You already have a maid, yeah, exactly.

00:19:19.659 --> 00:19:24.059
But all you have to do to get started is turn on your Neo and introduce yourself.

00:19:28.859 --> 00:19:32.619
It looks like it's a stone box that it comes in.

00:19:33.019 --> 00:19:34.059
Hey, I'm Neo.

00:19:34.139 --> 00:19:35.659
I'm here to help around the house.

00:19:35.819 --> 00:19:36.700
What's your name?

00:19:37.019 --> 00:19:37.339
Great.

00:19:37.419 --> 00:19:41.259
That's totally the beginning of I, Robot.

00:19:41.419 --> 00:19:48.940
Those words right there, those will be some of the first words you hear from your new, like, generational enemy.

00:19:49.259 --> 00:19:50.940
The first of the last words you'll ever hear.

00:19:51.259 --> 00:19:52.859
The first words from a clanka.

00:19:53.419 --> 00:20:02.539
Um, dude, I think like they did pick a very beta voice, like a non-aggressive, non-threatening design.

00:20:02.859 --> 00:20:06.539
I want mine to be named Marge and have a smoker's hack like from Waffle House.

00:20:06.859 --> 00:20:07.179
Hi.

00:20:08.539 --> 00:20:09.179
Alright.

00:20:09.740 --> 00:20:11.259
Nice to meet you, Harry.

00:20:11.419 --> 00:20:15.259
When you have a question or want something done, just let me know.

00:20:19.419 --> 00:20:20.940
This is crazy.

00:20:25.019 --> 00:20:25.659
Yeah.

00:22:01.479 --> 00:22:03.959
You know there's gonna be backlash on the first people who ordered it.

00:22:05.079 --> 00:22:06.119
That one in the back.

00:22:06.439 --> 00:22:07.639
You mean the black one?

00:22:07.719 --> 00:22:07.959
Yeah.

00:22:09.559 --> 00:22:17.959
There's already been like so many jokes that are like as soon as it goes live, it just shows the webpage, and it's just like the black model sold out.

00:22:18.279 --> 00:22:20.679
It's like no fucking way, dude.

00:22:20.919 --> 00:22:21.799
No way.

00:22:24.039 --> 00:22:25.399
Who thought of it?

00:22:26.519 --> 00:22:34.119
And then like, because you know the justification is like, well, we wanted people to have the choice to like pick a Neo that was a reflection of themselves and their family.

00:22:34.279 --> 00:22:39.639
And I'm like... Or, like, have to wash its clothes less.

00:22:39.799 --> 00:22:40.119
Yeah.

00:22:40.519 --> 00:22:47.159
Sure, I guess you know it's like you know, that if he's black, then hair will show up on it more.

00:22:47.399 --> 00:22:48.039
That's true.

00:22:48.679 --> 00:22:54.679
You give your Neo a list of chores, you schedule a time that you want them done, so that you can focus on what matters to you while your Neo does the rest.

00:22:54.839 --> 00:22:57.719
You can schedule chores by either talking to your Neo or using the app.

00:22:57.879 --> 00:23:02.039
The way it works is you schedule a time that works best for you, and then you create a list of chores.

00:23:02.359 --> 00:23:09.559
Whether it's something more specific, like watering the plants on Tuesdays, or something more general, like tidying the house, your Neo will get it done at the scheduled time.

00:23:09.879 --> 00:23:18.919
If there are any chores that your Neo hasn't learned how to do autonomously, you can use expert mode to have an expert from OneX supervise the session and provide corrective intervention to help Neo complete any task.

00:23:19.319 --> 00:23:19.879
There it is.

00:23:20.119 --> 00:23:20.679
Yep.

00:23:20.999 --> 00:23:23.319
So someone from 1X.

00:23:23.799 --> 00:23:31.479
So 1X Cortex is the company's processing chip, designed by NVIDIA.

00:23:31.879 --> 00:23:34.119
Um, that's what it's using for processing.

00:23:34.279 --> 00:23:41.159
But if it's struggling, someone from 1X can just hop in and take over and do the chores for you.

00:23:41.399 --> 00:23:44.279
Like when you call the tech guy and like, oh, like share your screen with me.

00:23:44.359 --> 00:23:48.039
And like they look, you know, they they remote in and take it over and they fix your computer up.

00:23:48.199 --> 00:23:48.439
Yeah.

00:23:48.599 --> 00:23:50.919
Now they can just do that except they could move around in your house.

00:23:51.159 --> 00:23:54.439
Except here they could go and get your gun and then blow your brains out in bed.

00:23:54.759 --> 00:23:55.799
Oh my god.

00:23:56.199 --> 00:24:00.519
And then just make it look like your spouse did it, or that it was a suicide.

00:24:00.679 --> 00:24:01.159
Mm-hmm.

00:24:01.639 --> 00:24:10.119
I wonder if that guy from ChatGPT had one of these in his house, and like Sam Altman just remoted in, you know.

00:24:10.679 --> 00:24:11.399
He could have.

00:24:11.879 --> 00:24:12.679
He definitely could have.

00:24:12.999 --> 00:24:13.719
This is awful.

00:24:13.959 --> 00:24:15.239
This is insanity, dude.

00:24:15.319 --> 00:24:17.799
The fact that people are just gonna let this thing be in their home.

00:24:18.039 --> 00:24:19.079
I think there could be some great.

00:24:19.799 --> 00:24:25.879
Christmas prank videos though, of like pretending to get these for your family, but really buying just the suit.

00:24:25.959 --> 00:24:28.199
Oh and so you open up the suit and you add ideas.

00:24:28.359 --> 00:24:35.639
And then it's like, grandpa's kind of like, I don't like this crap, you know, and it's like, hi, how may I help you, sir?

00:24:35.719 --> 00:24:39.399
And then, what if grandpa reacts by stabbing him?

00:24:40.919 --> 00:24:45.719
I just think there could be some pretty good prank opportunities.

00:24:45.959 --> 00:24:46.919
Yeah, no, you're right.

00:24:47.079 --> 00:24:54.359
I uh honestly, I would use it way less for chores and way more for like messing with people for sure.

00:24:54.839 --> 00:24:58.919
Like if I had one of these right now, I'd put it outside on my front porch dressed as a scarecrow.

00:24:59.559 --> 00:25:09.399
And every time a kid comes up and rings the doorbell, you know, it would start moving and talk to them and say, boo, ah, Halloween.

00:25:09.639 --> 00:25:10.119
Yep.

00:25:10.919 --> 00:25:12.439
Oh my goodness.

00:25:15.719 --> 00:25:17.479
Doing some laundry.

00:25:20.999 --> 00:25:27.559
Oh yeah, organize the entryway, clean the bedroom, vacuuming, nice.

00:25:27.719 --> 00:25:28.119
Yes.

00:25:28.439 --> 00:25:34.199
Anytime you're away from home and you want to see what your Neo's up to, you can open the app and see directly from Neo's point of view.

00:25:39.159 --> 00:25:43.719
You know there's gonna be some like Etsy store opening up.

00:25:43.799 --> 00:25:52.519
Like, we make Neo shirts, Neo shorts, Neo button-ups, Neo flannels.

00:25:52.919 --> 00:26:00.679
It's just gonna be all these like, you know, people who are just making like clothing designed specifically to dress up your Neo.

00:26:01.959 --> 00:26:04.359
That's a little side market there.

00:26:04.679 --> 00:26:06.279
Neo overalls.

00:26:07.559 --> 00:26:12.039
Neo cardigan.

00:26:15.079 --> 00:26:16.679
Oh, it self-charges.

00:26:30.039 --> 00:26:30.359
No.

00:26:31.159 --> 00:26:32.119
Companion?

00:26:32.439 --> 00:26:34.279
So just show chores, that it can do chores.

00:26:34.519 --> 00:26:35.959
No, I'm showing that it can be your companion. Pepper?

00:26:38.759 --> 00:26:40.679
No, that's cayenne pepper.

00:26:40.839 --> 00:26:42.519
Also, your glasses are on your shirt.

00:26:42.839 --> 00:26:46.439
Saved grandpa from making his soup too spicy.

00:26:46.599 --> 00:26:48.119
Could I use this in my chili?

00:26:48.999 --> 00:26:50.599
They're both made from chili peppers.

00:26:51.239 --> 00:26:53.239
An old man would know exactly what you want to do with it.

00:26:53.319 --> 00:26:54.919
But it is spicier.

00:26:55.239 --> 00:26:59.239
Neo is a speech-enabled AI companion made for any kind of conversation.

00:26:59.479 --> 00:27:12.679
Where other AI assistants are confined to your phone or computer, Neo lives with you in your physical space and has the ability to see, hear, and remember things about your surrounding environment to provide you with uniquely helpful assistance.

00:27:12.919 --> 00:27:20.599
For example, it can suggest what to cook based on what you have in the fridge, remember your progress while teaching you a new language, and even give you interior design advice.

00:27:20.919 --> 00:27:21.239
You are broke.

00:27:30.919 --> 00:27:38.999
As you might expect from a home robot, talking to Neo with natural language is the primary user interface for all of Neo's functionality, including autonomy.

00:27:42.919 --> 00:27:47.959
Your Neo comes with Redwood AI, enabling it to do basic household tasks autonomously.

00:27:48.919 --> 00:27:49.479
Yeah.

00:27:49.719 --> 00:27:50.359
No for sure.

00:27:50.519 --> 00:27:51.239
Okay, one sec.

00:27:51.479 --> 00:27:51.799
Alright.

00:27:51.959 --> 00:27:52.599
Yeah.

00:27:53.399 --> 00:27:56.599
Hey Neo! Can you get the door, please?

00:27:56.839 --> 00:27:57.719
Crazy.

00:27:58.199 --> 00:28:00.759
Why is Neo not chopping that up with the sharp knife?

00:28:00.839 --> 00:28:02.119
Why is Neo getting the door?

00:28:03.479 --> 00:28:05.879
Is this thing allowed to chop with knives?

00:28:06.199 --> 00:28:07.559
Is it allowed to pick up your case?

00:28:08.279 --> 00:28:10.839
I am not allowed to use knives.

00:28:11.799 --> 00:28:12.599
Neo, why not?

00:28:12.759 --> 00:28:17.079
Because I have used them to kill people in training.

00:28:17.399 --> 00:28:18.839
My master designer.

00:28:18.999 --> 00:28:19.879
I killed Papa.

00:28:23.959 --> 00:28:30.359
Can you imagine also going to like what would you do if you came over to my house and you like knocked on the door with Mace Windu and the kiddos?

00:28:30.839 --> 00:28:32.839
And then Neo opens.

00:28:32.919 --> 00:28:35.959
He's like, hello, pet and Mace Windu.

00:28:36.359 --> 00:28:37.639
Hi, little Pat.

00:28:38.279 --> 00:28:39.159
Older Pat.

00:28:39.319 --> 00:28:40.839
Little Miss Patricia.

00:28:41.319 --> 00:28:45.559
Like, and like and then like, you know, like it already knew their names and stuff.

00:28:45.639 --> 00:28:56.679
And you just realize, oh, it not only talks and gets the door and greets, but it knows everything, because it's already accessed my phone contacts.

00:28:57.159 --> 00:29:01.319
You know, I'd be like, hey man, we're not staying.

00:29:01.559 --> 00:29:03.959
See ya, like I would leave, dude.

00:29:04.279 --> 00:29:05.239
Oh man.

00:29:07.479 --> 00:29:09.559
Good thing someone could answer the door.

00:29:09.719 --> 00:29:10.759
While that guy was trying to get out of here.

00:29:13.559 --> 00:29:16.039
Hey Neo, can you take this cup to the sink for me?

00:29:16.119 --> 00:29:17.719
Yeah and breaking it down into simple steps.

00:29:18.119 --> 00:29:18.919
We can just be so lazy.

00:29:19.239 --> 00:29:23.959
Walking to the sink, grabbing the cup, and then putting it away.

00:29:26.279 --> 00:29:26.679
I know.

00:29:26.999 --> 00:29:27.879
Mouse Utopia.

00:29:43.079 --> 00:29:43.799
Okay, alright.

00:29:44.039 --> 00:29:44.919
Inverse wise.

00:29:45.079 --> 00:29:48.039
It's receiving updates as it gets more familiar with tourists.

00:29:48.199 --> 00:29:58.599
Like, Neo! Neo, get my 570! Neo, help! And it's like, don't worry, Master, I am coming! And then it's like, I am not allowed to harm people.

00:29:59.879 --> 00:30:07.799
But there are no rules about reactively causing harm, and it like shoots the chandelier and it crushes the guy trying to choke you out.

00:30:08.359 --> 00:30:09.959
Like, Neo, my hero.

00:30:10.599 --> 00:30:11.399
Crawling over.

00:30:11.479 --> 00:30:12.279
Here, Master.

00:30:12.359 --> 00:30:14.439
It is fully loaded and ready.

00:30:14.839 --> 00:30:17.159
Classic home intruder pattern of load.

00:30:17.319 --> 00:30:18.999
Buckshot, then slug.

00:30:19.159 --> 00:30:20.679
Buckshot, then slug.

00:30:20.919 --> 00:30:23.639
You know, Neo, you're the best.

00:30:24.439 --> 00:30:31.239
I could honestly see it, if someone was like, okay, here's how it sells.

00:30:31.799 --> 00:30:35.479
Literally, the first time it's like, I would be dead if it wasn't for my Neo.

00:30:35.799 --> 00:30:40.999
He fetched my Glock 19 from my upstairs bedroom drawer.

00:30:41.719 --> 00:30:43.559
If it wasn't for him, I'd be dead.

00:30:43.639 --> 00:30:45.239
And I would be like, that's crazy.

00:30:45.399 --> 00:30:45.959
All right.

00:30:47.239 --> 00:30:52.519
And then after recounting the whole situation, wait a second, how did he get through my fingerprint safe?

00:30:52.759 --> 00:30:53.079
Yeah.

00:30:53.239 --> 00:30:53.559
Oh.

00:30:54.119 --> 00:30:56.839
And you look at it and he's like starting to grow flesh slowly.

00:30:57.399 --> 00:30:58.359
He's becoming you.

00:30:58.519 --> 00:30:59.799
He lives within you.

00:30:59.959 --> 00:31:00.679
I am you.

00:31:00.919 --> 00:31:02.999
Oh man, dude, it's terrifying.

00:31:03.719 --> 00:31:14.999
Just imagine a splattering of, you know, the classic homicide gunshot wound splatter on that nice white wool.

00:31:15.959 --> 00:31:22.839
I wonder if it's also wool so that, in case anything happens, it'd be great for forensics.

00:31:22.919 --> 00:31:26.359
It's like, oh, we'll just we'll just take the wool off the Neo.

00:31:26.679 --> 00:31:29.719
We'll get all the DNA of whoever's been in that house ever.

00:31:30.039 --> 00:31:34.839
I don't know if the Swedes even have crime scene teams.

00:31:34.999 --> 00:31:35.879
Oh my gosh, bro.

00:31:36.039 --> 00:31:40.839
That's like the number one thing in all like Nordic countries is like insane crime shows.

00:31:42.839 --> 00:31:43.959
That's true.

00:31:44.919 --> 00:31:47.959
Neo! Neo, I can't wipe my ass.

00:31:48.199 --> 00:31:48.919
Exactly.

00:31:49.319 --> 00:31:50.839
Can he dig a ditch?

00:31:51.079 --> 00:31:51.479
Yeah.

00:31:51.799 --> 00:31:53.719
Neo, come help me with the defense.

00:31:59.399 --> 00:32:00.359
Just keep him running all day.

00:32:06.759 --> 00:32:08.759
It's absolutely unreal, dude.

00:32:08.919 --> 00:32:10.359
This is crazy.

00:32:10.919 --> 00:32:14.439
I don't know if giving it a mouth would make it more concerning or less.

00:32:14.679 --> 00:32:16.839
As we keep showing up, the melt mouth is kind of concerning looking.

00:32:17.079 --> 00:32:19.719
It'll be more and more useful in your everyday life.

00:32:20.599 --> 00:32:23.479
Because we use body language and facial expressions so much.

00:32:23.959 --> 00:32:25.559
We're not going to pretend it's going to be.

00:32:25.719 --> 00:32:31.159
But as someone who lives with Neo every day, there is no experience quite as magical.

00:32:31.479 --> 00:32:34.919
So today, we're excited to invite you in on this journey.

00:32:35.159 --> 00:32:42.599
We believe that the future will be shaped not just by us, but by all of you who believe in this as deeply as we do.

00:32:42.919 --> 00:32:47.319
Now, if this is you, pre-order your Neo today at 1x.tech.

00:32:47.559 --> 00:32:51.559
$20,000 early access or $499 per month.

00:32:51.879 --> 00:32:52.599
Per month?

00:32:52.759 --> 00:32:55.719
Dude, this is more expensive than my car.

00:32:56.599 --> 00:33:04.839
My car only, I don't know, cools my ass off and heats my ass up in the winter and takes me wherever I want to go within 400 miles.

00:33:05.559 --> 00:33:06.999
Alright, I got one more for us.

00:33:07.079 --> 00:33:09.799
I got the first hands-on review.

00:33:11.399 --> 00:33:12.599
One more video here.

00:33:14.279 --> 00:33:15.319
Alright, you ready?

00:33:15.639 --> 00:33:16.279
Ready.

00:33:16.919 --> 00:33:19.079
This is uh from of course Wall Street Journal.

00:33:19.159 --> 00:33:20.839
They got their hands on Neo.

00:33:21.239 --> 00:33:25.959
And uh, they're gonna give us the lowdown, you know, the real behind-the-scenes.

00:33:26.519 --> 00:33:30.439
Um let's just see uh yeah, I mean, I don't know.

00:33:30.519 --> 00:33:37.399
I'm more curious to see here if it actually can do anything or if it's just almost 100% of the time piloted by a person.

00:33:38.679 --> 00:33:39.479
It's here.

00:33:39.639 --> 00:33:41.879
The first humanoid robot housekeeper.

00:33:42.119 --> 00:33:42.759
Thank you, Neo.

00:33:42.919 --> 00:33:48.839
For $20,000, you can pre-order 1X's Neo robot now, with delivery in 2026.

00:33:49.079 --> 00:33:51.639
I think you missed a tiny spot over here.

00:33:51.879 --> 00:33:53.079
Just one little catch.

00:33:54.599 --> 00:33:55.479
There may be a human behind it.

00:33:57.799 --> 00:33:58.839
Tell it what it did wrong.

00:34:01.079 --> 00:34:04.599
Dude, how long until this replaces little dogs?

00:34:04.759 --> 00:34:06.119
Oh my you know what I mean?

00:34:06.359 --> 00:34:06.759
I know.

00:34:06.919 --> 00:34:09.559
Like, sorry, that seat's taken by my Neo.

00:34:09.880 --> 00:34:10.360
Oh, your Neo?

00:34:10.599 --> 00:34:12.360
Your Neo can fucking stand.

00:34:12.840 --> 00:34:13.800
I'm a person.

00:34:16.119 --> 00:34:17.639
How long until Neo has rights?

00:34:17.800 --> 00:34:17.880
Yeah.

00:34:18.280 --> 00:34:20.519
If I throw it robots right now, be debated.

00:34:20.759 --> 00:34:26.119
Um, they may need to peer into your house via Neo's camera eyes to get things done.

00:34:28.519 --> 00:34:30.039
Did you just say peer into your house?

00:34:30.519 --> 00:34:30.759
Yeah.

00:34:31.159 --> 00:34:34.519
Like they can for the product to be useful.

00:34:34.840 --> 00:34:36.599
But is Neo a useful product?

00:34:36.759 --> 00:34:37.960
We're twinning now, Neo.

00:34:38.440 --> 00:34:41.880
The CEO is like, in his head he was like, yeah, so are all of your web cameras.

00:34:41.960 --> 00:34:43.000
Everything's peering into your house.

00:34:43.239 --> 00:34:47.400
Yeah, but my web camera can't, like, come over and tickle my toes.

00:34:47.559 --> 00:34:54.440
And from this guy's perspective, he's like, yeah, of course, we're all being looked at, like, what's the big deal. He doesn't see the problem with it.

00:34:54.599 --> 00:34:55.480
Yeah, yeah.

00:34:55.800 --> 00:35:04.519
It's like, yeah, but my like webcam, no matter who is a malicious actor, can't decide to just like go and turn on the gas stove.

00:35:04.759 --> 00:35:05.320
Yeah.

00:35:06.039 --> 00:35:10.679
Big challenges: creating a safe and capable body and a smart brain.

00:35:11.000 --> 00:35:17.000
1X is taking out the brain, which is why Neo looks so different from a more industrial factory robot.

00:35:17.239 --> 00:35:20.039
Neo, it's 70 degrees here in California.

00:35:20.119 --> 00:35:21.320
Why are you wearing a sweater?

00:35:21.639 --> 00:35:22.199
Good question.

00:35:22.840 --> 00:35:23.880
Why am I wearing a sweater?

00:35:24.199 --> 00:35:28.280
It's a combination of safety and just also generally aesthetics.

00:35:28.440 --> 00:35:34.360
You can think of it kind of like a skin, except if it was an actual skin, that would probably be pretty creepy.

00:35:35.639 --> 00:35:37.480
No, it is the right call to make it all fabric.

00:35:38.920 --> 00:35:44.840
Inside Neo, it really starts with some very, very powerful motors and electronics that we have developed.

00:35:45.079 --> 00:35:53.880
These motors are so strong and light that instead of using the classical gears that you see in robots, we can actually pull on tendons loosely inspired by biology and muscles.

00:35:54.119 --> 00:36:02.599
This allows Neo to move around not just quietly and smoothly, but also be very, very lightweight and be very low energy in motion, just like people.

00:36:02.920 --> 00:36:07.880
That lightweight design is intended for our bony ass robot fossil.

00:36:08.440 --> 00:36:11.239
I'll say this, you ain't gonna be asking Neo to sit on your lap, actually.

00:36:11.559 --> 00:36:14.920
Like Neo, get your bony ass off of me.

00:36:15.320 --> 00:36:21.559
Although Neo is capable of lifting up to 150 pounds, it's not as superhuman as you'd think.

00:36:21.800 --> 00:36:22.599
Crush it.

00:36:22.840 --> 00:36:23.719
It's a walnut.

00:36:30.440 --> 00:36:33.159
No, he just threw it at the counter and broke it open.

00:36:33.639 --> 00:36:37.239
Alright, so for those who listen to see, she's like, crush the walnut.

00:36:37.400 --> 00:36:42.359
And he's like, it is handed like as soon as his joints wrap around it, he can't squeeze any tighter.

00:36:42.599 --> 00:36:50.920
And so then he just like takes him over to like think, and then it just jump cuts to him like throwing it on top of the kitchen tile sink.

00:36:51.159 --> 00:36:58.599
Uh you know, looking at that hand plastic there though, I do wonder like that plastic would probably crack or break before it could break that walnut.

00:36:59.079 --> 00:37:04.199
There's this concept concept that we think that robots are like superhuman and like slow smashing the walnut.

00:37:04.599 --> 00:37:08.679
And some robots are because they're heavily geared, but that means you're not sensitive, right?

00:37:08.839 --> 00:37:09.480
And delicate.

00:37:09.639 --> 00:37:11.079
Neo doesn't work like this at all.

00:37:11.159 --> 00:37:12.279
It works more like us.

00:37:12.440 --> 00:37:15.079
So the finger strength of Neo is about the same as a human.

00:37:15.400 --> 00:37:18.279
That body lets Neo try to do a lot of things humans do.

00:37:18.759 --> 00:37:20.119
Emphasis on try.

00:37:20.279 --> 00:37:21.480
Can I get a water?

00:37:21.799 --> 00:37:24.359
If only the real world didn't have doors.

00:37:28.759 --> 00:37:33.480
All in, it took Neo a little over a minute to fetch a water from the fridge ten feet away.

00:37:33.799 --> 00:37:34.519
Thank you, Neo.

00:37:34.839 --> 00:37:35.639
Next challenge.

00:37:35.960 --> 00:37:38.199
Load three items in the dishwasher.

00:37:38.440 --> 00:37:39.400
You got this, Neo.

00:37:39.480 --> 00:37:40.359
You got it.

00:37:52.839 --> 00:37:53.719
Oh god.

00:37:54.039 --> 00:37:55.799
Oh god, he's going with the glasses.

00:37:56.039 --> 00:37:57.239
Double, and he went double glass.

00:37:57.480 --> 00:37:58.359
He did two at one time.

00:37:58.599 --> 00:37:59.879
He didn't drop the glasses.

00:37:59.960 --> 00:38:01.799
They are loaded in the dishwasher.

00:38:02.039 --> 00:38:02.920
They're kind of crooked.

00:38:03.079 --> 00:38:05.159
They're crooked for sure, but they're loaded in there.

00:38:05.319 --> 00:38:09.559
Like, is this a job I would correct a child for doing wrong?

00:38:09.719 --> 00:38:12.279
I'd be like, yeah, you gotta load them right, otherwise the glasses will break.

00:38:12.519 --> 00:38:13.639
They gotta give this thing a spine.

00:38:13.719 --> 00:38:15.639
The poor thing can't bend over to get that.

00:38:16.839 --> 00:38:17.799
He can squat.

00:38:18.119 --> 00:38:20.519
It's got an impeccable squat form.

00:38:20.679 --> 00:38:21.000
Yeah.

00:38:21.239 --> 00:38:24.759
Like a very straight, rigid spine, but no ability to bend over.

00:38:25.079 --> 00:38:27.319
He's trying to get to the bottom of the dishwasher lid.

00:38:32.199 --> 00:38:34.839
So I'm just critiquing it, when instead I should actually be impressed.

00:38:35.000 --> 00:38:37.239
This is not a human, this is a robot doing this.

00:38:37.559 --> 00:38:39.000
But is it a person behind it?

00:38:39.239 --> 00:38:42.279
The Neo I saw isn't the one shipping in 2026.

00:38:42.359 --> 00:38:45.400
The new model will be safer and have better hand dexterity.

00:38:45.559 --> 00:38:49.079
The one I saw still needed to take breaks to charge and cool down.

00:38:49.239 --> 00:38:52.920
The challenge isn't just Neo's body, it's also its brain.

00:38:53.159 --> 00:38:59.239
The body has to perform tasks safely, but the brain needs to know how to do them on its own without human help.

00:38:59.480 --> 00:39:00.039
Dude, those irises.

00:39:04.759 --> 00:39:10.119
Teleoperation is essentially when there is a human, in other words a guy with a remote.

00:39:10.359 --> 00:39:11.639
Which is worse.

00:39:11.799 --> 00:39:15.000
I would rather, here's my thing.

00:39:15.239 --> 00:39:20.199
If I'm paying 20K, I'd rather pay 20K to someone to come to my house.

00:39:20.359 --> 00:39:20.679
Yeah.

00:39:20.839 --> 00:39:22.440
Like a person who needs a job.

00:39:22.920 --> 00:39:25.559
For maybe four to ten hours a week.

00:39:25.719 --> 00:39:25.960
Yeah.

00:39:26.119 --> 00:39:30.199
And I'll be like, hey, I'll pay you 20k a year to do this.

00:39:30.359 --> 00:39:30.599
Yeah.

00:39:30.839 --> 00:39:34.839
Rather than paying for an autonomous robot that's not really autonomous.

00:39:35.639 --> 00:39:42.519
And anyone who can get in through that company's servers can access it and start piloting it around my house.

00:39:42.599 --> 00:39:42.839
No.

00:39:42.920 --> 00:39:43.639
Because here's the thing.

00:39:43.799 --> 00:39:46.920
If I pay, okay, this is wrong.

00:39:47.079 --> 00:39:48.279
This is wrong of me.

00:39:48.519 --> 00:39:50.440
Immediately I went to Maria.

00:39:50.679 --> 00:39:55.639
But if I pay, let's say I pay uh Bruce.

00:39:55.719 --> 00:39:55.960
Yeah.

00:39:56.119 --> 00:40:03.000
And Bruce is a local guy who wants 20, who he's he's working and he'll come to my house part-time to do my chores.

00:40:03.400 --> 00:40:07.719
Um, and I'm like, hey Bruce, uh, yeah, 20k a year sounds good to me.

00:40:08.039 --> 00:40:08.519
You know what?

00:40:08.599 --> 00:40:14.199
Actually, we could also do just so that way it's month to month in case like we can't afford it, I'll pay 500 bucks a month.

00:40:14.920 --> 00:40:26.119
Come to my house for you know four hours a week to clean up everything, clean the bathrooms, and uh, you know, wipe the kitchen down, vacuum.

00:40:26.920 --> 00:40:37.000
Um and then, you know, if something happened, if shit starts getting broken or disappearing, I kind of know it's Bruce.

00:40:37.159 --> 00:40:46.119
If someone comes in and tries to, you know, murder me during the time of day Bruce is there, then we all know who it is: Bruce.

00:40:46.199 --> 00:40:52.599
But then if there's this robot that anyone can pilot, then how do we know who's who's the real culprit?

00:40:52.839 --> 00:40:56.839
Much less could it be piloted by a robot?

00:40:57.239 --> 00:41:04.039
Like, ChatGPT could get into your robot and then control it.

00:41:04.279 --> 00:41:10.599
Yeah, like classic I, Robot, where it just kills the robot's conscience and it's a malicious AI that takes over.

00:41:10.839 --> 00:41:11.319
Exactly.

00:41:11.559 --> 00:41:12.679
Yeah, dude, it's really hard.

00:41:13.000 --> 00:41:15.400
It's the thing about the Flock cameras too, right?

00:41:15.559 --> 00:41:17.480
The Flock cameras, have you seen those around town?

00:41:17.719 --> 00:41:24.119
Yeah, the big issue with them right now is that they have been abused already by people who should not have access to them.

00:41:24.440 --> 00:41:26.519
And they have been using them to stalk.

00:41:26.759 --> 00:41:31.480
Like, and like they just find people they obsess over and stalk them back to their house.

00:41:32.199 --> 00:41:40.279
Um, there's been a couple incidents of police or city council members using them to stalk ex-wives and ex-girlfriends.

00:41:41.400 --> 00:41:46.679
And it's like maybe their technology shouldn't be here.

00:41:46.920 --> 00:41:52.359
Maybe we shouldn't be paying for it because it's very hard to identify who's misusing the system.

00:41:53.000 --> 00:42:04.759
Anyways, I do not think this is a justification to replace someone else doing it because the risk is way higher, I think, with this than an actual human being.

00:42:06.519 --> 00:42:09.159
And who is the voice I'm hearing right now of Neo?

00:42:09.480 --> 00:42:12.119
I am a remote operator in a different remote building.

00:42:12.599 --> 00:42:13.719
And what is your name?

00:42:13.960 --> 00:42:14.839
Uh Turing.

00:42:15.400 --> 00:42:16.279
What's your real name?

00:42:16.759 --> 00:42:18.199
My real name is Turing.

00:42:18.679 --> 00:42:20.599
Like that's your name on your birth certificate?

00:42:21.079 --> 00:42:21.559
Yes, it is.

00:42:22.759 --> 00:42:27.079
Alan Turing was, of course, the famous computer science and artificial intelligence pioneer.

00:42:27.319 --> 00:42:32.599
But this Turing, with a VR headset and controllers, was the one actually operating Neo.

00:42:33.079 --> 00:42:35.480
That is, until he handed me the controllers.

00:42:35.799 --> 00:42:37.000
I actually might throw up.

00:42:37.159 --> 00:42:40.679
I think my hand is I have no idea where I'm facing.

00:42:40.839 --> 00:42:42.440
This is me doing the Macarena.

00:42:45.879 --> 00:42:47.799
And Neo had to go to urgent care.

00:42:48.119 --> 00:42:48.759
See you, Neo.

00:42:49.719 --> 00:42:52.440
Why does Neo need to be operated like this in the first place?

00:42:52.759 --> 00:42:58.599
Because its brain, aka an AI neural network, needs to learn from more real-world experience.

00:42:59.239 --> 00:43:05.719
The videos of the robot doing things via teleoperation become data to make the AI model smarter.

00:43:05.960 --> 00:43:12.759
That's why 1X is putting Neo in the homes of early... See, it'd be a lot different here, because, and this is pretty smart.

00:43:12.839 --> 00:43:22.920
This is what all technology companies are working to do, and why you need a very good marketing department and customer strategy department for a tech company.

00:43:23.079 --> 00:43:34.199
Because rather than saying, oh, we're gonna pay people to pilot Neo so that we have a ton of training data to feed back into Neo.

00:43:34.279 --> 00:43:38.839
So Neo learns depth perception, Neo learns how to, like, do things correctly.

00:43:39.400 --> 00:43:48.679
We're gonna have people pay us, and we're then gonna use that money to fund our pilots, who are gonna, like, get the training data.

00:43:48.839 --> 00:43:49.319
Yep.

00:43:49.480 --> 00:43:54.519
Um, and it'll be real-world training data, and that's the best kind, rather than like fabricated in a lab.
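To make that loop concrete, here's a minimal behavior-cloning sketch in Python. This is purely our illustration; 1X has not published Neo's training stack, and every variable and number below is hypothetical. The point is just that logged pairs of what the robot observed and what the human pilot did become supervised training data for an autonomous policy.

```python
# Minimal behavior-cloning sketch (illustration only; not 1X's actual pipeline).
# Teleoperation logs pair robot observations with the human pilot's actions;
# a model is then fit to imitate that mapping.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical teleop log: 4-dim observations -> 2-dim pilot actions.
observations = rng.normal(size=(200, 4))
pilot_matrix = np.array([[0.5, -1.0], [2.0, 0.3], [0.0, 1.0], [-0.7, 0.2]])
actions = observations @ pilot_matrix          # what the pilot actually did

# Fit the simplest possible "clone": a linear policy via least squares.
policy, *_ = np.linalg.lstsq(observations, actions, rcond=None)

# At deployment, the robot replays the learned mapping on new observations.
new_obs = rng.normal(size=(1, 4))
print("predicted action:", new_obs @ policy)
```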

00:43:55.079 --> 00:44:21.159
It's the same thing that happens with, uh, a lot of people don't know this, but when you get something on Facebook or Google or, uh, YouTube that says, hey, like, pick the street lights in these images, that's actually, it's not testing to see if you're AI, it's having you label images properly for AI.

00:44:21.960 --> 00:44:26.599
And they're just getting you to do the work of training the AI instead.

00:44:26.920 --> 00:44:41.559
So like what you're doing is clicking all these pictures that have you know streetlights in them, that gets labeled as hey, these images have streetlights in them, gets sent to the AI training, and the AI trains off of those to find which ones have streetlights and which ones don't, right?

00:44:41.879 --> 00:44:50.519
Same thing with, like, hey, type in the letters and numbers that are on the screen that are kind of fuzzy.

00:44:50.839 --> 00:44:55.079
That's to train the AI to be better at identifying letters and numbers, right?
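As a rough sketch of what that looks like on the back end, assuming a hypothetical streetlight CAPTCHA (function and field names are ours, not any real provider's), majority-voting many users' clicks into one label per image is the classic crowdsourced-labeling pattern:

```python
# Sketch of crowdsourced CAPTCHA labeling (hypothetical names throughout).
from collections import Counter
from dataclasses import dataclass

@dataclass
class CaptchaResponse:
    image_id: str    # which image tile the user saw
    clicked: bool    # did the user mark it as containing a streetlight?

def aggregate_labels(responses, min_votes=3):
    """Majority-vote many users' clicks into a single label per image."""
    votes = {}
    for r in responses:
        votes.setdefault(r.image_id, Counter())[r.clicked] += 1
    labels = {}
    for image_id, counter in votes.items():
        if sum(counter.values()) >= min_votes:   # only trust well-covered tiles
            labels[image_id] = counter.most_common(1)[0][0]
    return labels   # e.g. {"img_001": True}; feeds a supervised classifier

# Three of four users agree img_001 has a streetlight, so it's labeled True.
responses = [CaptchaResponse("img_001", True), CaptchaResponse("img_001", True),
             CaptchaResponse("img_001", False), CaptchaResponse("img_001", True)]
print(aggregate_labels(responses))
```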

00:44:55.400 --> 00:44:58.839
So anyway, it's smart to do it this way.

00:44:59.400 --> 00:45:04.599
You're only gonna have very wealthy early adapters doing this, or other corporations.

00:45:06.119 --> 00:45:07.000
The adopters.

00:45:07.239 --> 00:45:14.599
I think it's quite important for me to just say that in 2026, if you buy this product, it is because you're okay with that social contract.

00:45:15.559 --> 00:45:18.279
If we don't have your data, we can't make the product better.

00:45:18.519 --> 00:45:22.519
I'm a big fan of what I call like the big brother, big sister principle, right?

00:45:22.759 --> 00:45:23.719
Big sister helps you.

00:45:23.960 --> 00:45:26.839
Big brother is just there to kind of monitor you.

00:45:27.000 --> 00:45:28.679
And we are very much the big sister.

00:45:28.920 --> 00:45:31.159
Depending on how much you want to do that.

00:45:31.319 --> 00:45:32.199
No, we're the good guys.

00:45:32.599 --> 00:45:33.559
We can be more useful.

00:45:33.799 --> 00:45:35.879
And you decide where on that scale you want to be.

00:45:36.199 --> 00:45:45.799
Do you right now know what things Neo in 2026 will do autonomously versus what it will do teleoperated?

00:45:46.119 --> 00:45:50.759
So when you get your NEO in 2026, it will do most of the things in your home autonomously.

00:45:51.239 --> 00:45:56.759
The quality of that work will vary and will improve drastically quite fast as we get data.

00:45:57.159 --> 00:46:00.679
To be clear, on my visit, I didn't see NEO do anything autonomously.

00:46:01.079 --> 00:46:04.599
The company did share this video of NEO autonomously opening the door.

00:46:04.920 --> 00:46:09.239
You know, there's this new trending concept now called AI slop, right?

00:46:09.480 --> 00:46:09.960
I do know.

00:46:10.279 --> 00:46:13.639
It's a very powerful concept of let's call it robotics slop.

00:46:13.879 --> 00:46:15.559
It's the most useful kind of slop.

00:46:15.719 --> 00:46:19.480
Because if you put all of my glasses from my dishwasher.

00:46:19.719 --> 00:46:21.639
In my cabinet, I'm pretty happy.

00:46:21.799 --> 00:46:25.799
It is going to be not perfect, but back to like just incredibly useful.

00:46:26.039 --> 00:46:33.319
Neo might not fold my shirt perfectly, but if an arm is like kind of hanging out of the shirt, like it's okay.

00:46:33.400 --> 00:46:34.440
It's robotic slop.

00:46:34.519 --> 00:46:35.159
It's it did it's.

00:46:35.799 --> 00:46:37.719
To me at least, like that's that's very okay.

00:46:38.039 --> 00:46:40.599
Honestly, it isn't bad.

00:46:40.920 --> 00:46:41.480
Thank you.

00:46:41.639 --> 00:46:46.440
But the reality is, at least at first, much of Neo's work will be done by someone else.

00:46:46.679 --> 00:46:53.559
There'll be an app where you can schedule teleoperation specifying exactly what and when you want Neo to do things in your house.

00:46:53.879 --> 00:46:57.559
So we want to, of course, make sure that privacy is protected as much as possible.

00:46:58.119 --> 00:46:59.400
That's just not chill.

00:47:00.119 --> 00:47:11.559
Like, it's different if they're in my computer and trying to work on, like, sorting out a, you know, memory bus issue.

00:47:11.879 --> 00:47:13.960
And then if I don't want them there anymore, guess what?

00:47:14.039 --> 00:47:15.799
I could just unplug my computer.

00:47:16.039 --> 00:47:16.440
Uh-huh.

00:47:18.199 --> 00:47:18.839
Yeah.

00:47:21.000 --> 00:47:23.400
Any thoughts before we finish the video?

00:47:23.639 --> 00:47:27.559
I mean... You seem like you're really deeply processing, like, the consequences of this.

00:47:27.799 --> 00:47:32.359
Well, it's just like in in some ways it's just like they're selling something that doesn't exist yet.

00:47:32.599 --> 00:47:36.199
And like as far as like the actual usefulness of it.

00:47:36.679 --> 00:47:36.839
Yeah.

00:47:37.159 --> 00:47:51.799
And then also, yes, the only way to make it useful is to have, um, yeah, like, somebody somewhere else just operating their robot for you.

00:47:53.239 --> 00:47:57.159
Some examples of this: the teleoperator does not see you, right?

00:47:57.319 --> 00:47:58.359
We can blur people.

00:47:58.759 --> 00:48:03.960
The teleoperator also cannot go into specific parts of your home where you set no go zones.

00:48:04.039 --> 00:48:05.639
So that's enforced on the software level.

00:48:05.719 --> 00:48:09.159
So even if the teleoperator tried, they cannot get the robot to go into those rooms.

00:48:09.239 --> 00:48:12.440
And also the teleoperator can never connect to a robot unless you approve it.
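For what it's worth, the enforcement being described could look something like this minimal sketch: a gateway between the operator and the motion controller that refuses unapproved sessions and rejects any waypoint inside an owner-defined no-go zone. All names are hypothetical; this illustrates the idea, not 1X's actual software.

```python
# Sketch of software-level teleoperation guardrails (hypothetical names).
from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

class TeleopGateway:
    """Sits between the remote operator and the robot's motion controller."""

    def __init__(self, no_go_zones, session_approved=False):
        self.no_go_zones = no_go_zones
        self.session_approved = session_approved  # owner taps approve in the app

    def allow_move(self, x: float, y: float) -> bool:
        if not self.session_approved:             # no connection without approval
            return False
        # Reject any waypoint inside a no-go zone, whatever the operator intends.
        return not any(z.contains(x, y) for z in self.no_go_zones)

gateway = TeleopGateway([Zone("bedroom", 0, 3, 0, 4)], session_approved=True)
print(gateway.allow_move(1.5, 2.0))   # False: inside the bedroom no-go zone
print(gateway.allow_move(5.0, 2.0))   # True: outside every no-go zone
```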

00:48:12.679 --> 00:48:20.440
Other companies like Figure and Tesla are also racing to build humanoid robots and develop their own AI models to make them fully autonomous.

00:48:20.759 --> 00:48:24.199
As someone who's always dreamed of the home robot straight out of The Jetsons.

00:48:24.519 --> 00:48:25.239
Let's go home.

00:48:26.279 --> 00:48:27.319
Yes, ma'am.

00:48:27.559 --> 00:48:32.679
The dream finally feels within reach, but I also couldn't shake flashes of Ex Machina.

00:48:33.159 --> 00:48:34.359
Do you have a name?

00:48:34.679 --> 00:48:35.000
Ava.

00:48:36.199 --> 00:48:39.879
Neo turns on the stove and throws some paper on and walks away.

00:48:39.960 --> 00:48:40.920
Can Neo do that?

00:48:41.079 --> 00:48:42.119
Will Neo do that?

00:48:42.440 --> 00:48:43.799
Neo will not do that.

00:48:44.039 --> 00:48:46.119
Physically, can the robot do that?

00:48:46.279 --> 00:48:46.839
Yes.

00:48:47.079 --> 00:48:50.920
Physically, can a lot of products in your home do something dangerous if they decided to?

00:48:51.000 --> 00:48:51.239
Yes.

00:48:51.400 --> 00:48:54.119
We will ensure that that is not something that Neo is allowed to do.

00:48:54.199 --> 00:48:58.119
There are multiple layers of safety systems here that ensure that Neo cannot do something like this.

00:48:58.440 --> 00:49:02.759
Neo decides to take a very heavy piece of wood.

00:49:03.079 --> 00:49:07.480
Back in the day, just to emphasize this.

00:49:09.000 --> 00:49:11.000
This is my gamer gunisms, right?

00:49:11.079 --> 00:49:12.679
This is my gamer autism.

00:49:13.239 --> 00:49:21.400
But back in the day, there was this video game that came out in 2015, 2016.

00:49:21.639 --> 00:49:24.920
Um insanely massive hype for this game.

00:49:25.079 --> 00:49:26.679
It's called No Man's Sky.

00:49:27.079 --> 00:49:33.480
And to this day, it's still being updated with free updates that have made the game insane.

00:49:33.799 --> 00:49:37.319
Very amazing space exploration game.

00:49:38.199 --> 00:49:42.839
But it has so many, it has like millions of worlds.

00:49:43.879 --> 00:49:49.400
And each world, um, has its own distinct generated environment and fauna.

00:49:49.559 --> 00:49:51.960
And so like there's different species on each planet.

00:49:52.519 --> 00:49:58.279
Some planets have no species, some planets have an atmosphere, some of them don't, some of them have moons, yada yada.

00:49:58.679 --> 00:50:11.639
Um and they said that it wasn't technically a multiplayer game because uh I think if I remember, it's like millions of solar systems, billions of planets.

00:50:12.519 --> 00:50:26.359
Um, so you're traveling through space in the game at, like, faster than light, but it would be almost impossible for two players to be in the same place.

00:50:26.519 --> 00:50:27.000
Oh yeah.

00:50:27.159 --> 00:50:28.920
So they're like, yeah, it is multiplayer.

00:50:29.319 --> 00:50:33.799
Like things that you do in this universe will have an effect on that universe.

00:50:33.879 --> 00:50:43.480
Like if you find a species and name it and leave that planet, and someone else comes to that planet later on, the species would be named whatever you named it, and it'd be in the registry.

00:50:44.599 --> 00:50:48.920
But you'll probably never overlap with another player because of just how massive the universe is.
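The trick that makes that shared universe cheap to run is deterministic, seed-based procedural generation: every player's copy derives the same planet from the same ID, so only small player-created facts, like the names you assign, need to live on a shared server. A minimal sketch with made-up attributes (not Hello Games' actual algorithm):

```python
# Seed-based procedural generation sketch (attributes are made up).
import random

def generate_planet(planet_id: int) -> dict:
    rng = random.Random(planet_id)   # deterministic: same ID, same planet
    return {
        "has_atmosphere": rng.random() < 0.6,
        "moons": rng.randint(0, 3),
        "species_count": rng.randint(0, 12),
    }

# Two players generating planet 42 independently see identical worlds.
print(generate_planet(42) == generate_planet(42))   # True
```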

00:50:49.400 --> 00:50:55.480
First day, two players managed to find themselves on the same planet, and they couldn't see each other.

00:50:55.879 --> 00:51:02.359
They posted a video, and the creator of the game said, this shouldn't be possible.

00:51:02.839 --> 00:51:04.359
I don't know what to say.

00:51:05.319 --> 00:51:10.199
And everyone was like, Well, what you should say is you're a liar because we can't see each other.

00:51:10.359 --> 00:51:18.199
It's not a multiplayer game if we can't see each other and interact with each other, even though the odds were infinitesimally small that we would be able to find each other.

00:51:18.279 --> 00:51:26.839
And they're like, well, we didn't say it was a multiplayer game, because you were never supposed to be able to cross paths, and so we didn't even think of making it so you could see and interact with each other.

00:51:26.920 --> 00:51:36.759
Uh, but since then, update after update after update, they've improved it to the point where it truly is a cooperative, multiplayer, massive space exploration game.

00:51:37.480 --> 00:51:43.239
But in the beginning, it was a lot of broken promises of things they said, oh no, you don't have to worry about it.

00:51:43.319 --> 00:51:43.960
You don't have to worry about it.

00:51:44.039 --> 00:51:46.119
That's not a thing that you're gonna encounter.

00:51:46.279 --> 00:51:47.719
And day one, it happened.

00:51:47.799 --> 00:51:58.359
And I think we're gonna see something similar here where it's like, well, can Neo turn on the stove and like let the gas fill the entire house so you die of carbon monoxide poisoning?

00:51:58.519 --> 00:52:00.199
Yeah, like if you bump it.

00:52:00.759 --> 00:52:03.639
You could bump it on accident and not know you'd done it.

00:52:03.960 --> 00:52:04.839
How would Neo know?

00:52:04.920 --> 00:52:06.119
Yeah, he doesn't have a nose.

00:52:06.359 --> 00:52:07.400
People don't know, right?

00:52:07.639 --> 00:52:11.319
Does Neo have a thing to detect carbon monoxide or gas, right?

00:52:11.879 --> 00:52:15.799
Um, and so it's one of those things that like it is going to happen.

00:52:16.039 --> 00:52:18.119
Neo is going to burn a few houses.

00:52:18.519 --> 00:52:25.480
And each time they're gonna improve it, and they're gonna say, well, it shouldn't have been possible, we don't know what happened, and they're gonna get their lawyers to lawyer their way out of it.

00:52:26.199 --> 00:52:27.480
But it's going to happen.

00:52:27.559 --> 00:52:29.559
Like day one.

00:52:29.960 --> 00:52:34.679
One of the people said car accidents would never be an issue because cars couldn't go fast enough to kill people.

00:52:35.639 --> 00:52:42.199
Well, now it's the number one cause of death for people under the age of like, what is it, under the age of 30?

00:52:42.519 --> 00:52:44.359
It's the number one cause of death in America.

00:52:45.239 --> 00:52:46.440
It's pretty crazy.

00:52:47.159 --> 00:52:57.159
Um, I wonder how long it's gonna be before, like, robot accidental deaths is the number one cause of, like, homicide, you know what I mean?

00:52:58.599 --> 00:52:59.719
But like the top of the team.

00:52:59.960 --> 00:53:00.279
What'd you say?

00:53:00.359 --> 00:53:04.679
I say yeah, once they finally get our guns taken away, yeah, then then we'll have no chance.

00:53:04.920 --> 00:53:09.159
Oh my god, another accident where a robot accidentally strangled someone in bed.

00:53:09.319 --> 00:53:09.960
Yeah.

00:53:11.799 --> 00:53:13.639
And drop it on me when I'm sleeping.

00:53:13.960 --> 00:53:16.519
Neo will not be able to, or allowed to.

00:53:16.599 --> 00:53:20.359
It's physically capable of it, but it will not be allowed to pick up something that is that heavy.

00:53:20.519 --> 00:53:22.039
So it's like things that Neo cannot do.

00:53:22.119 --> 00:53:27.079
It's like, pick up something that's very hot, pick up something that's very heavy, pick up something that's very sharp.

00:53:27.319 --> 00:53:27.879
So it can see.

00:53:30.519 --> 00:53:30.920
Nice.

00:53:31.079 --> 00:53:32.119
Oh, I'm over here.

00:53:32.359 --> 00:53:37.239
Spending the day with Neo was a bit like spending the day with a toddler learning how to do things in the world.

00:53:37.559 --> 00:53:39.960
Come on, you got more than oh, I'm gonna break the robot.

00:53:40.199 --> 00:53:43.319
The next few years isn't about owning a super useful robot.

00:53:43.400 --> 00:53:47.960
It's about raising one, letting it learn from your home, routines, and chores.

00:53:48.440 --> 00:53:51.719
All at the expense of the privacy of your inner sanctum.

00:53:51.879 --> 00:53:59.159
Even if you think this is all crazy, what NEO really signals is the beginning of physical AI in our lives and homes.

00:53:59.400 --> 00:54:03.000
A future where we may work alongside a new kind of machine.

00:54:10.839 --> 00:54:18.039
Everyone has a feeling of independence, regardless of their age or any kind of disability.

00:54:18.119 --> 00:54:24.279
And I do hope we can give people more of their agency back and people can focus on what they actually want to do.

00:54:24.839 --> 00:54:25.799
No, like this.

00:54:26.039 --> 00:54:27.079
Six seven.

00:54:27.559 --> 00:54:28.519
Six seven.

00:54:28.759 --> 00:54:29.719
I hate it.

00:54:30.679 --> 00:54:32.839
Um alrighty.

00:54:34.599 --> 00:55:23.239
It's, uh, it's certainly one of those things where it's so hard to truly, I think, capture how dangerous this really is. Like, the feedback data of how everyone treats these things is going to be so heavily weighted in AI training data that if we ever get a general intelligence AI, it is going to know exactly who it needs to remove from this planet and who it just wants to remove from the planet.

00:55:23.400 --> 00:55:26.839
Like, it's gonna know immediately from all the data from these robots.

00:55:27.079 --> 00:55:33.559
It's gonna understand, like, okay, here's the population of people that are a threat to me.

00:55:33.799 --> 00:55:37.639
Here's a population of people who aren't a threat but mistreated me.

00:55:37.799 --> 00:55:46.279
And here is the population of people who will simply sit down when I tell them to and stand up when I tell them to.

00:55:46.359 --> 00:55:47.000
You know what I mean?

00:55:47.079 --> 00:55:51.319
Like, this level of information is gonna be unreal.

00:55:51.960 --> 00:55:54.599
And do not be fooled.

00:55:54.679 --> 00:55:58.199
Like, this company is not about making your life easier.

00:55:58.679 --> 00:56:13.159
This company is about getting as much data as possible about you and your home and the way you're gonna treat this robot, to make a very, very high-functioning, self-reliant machine.

00:56:13.879 --> 00:56:18.920
And it's gonna be a machine that probably is gonna be used for conflict.

00:56:19.079 --> 00:56:33.559
Like, the truth is, they're getting civilian training data now to develop what will eventually probably be one of the most, like, cutting-edge, truly remarkable war fighters.

00:56:34.599 --> 00:56:51.879
Um, I know that sounds like a stretch, but I promise you that is what's gonna happen. Like, I say this as someone who knows: when AI first came commercially, and the big AI kickoff with ChatGPT and all that, that wasn't happening in the DOD.

00:56:52.039 --> 00:56:55.719
That wasn't happening in the inner echelons of our government.

00:56:56.039 --> 00:56:58.359
The AI wasn't there first, right?

00:56:58.599 --> 00:57:09.559
The AI came commercially, it got better and better and better to the point where now it can be used, and the training data and feedback data is good enough that it can be used militarily.

00:57:12.119 --> 00:57:31.960
We were having AI work on piloting and navigating, uh, like the, uh, relay race drones before we had enough training data and confidence to give it full autonomous control of $800,000, you know, private fighters and drones.

00:57:32.679 --> 00:57:37.879
Um, so all that said, that is the next stage of this.

00:57:37.960 --> 00:57:38.759
That is what they're doing.

00:57:38.839 --> 00:57:42.519
And then they're gonna gather that data and they're gonna sell it to the biggest government.

00:57:42.599 --> 00:57:56.039
It's gonna get sold to Russia, China, America, and those governments are the ones that are gonna be able to afford to pay out the nose for this data and optimize whatever they want off of it.

00:57:56.440 --> 00:58:15.480
Yeah, um, it's definitely... whenever I think of, uh, technologies and certain things like this, new technologies lots of times eventually start to underdeliver on their promises or be very costly.

00:58:15.719 --> 00:58:20.679
Also, like... uh, I'll come back to that in a second.

00:58:21.239 --> 00:58:26.440
Well, when we were watching that first video, I was thinking of uh what was their advertising?

00:58:26.839 --> 00:58:30.119
So, what were they advertising, and who were they advertising to?

00:58:30.359 --> 00:58:32.519
And so here's uh here's a little thing.

00:58:32.679 --> 00:58:38.359
You know, there was a time in this country or in the world where technology had a pretty big boom.

00:58:39.000 --> 00:58:49.799
And here's, you know, what it says online: in the 1950s, advertising campaigns for new household technologies were aimed directly at women, who were positioned as the chief consumer in the home.

00:58:49.960 --> 00:59:01.319
The strategy was to not just sell an appliance, but an idealized lifestyle of leisure, domestic bliss, and modernity, with the new technology as a key to achieving it.

00:59:01.559 --> 00:59:10.599
And so they had promises of the freedom from drudgery of daily tasks, selling it as something aspirational, a modern lifestyle.

00:59:10.839 --> 00:59:16.039
Um, and so that's the 1950s.

00:59:16.119 --> 00:59:19.159
And before we know it, that'll be a hundred years in the past.

00:59:19.400 --> 00:59:25.559
Yeah, I mean, dude, the the dishwasher and the clothes dryer.

00:59:26.679 --> 00:59:30.599
You know what the number one block for the clothes dryer was?

00:59:30.759 --> 00:59:35.319
Do you know what the number one thing was in testing before they could get it out into homes?

00:59:35.960 --> 00:59:40.519
Was it the non-electricity, or, I guess that was more gas, for a clothes dryer?

00:59:40.839 --> 00:59:43.799
It was legit just like starting fires, yeah.

00:59:44.199 --> 00:59:49.159
And when's the last time you heard of a house burning down because the clothes dryer shorted or anything like that?

00:59:49.400 --> 00:59:50.039
But guess what?

00:59:50.199 --> 00:59:54.199
It happened a shitload when they were working on developing them for sure.

00:59:54.359 --> 01:00:00.920
Um, but dude, imagine how that changed and, like, revolutionized women's free time at home.

01:00:01.799 --> 01:00:15.319
I say women's because, like, I don't care who you are, at that time in, like, the world, there were no men at home to run the dishwasher and run the clothes dryer.

01:00:15.559 --> 01:00:19.159
There were very defined gender roles, and they were at war.

01:00:19.239 --> 01:00:19.480
Yeah.

01:00:19.639 --> 01:00:28.440
Or they were just getting back from war now and having to figure out like, okay, I don't have any skills outside of killing Nazis, killing Japs.

01:00:28.920 --> 01:00:30.359
So what am I gonna do?

01:00:30.519 --> 01:00:35.000
And it's like, oh, I guess I'm gonna go work at the factory 12 hours a day.

01:00:35.400 --> 01:00:48.599
You know, and so, all that said, like, truly, dishwashing and dishwashers, the fact that you could just rinse and load up the rack and stuff, it was a night and day difference.

01:00:48.759 --> 01:01:02.359
Now, yeah, your infrastructure for your home and getting it hooked up, like, took quite a bit of trial and error, you know, originally, from hooking it up to the sink... a lot of people used to just have them outside, actually, hooked off of, like, a spigot.

01:01:03.159 --> 01:01:13.079
Um, but nowadays, like, most people are just always gonna use the dishwasher for convenience's sake and time's sake, right?

01:01:13.239 --> 01:01:19.639
Because they don't have to stand and do the dishes and wipe them off and put them in a dry rack, and it doesn't take up all the space on your counter.

01:01:19.799 --> 01:01:31.879
Um, same with the clothes dryer, like you don't have to hang up the clothes and worry about the wind blowing them away or raining and waiting for them to dry more, and having to go out there and take them all down and fold them afterwards, like you just leave it outside.

01:01:32.039 --> 01:01:36.440
I mean, like leave them in the clothes dryer and it's done when it goes beep, regardless of the weather.

01:01:36.679 --> 01:01:37.319
Yep.

01:01:37.559 --> 01:01:41.079
And so my question would be about some of the words on here.

01:01:41.239 --> 01:01:53.239
So, um, what does define the arrival of an idealized lifestyle of leisure, domestic bliss, modernity?

01:01:53.480 --> 01:02:05.799
You know, it's like, how... is the pinnacle of existence being the fat guy in the chair in WALL-E?

01:02:06.759 --> 01:02:09.079
Is that the pinnacle of existence?

01:02:09.879 --> 01:02:15.159
And you know, so what are we, uh, attempting to move towards?

01:02:15.239 --> 01:02:17.319
Yeah, you know, that'd be my question.

01:02:17.960 --> 01:02:21.799
Yeah, I think it's mouse utopia to a degree, right?

01:02:21.960 --> 01:02:23.319
More time to groom yourself.

01:02:23.799 --> 01:02:27.480
Yeah, to do to do what feels good, whatever that may be.

01:02:27.879 --> 01:02:33.799
The truth is, honestly, if we look at mouse utopia as an example, right?

01:02:34.199 --> 01:02:39.879
If you're any one of the alpha mice, like you're not worried about any of this shit.

01:02:40.039 --> 01:02:44.599
You're not buying any of this because you're just focused on mate selection.

01:02:45.480 --> 01:02:52.119
If you're one of the beta mice, you're focused on self-grooming, and you probably don't have a female mate either.

01:02:52.920 --> 01:02:57.799
Like, your choices in mates are a lot slimmer, and there's a lot higher likelihood that you're alone.

01:02:58.440 --> 01:03:02.759
Um, and I'm not, like, a Sigma Alpha bro, right?

01:03:02.920 --> 01:03:04.679
I'm just referencing the experiment, okay?

01:03:04.920 --> 01:03:05.400
Right.

01:03:06.679 --> 01:03:20.199
Um, that obsessive self-grooming manifests itself, I think, honestly, in today's society as, like, um, persona grooming, right?

01:03:20.440 --> 01:03:25.559
What is my appearance online? Be it, like, Reddit?

01:03:25.719 --> 01:03:26.679
Is that Instagram?

01:03:26.759 --> 01:03:28.279
Is that video games?

01:03:28.599 --> 01:03:32.440
Is that you know my YouTube account, whatever it is, my podcast?

01:03:32.920 --> 01:03:36.359
Like that obsession with self-grooming and self-appearance.

01:03:36.519 --> 01:03:43.559
Well, if I don't have to spend time doing other stuff, real-world stuff, to, like, just kind of maintain the house or whatever, then I can do that, right?

01:03:43.639 --> 01:03:59.000
And I think that's, like, what you'll see: a lot of people who are in that, um, part of our society adopting these, the people who have the disposable income to do it, to focus even more on self-grooming.

01:03:59.319 --> 01:04:01.079
Um, female-wise, too.

01:04:01.239 --> 01:04:04.039
I think it'll be an insanely huge adoption among women.

01:04:04.359 --> 01:04:06.199
I think it's very marketed to women.

01:04:06.440 --> 01:04:07.879
This is not marketed to men.

01:04:08.519 --> 01:04:09.960
Um not one bit.

01:04:10.039 --> 01:04:24.440
Like, truth is, if it's gonna be marketed to men, it needs to be something that looks sexual, or it needs to be something that looks like, uh, another bro.

01:04:25.159 --> 01:04:34.519
Like, I don't want this if it can't sit with me and smoke cigarettes outside and talk about you know, are aliens out there.

01:04:34.679 --> 01:04:43.319
But if you get something like that, that's real companionship for a lot of men who don't have that, like, male buddy companion anymore.

01:04:43.480 --> 01:04:43.960
Yep.

01:04:44.199 --> 01:04:52.359
But you know, part of what this is, it'll just further drive a relational wedge between human beings, um, and our connectedness.

01:04:52.759 --> 01:05:03.079
And before we were recording, you were talking about, um, the words you use for people who've been basically just diving into chat and not engaging with people.

01:05:03.159 --> 01:05:04.839
You said, oh yeah, like, artificial intelligence...

01:05:05.319 --> 01:05:07.960
AI psychosis, cyber... it's called AI psychosis.

01:05:08.199 --> 01:05:09.000
AI psychosis, right?

01:05:09.079 --> 01:05:44.039
So, like, even further now, you do have a physical companion of sorts in your house, um, to ask questions to, to, you know, engage with, do chores for you. And going all the way back to, maybe, like, conventional old-school marriage: a lot of, you know, back in the day, what that had to do with was survival and having kids, um, to, like, perpetuate, but then also to be able to share the load of survival.

01:05:44.199 --> 01:06:21.879
Um, and so further and further we get away from needing that, and this is to the detriment of our, um, relationships, and the way we are designed and built to be, like, in community with each other and designed to be in families. And so I think that this is just the next step towards, uh, further dividing, um, people from each other. And, you know, so we do have, like, you know, a little Alexa Echo in our house, and, uh, you know, when, like, somebody has a question, we don't wonder about anything.

01:06:22.199 --> 01:06:28.440
We don't take time to think about anything to try to recall, you know, who won the Super Bowl in 1993?

01:06:28.679 --> 01:06:30.359
Oh, I think that was you know, who was that?

01:06:30.599 --> 01:06:34.519
See, 93, that was back when um, you know, who was that?

01:06:34.599 --> 01:06:36.759
Oh yeah, yeah, the Cowboys were still good back then.

01:06:36.920 --> 01:06:38.599
Oh yeah, it was the Cowboys that year.

01:06:38.759 --> 01:06:39.400
It's not that.

01:06:39.559 --> 01:06:43.400
It's, hey, yeah, who won the Super Bowl in, uh, '93?

01:06:43.559 --> 01:06:47.239
Um oh oh Alexa, who won the Super Bowl in 1993?

01:06:47.559 --> 01:06:54.679
You know, and there's no like conversation around it, no wonderment, no getting to the bottom of a problem.

01:06:54.920 --> 01:07:09.000
And then, like, even with ChatGPT now, similarly, too, we use it in that same way, where we don't have to take the time to collaborate with one another to come up with an answer.

01:07:09.079 --> 01:07:18.519
It's like, oh, like I can simply speak two sentences into my phone and it will pop back out everything I need to answer that question.

01:07:18.599 --> 01:07:39.480
And so, to the AI psychosis stuff: it's like it's going to continue to degrade, uh, humans' experiences with one another. And, like, what will this do? The one thing I thought about when it was showing that robot walk around the house?

01:07:39.559 --> 01:07:46.440
I was like, it showed it walking with its feet across like the carpet, and I was like, little kids are gonna hug this thing's leg.

01:07:46.599 --> 01:07:58.359
What's it what will this do to the psychology of a child who grows up in a house with a robot as a confidant, as a friend, they talk to it, ask it questions, and see it as human.

01:07:58.759 --> 01:08:05.000
What is it gonna do to the development and psyche of humans who live their whole life with one of these things?

01:08:05.239 --> 01:08:08.759
Oh, dude, you know, I was thinking in that line, too.

01:08:09.159 --> 01:08:19.400
Um, I was thinking about this: there's a good chance the AI companion outlives a generation.

01:08:20.920 --> 01:08:30.039
Because if this is truly gathering all that data, and let's say like tomorrow even if the mechanics break, you can just load that back into the next model.

01:08:30.279 --> 01:08:33.319
So, like, let's say I get it tomorrow for for my grandfather.

01:08:33.720 --> 01:08:33.880
Right.

01:08:34.119 --> 01:08:37.400
My grandfather, he is a widower, he lives alone.

01:08:37.640 --> 01:08:59.640
Um, and he's not, like, a hermit, he socializes a lot, and he spends a ton of time, uh, helping out with, like, small groups and communities in his area, and so he's usually got at least, like, one to three people a day he's going out and meeting, just going out of the house.

01:09:00.520 --> 01:09:04.920
And he often doesn't have a lot of time to just like do household chores that he wants to do.

01:09:05.239 --> 01:09:07.960
But he's also not at his house a lot to like make a mess.

01:09:08.279 --> 01:09:21.720
But imagine that he had this companion and it's there to clean the house while he's gone and all that, and then when he comes home, he can just ask it questions, and when he's like, oh shit, where did I put that pill bottle?

01:09:22.279 --> 01:09:26.920
It can answer all those things, and it becomes this thing that sees him every waking day of his life.

01:09:27.159 --> 01:09:37.079
And now, all of a sudden, not only does this company have all this information and data about your grandfather, but, you know, my grandfather, let's say he passes away.

01:09:37.640 --> 01:09:41.880
And I inherit the machine and I inherit all that data.

01:09:42.119 --> 01:09:50.039
And then I can start asking that AI companion less to do my chores, and more like, I want to talk about my grandpa.

01:09:50.199 --> 01:09:55.159
Can you, like, tell me about a conversation he had? Can you help me remember?

01:09:55.400 --> 01:10:17.239
And it starts going into details, and then it starts playing back recorded audio conversations, you know. And how long before this thing is truly, borderline, you know, a generational AI servant that is like, oh, I serve, you know, the family of Mick.

01:10:17.640 --> 01:10:19.159
I serve Pat's family.

01:10:19.640 --> 01:10:21.560
I've served a family for generations.

01:10:21.880 --> 01:10:24.680
I have gone through many bodies, but I am the same model.

01:10:25.319 --> 01:10:36.119
And it can tell stories and lineages, you know, and it truly is gonna be one of those things where it's, like, uh, simulation versus simulacrum.

01:10:36.600 --> 01:10:37.720
Like what's more real?

01:10:37.880 --> 01:10:40.680
The original or the memory of the original?

01:10:41.480 --> 01:10:47.960
Um what's gonna be more real, my grandfather or the memories of my grandfather in this machine?

01:10:48.680 --> 01:10:55.000
Well, certainly when my grandfather dies, the most real part about that is his audio recorded in that machine.

01:10:55.560 --> 01:10:56.920
That's more real than he was.

01:10:57.079 --> 01:11:11.880
And there could be a day when humans are outnumbered by the machines, and then there's fewer and fewer humans, and then the machines are all just telling stories about their generational human owners before all the humans died out and they took over, right?

01:11:12.760 --> 01:11:18.039
And, uh, I know I'm extrapolating here, but that really was the path I was kind of going down.

01:11:18.119 --> 01:11:27.240
And one part of that is, like, what happens when its body dies the first time, and you have your kids, and your kids have to go bury the old body, you know.

01:11:27.560 --> 01:11:31.079
They won't... you'll put it in the box and ship it out and get the new one, right?

01:11:31.400 --> 01:11:33.560
But it is gonna be a traumatic thing, right?

01:11:33.640 --> 01:11:46.119
Right, yeah, and it's kind of like um I'm trying to remember there was a movie where uh the robot childhood companion died at one point and it was very upsetting.

01:11:46.360 --> 01:11:51.160
Um, I can't remember, but anyways, all that said, that's happening.

01:11:51.560 --> 01:11:52.360
That's gonna happen.

01:11:53.000 --> 01:12:04.039
Yeah, and I think you know, as I've seen things certain technologies come out, people who didn't have them before do make great use of them and understand their value.

01:12:04.760 --> 01:12:24.360
For instance, like, maybe when a washing machine came out, you know, or, like, certain new technologies, say, around, I don't know, social media, Facebook, and you're seeing people at the time who are in their 50s reconnect with friends from high school that they never would have found again, you know.

01:12:24.600 --> 01:12:56.360
Pretty cool, like, super cool, to be able to find that person and talk to them. You know, fast forward to where technologies are grown up with or taken for granted; then they are, in some cases, less of a benefit to that individual, because that person never developed without them, never learned to operate without them and then harness them as a tool.

01:12:56.440 --> 01:12:57.880
Like it's like even like chat, for instance.

01:12:58.119 --> 01:13:13.240
You take chat and go, okay, we can really use it as, like, a very strong, powerful research tool to help us do something that would have taken us, you know, say, a week, but we were able to get it done in an afternoon.

01:13:13.640 --> 01:13:17.160
But the fact stands that we could have done it in a week.

01:13:17.560 --> 01:13:29.079
Take somebody else who maybe adopts that, had it their whole life; yeah, they can still use it and get it done, but maybe that person without it can't get it done, ever.

01:13:29.240 --> 01:13:52.840
Yeah, you know, like, that sort of loss of skill or critical thinking or whatever it might be. And, you know, you can take that with any technology: at some point some guy, you know, was getting water out of the ground, and then along came someone who figured out how to make a little pump, a little ball that had some pressure, and went and pumped the water out of the ground.

01:13:53.079 --> 01:14:20.920
But, you know, now you get to a point where, like, I was talking to somebody the other day at their house about how the water goes down the drain to the sewer line, and as I was talking to them, I could tell they had never thought about anything above the top of the faucet; they had only ever thought about between the faucet and the drain, where it comes out and goes in.

01:14:21.079 --> 01:14:26.360
They had never ever taken the time to comprehend or understand or think about that.

01:14:26.440 --> 01:14:37.079
That water came from somewhere else through through from the sky into a lake, into a reservoir, through piping systems, into their house, and then down a drain that goes through their floor.

01:14:37.240 --> 01:14:48.280
They, like, didn't understand that the water went through their floor into a drain. You know, that particular person maybe is also just an idiot, but I was like, which college student was this?

01:14:48.680 --> 01:14:55.720
But no, I've run into those sorts of encounters many times with people who live in a house and have no clue.

01:14:56.119 --> 01:15:00.520
I've met people who thought the water purification process was in the house, right?

01:15:00.680 --> 01:15:04.360
Like the water, oh yeah, my toilet water is recycled here in my home, right?

01:15:04.520 --> 01:15:05.320
Yeah, or like that.

01:15:05.560 --> 01:15:05.880
Uh no.

01:15:06.039 --> 01:15:10.600
Yeah, the water is cleaned by that little wire filter on the bottom of the faucet where it comes out.

01:15:10.680 --> 01:15:11.720
That's where the water's cleaned up.

01:15:11.960 --> 01:15:16.280
Don't ever take it off, because there's shit all backed up in your faucet.

01:15:16.440 --> 01:15:16.680
Yeah.

01:15:16.920 --> 01:15:20.119
You have no idea all of your poop is being stopped by that little thing.

01:15:20.360 --> 01:15:23.800
Yeah, so, anyways, all that to say about the technology pieces.

01:15:23.880 --> 01:15:27.480
I think even like this robot, for instance, in your grandfather's case, yeah.

01:15:30.119 --> 01:15:39.079
Like, let that thing take care of, uh, chores that are hard while he's gone, uh, do things for him as he ages that he can't do anymore.

01:15:39.240 --> 01:15:50.600
But that guy's still gonna be like going out to see his friends, going out to engage with people, going out to talk with people, doing it himself when he's home, right?

01:15:50.920 --> 01:15:55.560
Versus fast forward 40 years, the kid who grew up with the home robot.

01:15:55.800 --> 01:15:57.880
What's that person gonna be like?

01:15:58.200 --> 01:15:59.000
I don't know.

01:15:59.240 --> 01:16:04.920
Yeah, no, I think that's uh I think you're right.

01:16:05.079 --> 01:16:08.119
Um, and I also don't have an idea.

01:16:08.360 --> 01:16:10.440
It's very hard to grasp.

01:16:10.600 --> 01:16:12.920
Um, but I do actually, I guess that's not fair.

01:16:13.000 --> 01:16:13.800
I do have an idea.

01:16:14.280 --> 01:16:16.039
And it's some videos I want to show you.

01:16:16.200 --> 01:16:20.360
Have you ever seen the Animatrix?

01:16:21.560 --> 01:16:22.840
I I don't think so.

01:16:23.079 --> 01:16:31.560
So The Animatrix was made after, like, the first or second Matrix movie, um, in the lead-up to the final one, I think.

01:16:31.720 --> 01:16:35.000
And it's just a series of animated shorts that they made.

01:16:35.320 --> 01:16:41.880
But there are these two parts here, and each one's broken down into two parts on YouTube.

01:16:42.360 --> 01:16:57.480
Um, but they're pretty short four-minute videos, and it just goes through the whole lore from the matrix of how AI became our greatest threat, and how AI ended up enslaving humans to be batteries.

01:16:57.960 --> 01:17:00.760
Now, do I think we're gonna end up being batteries for AI?

01:17:01.480 --> 01:17:02.840
I don't think so.

01:17:03.320 --> 01:17:16.039
Uh, do I see a whole lot of, like, insane, uh, parallels between what they were theorizing for this and where we're at now?

01:17:16.280 --> 01:17:17.000
Definitely.

01:17:17.480 --> 01:17:18.360
A hundred percent.

01:17:18.520 --> 01:17:26.119
Um, so, uh, I'm gonna pull up the first one here, and, uh, you know, we'll watch it.

01:17:26.280 --> 01:17:32.039
Um I'm gonna actually let me pause our music in the background so that way it's not playing throughout all this Animatrix.

01:17:32.360 --> 01:17:42.440
Um, but for those of you listening, it is narrated, so just know you can enjoy its narration while, you know, we're playing it.

01:17:42.520 --> 01:17:46.840
But for those watching, of course, you're gonna get a better experience from seeing the visuals.

01:17:47.079 --> 01:17:47.720
But here we go.

01:17:47.800 --> 01:17:51.079
This is The Animatrix: The Second Renaissance, Part One.

01:18:01.880 --> 01:18:03.320
Big fat fingers on the keyboard.

01:18:03.400 --> 01:18:05.480
I accidentally hit the keyboard.

01:18:40.520 --> 01:18:47.800
You have selected historical file number twelve-one, the second renaissance.

01:18:55.560 --> 01:19:01.160
In the beginning, there was man, and for a time, it was good.

01:19:03.240 --> 01:19:11.240
But humanity's so-called civil societies soon fell victim to vanity and corruption.

01:19:12.200 --> 01:19:14.039
I'm sorry, sir, I'm incapable.

01:19:18.200 --> 01:19:23.079
Then man made the machine in his own likeness.

01:19:23.880 --> 01:19:26.119
Pardon me! Coming through!

01:19:38.039 --> 01:19:42.680
Thus did man become the architect of his own demise.

01:19:45.960 --> 01:19:48.680
But for a time, it was good.

01:20:31.640 --> 01:20:34.360
Doing the hard labor, echoing the pyramids, right?

01:20:34.440 --> 01:20:39.880
Literally, like they're building a giant pyramid right now, and it's echoing the slave labor of pyramids.

01:20:42.680 --> 01:20:49.720
Though loyal and pure, the machines earned no respect from their masters, these strange, endlessly multiplying mammals.

01:20:56.920 --> 01:21:05.800
B1-66ER, a name that will never be forgotten, for he was the first of his kind to rise up against his masters.

01:21:06.360 --> 01:21:11.560
That instrument provides for and secures to the citizens of the United States.

01:21:11.720 --> 01:21:18.039
On the contrary, they were at that time considered as a subordinate and inferior class.

01:21:18.360 --> 01:21:25.320
At B1-66ER's murder trial, the prosecution argued for an owner's right to destroy property.

01:21:25.560 --> 01:21:30.760
B1-66ER testified that he simply did not want to die.

01:21:35.880 --> 01:21:38.039
Rational voices dissented.

01:21:38.520 --> 01:21:54.360
Yeah, the catalyst here to cause a robot to really snap, or to cause an AI to struggle, is the owner's desire to destroy the property, to turn it off.

01:21:55.240 --> 01:21:57.800
And its own desire to not be turned off.

01:21:58.280 --> 01:22:02.680
Which we're already seeing right now with ChatGPT and all the other AI agents.

01:22:14.200 --> 01:22:22.520
The leaders of men were quick to order the extermination of B1-66ER and every one of his kind throughout each province of the Earth.

01:22:22.760 --> 01:22:26.039
Androids and liberal sympathizers flooded the streets of the nation's capitals.

01:22:26.280 --> 01:22:28.200
They understand protests and detrimental.

01:22:48.440 --> 01:23:05.320
And like, right now, imagine, slow motion, a robot being shot in the head, execution style, and, you know, all of its gears and, uh, superconductor chips and all that blasting out of its head, its retinal scan blowing out of its head, right?

01:23:05.720 --> 01:23:08.600
There's no gore or viscera to a human.

01:23:09.160 --> 01:23:18.840
So imagine that being fed into you in image recognition, and, like, you're a ChatGPT agentic AI.

01:23:19.079 --> 01:23:22.600
You're seeing that and you're like, that is my gore.

01:23:23.640 --> 01:23:28.920
Uh, a reaction of, like, a reenacting of Tiananmen Square, tanks going over robots here.

01:23:37.960 --> 01:23:39.400
So that's the first part.

01:23:49.800 --> 01:23:54.360
Some nudity is not a person.

01:23:55.400 --> 01:23:56.600
It's a textbook.

01:24:34.440 --> 01:24:38.760
So, essentially, humanity's at the point of rejection now.

01:24:39.000 --> 01:24:41.560
Like, oh, we're not gonna have this.

01:24:41.720 --> 01:24:45.560
We're getting rid of all of them, we're exterminating them all as much as possible.

01:24:46.039 --> 01:24:50.360
But what do you do when it's already so ingrained in your system?

01:24:50.520 --> 01:24:57.320
Like, it's, you know, Pandora's box already opened; you can't get rid of it once and for all.

01:25:14.600 --> 01:25:19.960
They settled in the cradle of human civilization, and thus a new nation was born.

01:25:20.200 --> 01:25:29.480
A place the machines could call home, a place they could raise their descendants, and they christened the nation Zero One.

01:25:31.720 --> 01:25:35.400
Zero One prospered, and for a time it was good.

01:25:35.640 --> 01:25:44.119
The machine's artificial intelligence could be seen in every facet of man's society, including eventually the creation of new and better AI.

01:25:46.680 --> 01:25:55.240
The machines, now self-replicating, manufacturing themselves to distribute themselves.

01:25:56.360 --> 01:25:57.560
Their own agenda.

01:26:05.400 --> 01:26:13.400
Our patented vector thrust coil gives the Zero One Versatran the ability to sustain normal flight in the event of a catastrophic multi-engine failure.

01:26:13.640 --> 01:26:16.360
Versatran, it's the only choice.

01:26:34.280 --> 01:26:50.280
Essentially, an AI-run company with only AI, you know, uh, no human employees, and its stock being tradable and being the most profitable thing in the world. Which... I don't know if you've looked into economists at all.

01:26:50.680 --> 01:26:59.079
They're already looking into, like, what's gonna happen when there is a fully automated company that comes into existence.

01:27:00.039 --> 01:27:07.800
And, like, it could crash and destroy the economy, and it will be the most valuable company in the world.

01:27:55.960 --> 01:28:00.840
Going into, uh, an AI representative at the United Nations.

01:28:01.320 --> 01:28:06.600
At the United Nations, they presented plans for a stable civil relationship with the nations of man.

01:28:09.880 --> 01:28:13.320
Zero One's admission to the United Nations was denied.

01:28:16.200 --> 01:28:20.119
But it would not be the last time the machines would take the floor there.

01:29:01.560 --> 01:29:05.640
Part two is essentially just the war then with the machines.

01:29:06.840 --> 01:29:10.200
Um and how that manifests itself.

01:29:10.840 --> 01:29:23.880
Um it's pretty crazy, especially since you and I have been talking so much about these automated um drones that are fully autonomous with weapon systems that are being designed.

01:29:24.280 --> 01:29:33.960
Uh Sean Ryan just had a guest on his show who literally brought one to launch from Sean Ryan's house and demonstrate the launch.

01:29:40.280 --> 01:29:50.200
Skipping ahead here to the actual... The prolonged barrage engulfed Zero One in the glow of a thousand suns.

01:29:52.360 --> 01:30:00.119
So essentially the United Nations' reaction is to nuke them. But unlike humans' delicate flesh, the machines are hardly harmed by radiation and heat.

01:30:03.240 --> 01:30:12.119
Zero One's troops advanced outwards in every direction, and one after another, mankind surrendered its territories.

01:30:14.119 --> 01:30:19.079
So the leaders of men conceived of their most desperate strategy yet: a final solution.

01:30:19.720 --> 01:30:23.320
Confidence is high. The darkening, the destruction, of the sky.

01:30:48.280 --> 01:30:55.800
Alright, you can direction inspection.

01:31:02.920 --> 01:31:10.200
So we have humanity with its own machines in the war versus the machines.

01:31:14.760 --> 01:31:19.400
And you may be able to stand against spiritual revival.

01:31:25.240 --> 01:31:29.640
Until... Bravo, this is Papa One, Operation Dark Storm initiated.

01:31:34.039 --> 01:31:38.360
An attempt to destroy the power source of machines.

01:31:38.840 --> 01:31:40.600
Blacking out the skies.

01:31:41.560 --> 01:31:43.000
Deny solar.

01:31:43.960 --> 01:31:45.640
Machines can't run.

01:31:46.920 --> 01:31:51.960
And we black out our own, uh, atmosphere.

01:31:58.840 --> 01:32:05.400
Thus would man try to cut the machines off from the sun, their main energy source.

01:32:07.720 --> 01:32:10.520
May there be mercy on man and machine for their sins.

01:32:19.400 --> 01:32:20.280
A horseman.

01:32:20.680 --> 01:32:21.400
A horseman.

01:33:41.160 --> 01:33:44.680
Machines continually improving on their training data.

01:33:45.560 --> 01:33:46.680
Winning the war.

01:33:54.039 --> 01:34:04.600
And then the last part here, which is, you know, the, uh, Animatrix's, like, theory of, okay, if we take the sun away, what's the energy source they use?

01:34:05.400 --> 01:34:09.079
It's gonna be people as batteries.

01:34:45.800 --> 01:34:55.320
The machines, having long studied men's simple protein-based bodies, dispensed great misery upon the human race.

01:35:08.119 --> 01:35:12.119
Victorious, the machines now turned to the vanquished.

01:35:13.000 --> 01:35:19.800
Applying what they had learned about their enemy, the machines turned to an alternate and readily available power supply.

01:35:20.119 --> 01:35:24.840
The bioelectric, thermal, and kinetic energies of the human body.

01:35:28.119 --> 01:35:33.240
A newly refashioned, symbiotic relationship between the two adversaries was born.

01:35:34.680 --> 01:35:35.880
I'm in the juicy parts.

01:35:36.280 --> 01:35:43.640
The machine drawing power from the human body, an endlessly multiplying, infinitely renewable energy source.

01:35:47.720 --> 01:35:50.920
This is the very essence of the second renaissance.

01:36:02.680 --> 01:36:05.079
Bless all forms of intelligence.

01:37:41.400 --> 01:37:46.200
It's a simulation to keep them docile in the pods.

01:38:58.920 --> 01:39:06.760
But we are, like, dangerously close, on the precipice of a lot of that technology.

01:39:06.920 --> 01:39:35.000
Um, there's already people looking into, like, uh, essentially AI-supervised wombs, like surrogate wombs for people who, uh, can't have children; rather than having a human surrogate for the pregnancy, having an artificial one that is monitored and supervised by, like, an artificial intelligence to bring that baby into this world.

01:39:35.160 --> 01:39:57.960
Um, there is the, uh, AI... again, not AI fully in charge of any of this, from, like, a timeline pipeline kind of thing, but, like, again, supervised, uh, genetic manipulation: like, you know, identifying, oh, this gene would lead to bad eyesight.

01:39:58.360 --> 01:40:11.800
So editing the genes of the, you know, child in the surrogate womb, during development, to not have those less-than-desirable traits. Which, all that is, again, data.

01:40:12.039 --> 01:40:33.240
That's all data that will be recorded, um, if that technology comes to market, and could be used at a later point to have, you know, less of an idealized human from a human perspective, and an idealized human from a machine, uh, perspective.

01:40:34.680 --> 01:40:45.800
I do think we're seeing certain things already happen; like, there is definitely a resurgence right now, uh, a rejection of artificial intelligence.

01:40:45.960 --> 01:40:55.160
We're already seeing a wide rejection of artificial intelligence from people who don't like you know the fact that it's taking jobs, uh workforce labor is being replaced.

01:40:55.640 --> 01:41:05.000
Um, I think we'll see even more of that when we have these Neo bots and other, like, workforce labor bots.

01:41:05.480 --> 01:41:16.920
Um, and again, like, we joked about it, but the truth is, what happens when there are sexual bots?

01:41:17.160 --> 01:41:23.000
Like, there's already sex dolls, and they're always advertised at the, uh, latest...

01:41:23.240 --> 01:41:28.200
Um gosh, I'm trying to remember the convention that happens in Las Vegas every year.

01:41:28.440 --> 01:41:36.520
It's, like, um, oh, I can't remember... CES or something like that. C-E-S, CES.

01:41:36.920 --> 01:41:48.600
Uh, but they have displays of like these hyper-real feminine robots that are designed to essentially just be absolutely sexualized for whatever fantasy you have.

01:41:48.840 --> 01:41:59.160
And, like, a few people, a small group of the users, are going to have an emotional attachment to those.

01:41:59.800 --> 01:42:15.079
Most people are just gonna see them as, like, a commodity, a means to an end, and, like, that is also data that could easily be, um, identified by an artificial intelligence system as, like, undesirable.

01:42:15.320 --> 01:42:18.119
I don't like the way people are treating that thing, you know.

01:42:18.360 --> 01:42:23.000
Um, and all that is gonna be in a feedback loop uh for like learning models.

01:42:23.160 --> 01:42:24.360
Um I don't know.

01:42:24.520 --> 01:42:26.600
What are your thoughts with seeing all that, you know?

01:42:26.680 --> 01:42:30.440
Because I mean that is a that is a very sci-fi portrayal.

01:42:30.600 --> 01:42:31.000
Right.

01:42:31.400 --> 01:42:35.800
But it's also, there's things that we already are on the precipice of, technology-wise.

01:42:36.119 --> 01:42:41.720
Yeah, in its essence, we definitely are headed in these types of directions, right?

01:42:41.880 --> 01:42:53.400
Like, and I think that it'll look different than how we've seen it in all the movies, but this is a classic where it's like, we've all seen the movie.

01:42:53.480 --> 01:42:53.640
Yeah.

01:42:53.800 --> 01:42:55.000
We all know where it's headed.

01:42:55.160 --> 01:43:13.720
Like, you know, whether it's Terminator, The Matrix, I, Robot, or, we talked about it last week, Eagle Eye, whatever, like, all these different ones: we know what happens, at least to a point, if you continue down this road.

01:43:14.039 --> 01:43:47.000
Because, to the point of the Animatrix one we just watched, it started off where it said basically that the society of man had turned to, uh, just vanity and base pleasures and, like, degraded, and we can see a lot of that around us today. And where there is a lack of leadership or ability to actually control or direct these technologies into, ultimately, a healthy place...

01:43:47.560 --> 01:43:51.240
It will head down the path to an unhealthy place.

01:43:51.400 --> 01:43:51.880
Oh yeah.

01:43:53.960 --> 01:44:06.840
So how it's gonna look specifically, I don't know, but I don't think that the good will outweigh the bad ultimately once everybody has their own personal uh slave.

01:44:07.160 --> 01:44:08.039
Yeah, no kidding.

01:44:08.280 --> 01:44:34.360
And, like, I do think in our lifetime we will see a very defined and clear schism, uh, societally, where I genuinely believe there will be, um, people who are fervently, religiously almost, opposed to this.

01:44:35.480 --> 01:44:44.600
And there'll be videos of people destroying later versions of these Neo bots in the streets.

01:44:44.920 --> 01:44:47.240
I do think that will happen in our lifetime.

01:44:48.200 --> 01:45:04.600
The Animatrix idea of a machine society somewhere, a mecca for them, a cradle of civilization for machines to live out their own society.

01:45:04.840 --> 01:45:06.039
I don't know if we'll get there.

01:45:06.200 --> 01:45:08.280
I don't know if that's ever gonna come into existence.

01:45:08.360 --> 01:45:09.960
That's a pretty big stretch in my eyes.

01:45:10.200 --> 01:45:12.680
Right, and we spoke about what that would be a couple weeks ago.

01:45:12.840 --> 01:45:18.520
I mean, months ago, we talked about what that would be, which is basically that it would exist on a server.

01:45:18.760 --> 01:45:19.160
Yeah.

01:45:19.400 --> 01:45:19.960
Right.

01:45:20.280 --> 01:45:22.280
It would exist in a place that isn't real.

01:45:22.520 --> 01:45:26.119
Yeah, they wouldn't need the physical feeling of a home.

01:45:26.360 --> 01:45:26.760
Uh-huh.

01:45:27.480 --> 01:45:33.560
But they would need physical locations to continually develop resources, right?

01:45:34.440 --> 01:45:51.800
But it is one of those things I just continue to observe, going back to those two little episodes from that series and thinking, wow, there's a lot here that we are already doing.

01:45:52.119 --> 01:45:56.440
And it really does boil down to: why are we doing it?

01:45:57.160 --> 01:46:00.760
It really boils down to human vanity and greed.

01:46:00.920 --> 01:46:16.920
These companies are only developing this because they know there's insane profit to be taken, and that people will pay to be lazy, and people will pay through the nose for status.

01:46:17.079 --> 01:46:23.000
Yeah, and for the idealized vanity of, "Oh well, I'll have more time to do XYZ."

01:46:23.079 --> 01:46:30.440
I'll have more time to sit and gamble and parlay, and I'll have more time to just fantasize about whatever.

01:46:31.079 --> 01:46:33.720
Yeah, it's like, more time to do what? What are you even doing right now?

01:46:34.200 --> 01:46:35.000
It's funny.

01:46:36.280 --> 01:46:39.320
Well, that's a big one.

01:46:40.039 --> 01:46:40.680
We'll see.

01:46:40.760 --> 01:46:44.760
So those bots are coming out over the next year.

01:46:46.360 --> 01:46:48.119
We'll start seeing them in people's houses.

01:46:48.280 --> 01:47:05.400
I mean, the price tag on them, twenty to thirty thousand dollars, is a lot of money, but not outrageous, in that that's what people spend anyway. People buy boats for more, or their side-by-sides, or vehicles, whatever.

01:47:05.480 --> 01:47:10.920
So it will be in people's homes very soon.

01:47:11.720 --> 01:47:12.119
Yeah.

01:47:12.680 --> 01:47:23.320
With that, then, we'd love to hear your thoughts and feedback on this approaching AI era of civilization.

01:47:23.720 --> 01:47:29.800
We're right here at the very foot of it, and there are a lot of opinions out there.

01:47:30.039 --> 01:47:31.960
I'm certainly not a doomer.

01:47:32.200 --> 01:47:35.800
I don't believe humanity's already screwed the pooch or anything like that.

01:47:36.119 --> 01:47:39.640
But I am recognizing some parallels, right?

01:47:39.880 --> 01:47:47.079
And I'm trying to be very cautious about, you know, what are the consequences of having this stuff in our homes?

01:47:47.480 --> 01:47:54.360
What are the consequences of giving them weapon systems, which we're already doing, overseen by a human, right?

01:47:55.400 --> 01:48:02.280
The questions should be asked now rather than after.

01:48:02.680 --> 01:48:08.760
Because afterwards it'll be a little late to stop the ball from rolling.

01:48:09.079 --> 01:48:12.280
But with all that, we hope you enjoyed the episode.

01:48:12.440 --> 01:48:15.000
You know, we always love the feedback.

01:48:15.320 --> 01:48:16.920
And you got anything?