
These Human-Like 'Synths' Will Make You Do a Double Take

Dr. Suzanne Gildert, Founder and CEO of Vancouver-based Sanctuary, wants her robots, known as synths, to be as lifelike as possible. We talked to her about building cognitive machines and why she created a synth that looks just like her.

July 22, 2019

Dr. Gildert with the alpha2 'Nadine' synth (Image Credit: Holly Peck at Sanctuary AI)

There are now many types of robots and just as many theories on how to build them, but Dr. Suzanne Gildert, Founder and CEO of Vancouver-based Sanctuary, wants her robots to be as lifelike as possible.

"I don't think you can have intelligence without some kind of body—even if it is an abstract sense of a body," she said in a recent interview. "All the concepts we have in our head come from the type of data we ingest, which means it is a function of our sensory perception and therefore our body."

As such, Dr. Gildert created what she calls "synths," which not only look (a lot) like us, but learn by emulating us. In her lab, Dr. Gildert slips into an exoskeleton and stands next to one of the in-development alpha3 units, teaching it to move through tele-operation. By being embodied, Sanctuary AI's synths "understand" their role within an environment—just like we do—through experiencing, recording, and replaying until they "get it" via reinforcement learning.

We spoke to Dr. Gildert about building cognitive machines and why she created one that looks just like her. Here are edited excerpts from our conversation.

At Sanctuary AI, you're not building astromech droids or countertop smart speakers. You're going for the whole 'they look like us' non-biological sentient machines. Tell us why.
That's right. We believe that, in order to understand the human mind, we require a human form factor, because our mind uses data coming in through our senses to craft our subjective experience of the world. Therefore, if the type of data going into an AI mind doesn't match the type of data going into our own brain, that AI will never be human-like, even in principle. So to ensure that the data is human-like, the body must be human-like.

That makes sense. As an aside, you've called your humanoids synths. Is this in reference to Humans, the TV series?
No. Merely a happy coincidence. To be honest, we came up with the term independently, but I am aware that it was used in the show.

It's a cooler word than robot, especially as it intimates a humanoid concept.
Right. We wanted a word that wasn't "robot," because that's too broad, or "android," which has become synonymous with the operating system.

And 'replicant' sums up the less than benign aspects of our future silicon cousins.
[Laughs] Indeed.

You're teaching the synths to move through emulation. I've done this when I tele-operated a Sarcos Guardian GT tank-like robot, and it felt amazing. Can you explain why you deploy this form of training?
In reinforcement learning—and other machine learning paradigms—it's hard to learn to do things from scratch. We could let the synth move randomly and then let it learn from trial and error; this is known as "pure RL." However, that wouldn't be good, because the synth would quickly damage itself and the environment. So you need a "good starting point" for motion paths. Teleoperation, using an exo-suit to demonstrate movement, provides that to the synth.
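Sanctuary hasn't published its training stack, but the approach Dr. Gildert describes, seeding a policy with teleoperated demonstrations before reinforcement learning takes over, resembles standard behavior cloning. Here is a minimal sketch, assuming hypothetical demonstration data of sensor readings paired with the joint commands a human pilot issued through the exo-suit:

```python
# Behavior-cloning sketch (illustrative only, not Sanctuary's actual code):
# supervised pretraining on teleoperated demonstrations gives RL a safe
# starting policy instead of random motion.
import torch
import torch.nn as nn

obs_dim, act_dim = 64, 38                  # assumed sensor features; 38 DoF targets
demo_obs = torch.randn(1000, obs_dim)      # placeholder recorded observations
demo_act = torch.randn(1000, act_dim)      # placeholder human joint commands

policy = nn.Sequential(
    nn.Linear(obs_dim, 128), nn.ReLU(),
    nn.Linear(128, act_dim),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Regress the human pilot's joint commands from the synth's observations.
for epoch in range(50):
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(policy(demo_obs), demo_act)
    loss.backward()
    optimizer.step()
```

A policy pretrained this way becomes the "good starting point" she mentions: RL fine-tuning can then explore near safe, human-like motion paths instead of starting from random flailing.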


Dr. Gildert with the alpha2 'Nadine' synth (Image: Daniel Marquardt and Sanctuary AI 2019)

Understood. Do you get extended proprioception after a 'synth fluid motion' training session while wearing the exoskeleton? I felt like my arms could reach 50 feet after my experience at Sarcos.
Yes, I do get a weird feeling—and the occasional strange dream—that I am the synth after being in an immersive suit for a while.

Let's look under the hood for a moment. Is the underlying platform ROS?
We do use ROS, the Robot Operating System, for some of the message passing, but it is only one part of the system. The rest is built in-house, and I can't share the specifics of that.
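ROS's role in "message passing" refers to its publish/subscribe model: independent nodes exchange typed messages over named topics. Here is a minimal sketch of that pattern in rospy; the node and topic names are hypothetical, not anything Sanctuary has disclosed:

```python
#!/usr/bin/env python
# Minimal ROS publish/subscribe sketch (names are hypothetical).
import rospy
from std_msgs.msg import Float64MultiArray

def main():
    rospy.init_node("synth_arm_commander")
    pub = rospy.Publisher("/synth/joint_targets", Float64MultiArray, queue_size=10)
    rate = rospy.Rate(50)  # publish joint targets at 50 Hz
    while not rospy.is_shutdown():
        msg = Float64MultiArray()
        msg.data = [0.0] * 38  # one target per degree of freedom
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```

Any other node, such as a motor controller, can subscribe to the same topic with `rospy.Subscriber`; that loose coupling is what makes ROS convenient glue between subsystems.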

Fair enough. How many DoF (degrees of freedom) do your synths have and how tall are they?
The current model we are working on—the alpha3 system—is 5 foot 7 [inches] tall and has 38 DoF: six per arm, six per hand, three in the torso, two in the neck, and nine in the face.

What's the synth skin made of and do you print them in the lab?
The synths' skin is made of silicone, but silicone isn't ready to be 3D printed—yet! But we do 3D print the chassis in its entirety in-house. We use carbon fiber printing to make parts that are strong enough to withstand the forces encountered in a human skeletal system.

Is it true we can now incorporate bio—or bio-identical—material into 3D printers to create a synthesis of us and them?
Technically, yes, but it's still early days, and it's only done within the medical research community right now. Sadly, it's not yet practical, and not used in robots, because it's hard to keep biological tissue alive. But this is definitely something I'm interested in for the future. I think we can learn a lot from biology, which has already solved a lot of the power-density and self-healing problems. In the future, I think we'll combine the best parts of biology with the best parts of mechatronics and electronics.


Sanctuary's alpha2 'Nadine' synth (Image: Daniel Marquardt and Sanctuary AI 2019)

On that note, do you see us merging in the future?
At some point, yes. I can already see robots becoming more human-like. Not just in the way they think, but in the actual physical construction of them too. For example, more soft or pliable/compliant materials are being used, including biocompatible polymers. So I think that, pretty soon, we'll be able to put robotic parts in humans, and biological parts in robots, and, in the future, it won't be "us and them"; there will be a spectrum of everything in-between.

I'm totally down for a cyborg upgrade when my bio-self starts to degrade.
I'll bear that in mind [Laughs].

Do you have any robots in the field right now?
Right now, we're "pre-commercial" and developing functionality under contract with corporate partners, so no, there are no commercial deployments at this time. Having said that, we did do an initial study with Nadine (alpha2) at Science World British Columbia, a museum here in Vancouver, just to see how people would react to an AI system "learning" in front of them.


Dr. Gildert and Nadine play Connect Four (Image: Daniel Marquardt and Sanctuary AI 2019)

What tasks did you set out for that alpha2 test?
We had the synth behind a table and showed the museum's visitors how it learns by playing games such as Connect Four. We found that people take just 15 seconds to make up their minds about the synth and whether they'd view it as a friend and helper or otherwise.

At the risk of sounding sexist, the current model is rather pretty. Did you also experiment with less visually appealing versions?
That's a good point. And yes, we did. Essentially, we can make a synth in whatever character form our corporate partners require, so it fits their environment and branding needs. We've tested much more androgynous forms, and body shapes that are medium-sized or less streamlined. We did one with a raw silicone face, kind of pasty, and it looked like something between a Japanese geisha in makeup and Data from Star Trek.

In your own quest to leave a digital legacy that's more personal than most, is it true you're creating a synth of yourself?
Yes, I am building "SuzanneSynth." She's still pretty primitive, and I don't claim she's anything like me in personality yet, but she does look like me. This is very much an exploratory side project for me, encompassing both art and science, in a way.


Dr. Gildert and her synth (Image: Daniel Marquardt and Sanctuary AI 2019)

Tell us what you're hoping to find, or achieve, with SuzanneSynth.
Well, I've always been passionate about something called "extreme lifelogging," and, with this project, I'm exploring whether you can implant memories of real human experiences into the "mind model" of a robot and then, over time, have that system come to believe it actually experienced those events.

Isn't that the central plot device for Blade Runner?
Yes, in a way. I guess I'm like the woman who makes the memories/backstories for the replicants, which they then perceive as their own.

But these are your own memories—you're not scraping social media data, or [Tyrell Corporation] family members' myths, to populate your synths?
[Laughs] No. Obviously I'm using my own memories for this, and, as I own all my own data, I can do whatever I want with it.

So as certain memories fade, as happens in us bio-beings, you'll be able to let SuzanneSynth fill you in on the adventures you've had?
That's one way of looking at it, yes.

Maybe one day your synth will pass the Voight-Kampff Test.
We're years off that possibility. But I do remember seeing the original Blade Runner when I was at university, back in England, before I did my doctorate in Experimental Physics [at the University of Birmingham], and was very struck by that scene.

And now you're building it.
I guess. [Laughs]

I'm sure you're already on Hollywood's radar then. On that note, what's your business model? I can't just buy an alpha2 online, right?
Ah, no, you can't. We offer our corporate partners a Labor as a Service (LaaS) business model after the completion of a stage-gated research engagement, where we focus on user experience and functionality testing.

Can you mention any of your current corporate partners?
Sadly, no. All of our engagements are currently confidential. But I can say we are pursuing additional corporate engagements in numerous market verticals.

For more, Dr. Gildert is speaking at the AGI conference in Shenzhen on Aug. 6.


About S.C. Stuart

Contributing Writer


S. C. Stuart is an award-winning digital strategist and technology commentator for ELLE China, Esquire Latino, Singularity Hub, and PCMag, covering: artificial intelligence; augmented, virtual, and mixed reality; DARPA; NASA; US Army Cyber Command; sci-fi in Hollywood (including interviews with Spike Jonze and Ridley Scott); and robotics (real-life encounters with over 27 robots and counting).
