Transcript: Boosting Factory Utilization: The Key to Reviving US Manufacturing?

Table of Contents

Interview

[00:00:00] Episode Intro

[00:00:00] Audrow Nash: Hey everyone. Welcome to the Audrow Nash Podcast. Today we're diving into the world of robotics and manufacturing with Saman Farid, the founder and CEO of Formic. Ever wonder how robots could revolutionize American factories? Saman and his team are tackling this head on, making robotic automation accessible to small and medium-sized manufacturers across the United States. They're not just selling robots. They're providing fully managed solutions that are transforming how factories operate. In this episode, we'll explore how Formic is using cutting edge AI to streamline robot deployment, why focusing on uptime and utilization is critical for success, the challenges facing American manufacturing, and how robotics can help.

If you are interested in robotics, AI, or the future of American manufacturing, you're in for a treat.

Let's jump in.

[00:00:53] Introduction to Saman Farid and Formic

[00:00:53] Audrow Nash: Hey Saman, would you introduce yourself?

[00:00:55] Saman Farid: Sure. My name is Saman Farid. I'm the CEO of Formic. Prior to Formic, I built and sold two software companies and then was a VC for about 10 years, where I was investing in a lot of robotics and AI, and saw a lot of challenges with the way that people are building robotics technologies and trying to get them into market.

So I started Formic to really accelerate the advent of the robot revolution.

[00:01:24] Audrow Nash: Hell yeah. Love it. So tell me a bit about Formic, the, like general details of it. How many people, when were you founded, round of funding that you guys are at? These kinds of things?

[00:01:39] Formic's Mission and Impact

[00:01:39] Saman Farid: Yeah, so we started Formic about three and a half years ago. We've raised about $60 million total over three rounds, and we have about 60 employees right now. We are deploying robots in a variety of different types of environments. We are currently deployed in about a hundred manufacturing facilities across the US, and we're continuing to scale that business very quickly.

[00:02:11] Audrow Nash: That's awesome. Yeah. It's so exciting. You guys are in the scaling phase right now. So tell me about Formic. Tell me, what do you guys do, what problems are you solving? How does your system work?

[00:02:23] Saman Farid: Yeah. What we do at Formic is we try really hard to improve the accessibility of robotics and automation for manufacturing facilities. What that means is, we drive down all of the peripheral costs related to deploying a robot by automating them with AI. So that means automating the site evaluation process, automating the configuration and deployment process, operating all of the post-deployment service and maintenance processes, so that we can provide full end-to-end, drop-in robots that customers pay for by the hour.

and what that allows them to do is, not have to become robot experts themselves. a lot of our customers, can show up and just, quickly access, the best, possible robots for as long as they need, whether that's one month or 10 years. and, we can solve a bunch of their problems, very quickly.

In a more practical sense, we are able to use technology to drive down the cost and improve the speed of deploying robots by about seven to 10x.

[00:03:41] Audrow Nash: Wow.

[00:03:42] Saman Farid: We're able to turn that into kind of instant operational deployment speed. Yeah.

[00:03:49] Audrow Nash: how do you, benchmark that? Like how, do you get seven to 10 x and that's bonkers. Big. So 700 to a thousand percent

[00:03:56] Saman Farid: That's right. Yeah. Yeah.

[00:04:01] Challenges in Robotics Deployment

[00:04:01] Saman Farid: There are a lot of stats around it, but currently in the United States, for a typical robot deployment, the timeline from deciding to do a robot deployment to actually seeing the system installed is typically on the order of four to six months.

all of the steps, involved related to that kind of site preparation, system installation, procuring all the parts and components, programming, things like that. so we're, we are typically doing that in about two weeks. so that's a massive savings, on kind of deployment speed.

In terms of cost, there's a lot more fluctuation depending on the type of system. The benchmark that we use is similar systems. As you're sure familiar, for a typical robot deployment in the US, 60 to 70% of the cost involved is in custom engineering services.

that custom engineering can be anything, right? It can be figuring out what the thickness of the concrete is that needs to go, beneath the system. It could be configuring end of arm tooling. it could be, designing all the safety configurations and interlocks and safety fencing and where the safety scanners need to be.

And all of them are site-specific. So these are not things where you could say, oh, I have an AI-powered robot that's much smarter, so I don't need that anymore. All of those things need to happen regardless. And we are able to reduce that portion of the deployment to near zero, because we've automated most of it, not by making the robot more intelligent, but by focusing on those specific processes where there's a lot of time and energy that needs to go into robot deployment.

Those are kind of the two axes that we think about. The third one is total cost of ownership of a robot. If you're a factory and you wanna deploy robots, the initial upfront deployment cost is typically only one portion.

The really big cost is actually the ongoing service and support, right? Your needs change, your programming needs to change, your hardware configuration needs to change, your safety configuration needs to change. And then with robots, wear and tear is natural, right? As all those things start to happen, there's a huge cost that's typically incurred, on the order of 10 to 15% of the cost of the system per year, to keep that system up and running in a performant state.

one of the areas where, all the software and tooling and kind of operational excellence that we've built, drives that down significantly as well.

[00:07:01] Audrow Nash: That's really cool. How do you expect, so when you were saying, like safety related things might change, what does that mean? Like, why would safety things change, during the course of operating? Is it just like new standards, they're doing new processes, or how, why are some of these things changing?

I sus

[00:07:20] Saman Farid: Yeah, so like the, I'll give you a very simple example. if you are just picking up boxes, if you're picking up a 10 pound box, over and over again, there's a certain kind of safety regime associated with that. if you're picking up a 40 pound box or a 60 pound box or 80 pound box, just that, even if it's the exact same operation and the exact same movement, the, safety constraints associated with that suddenly change a lot.

Because if that box drops when, let's say, the compressed air stops coming to that system, the system loses vacuum and that box drops, the damage that's caused by a 40 pound box versus a 10 pound box is very different. That's a very trivial and small example, but the production needs in a manufacturing facility are not static, right?

There's always things that come up. And, as those configurations need to develop, you need to readjust, you need to recalculate all of your, safe distance calculations. You need to, figure out what happens. if somebody enters the cell prematurely, does the, robot go into emergency stop?

Or does it slow down? How does it slow down? What are the things that it's interlocked to? What are the safety scanners that you need to have in order to support that? There are all these things that need to be addressed. And, yeah, the maximum speed the robot can operate at will be different.

if you're stack, let's just say you're stacking that box, into a pallet configuration, the height, the maximum height that you're able to reach, will have to change. So there's all these things that, are just fundamental physics related things that you have to address.

And I think that there's a lot of optimism around AI-powered smart robots that will solve all of that for you. Unfortunately, that's not really the case, right? I think the AI on the robot will make it easier and easier to program the robot to do that task, but at the end of the day, there are still all of these other considerations that go beyond, can I make the robot move from A to B?

There are all these other considerations that need to be taken into account.

what we've found is that there's not a lot of, emphasis being placed on those things right now.

[00:09:41] Audrow Nash: Yeah. And so that sounds really cool.

[00:09:46] Case Studies and Real-World Applications

[00:09:46] Audrow Nash: before we get into all of the really cool things you guys are doing regarding configuration, just to make the whole thing a bit more tangible, do you wanna tell me some case studies, that show the diverse sets of problems that you guys are helping to solve?

[00:10:02] Saman Farid: Yeah, so we're, we're deployed right now in, like I mentioned earlier, about a hundred plus manufacturing facilities. one of the things I'm very proud of is, about 70% of those customers, never had robots, before Formic.

[00:10:19] Audrow Nash: That's sick. Hell yeah.

[00:10:22] Saman Farid: yeah, I think we're really trying to, increase

[00:10:26] Audrow Nash: by that 30% too.

[00:10:28] Saman Farid: Yeah, there, there are definitely some that

[00:10:30] Audrow Nash: nice.

[00:10:31] Saman Farid: it's, the, story of that 30% is actually a little bit more sad than the 70%, because typically the ones that have had robots in the past, actually

[00:10:41] Audrow Nash: been a bummer.

[00:10:41] Saman Farid: yeah, they have a robot graveyard, in the corner somewhere. That's a source of embarrassment.

Because they've deployed some robots, and halfway through they didn't work, or they spent a bunch of money on 'em and they ended up getting stuck with something that didn't work. There were very few that had high-functioning robots when we arrived. But that's representative of kind of all of US manufacturing, and 90% of American factories don't have even a single robot.

We have. Yeah, it's, unbelievable. And I think the main thing is also it's, untenable, right? I think for America to have, a manufacturing base that sustains through, the 21st century and is competitive, we are drastically under tooled, right?

These, manufacturers are stuck with,

[00:11:35] Audrow Nash: percent.

[00:11:35] Saman Farid: really high labor costs, really low overall utilization of these facilities, and actually just complete labor unavailability. These facilities are really struggling.

[00:11:50] Audrow Nash: Yeah, for sure. And I'm, we'll talk about that for sure. 'cause yeah, labor shortages are such a pervasive thing. but that I think are driving a lot of interest in robotics, but, okay, so you have a hundred, roughly a hundred of these. Manufacturing like companies or what, a hundred sites or how are we referring to it?

[00:12:12] Saman Farid: Yeah. Facilities, yeah. Sites.

[00:12:14] Audrow Nash: A hundred facilities. and so what are some of the examples of what you're actually doing? What are you helping them with in those facilities?

[00:12:22] Saman Farid: So our biggest segment is producing consumer packaged goods. So we're putting in robots that help produce things like chocolate chip cookies and tortillas and matcha powder for Starbucks, and

dog food. Our robots make windshield wiper fluid, all kinds of things that are very commonly found in Target or CVS or Walgreens or Walmart.

So those types of products have an upfront kind of food production part, and then a lot of packaging-related work that needs to happen. So putting items into boxes, putting boxes onto pallets, moving pallets into trucks or into shelving units. Those are all

parts that are very highly labor intensive. So we do a lot of those kinds of things with robots.

big category for us is metal fabrication. and we have a lot of facilities that we serve that make, components for automotive, aerospace, and defense. you know what a lot of people don't often realize is that, for every one kind of GM or Ford plant, there are typically, 200 to 500 suppliers, that have to have facilities that make all the components that go to the GM and or Ford plant for the final assembly.

and there's this kind of

[00:13:57] Audrow Nash: Don't those suppliers also have suppliers? Like it just keeps going for a long

[00:14:01] Saman Farid: without a doubt. Without a doubt. and in aerospace it's even more, right? So for every single, Boeing plant, for example, you would have, about two to 3000, immediate suppliers, that are making components.

[00:14:16] Audrow Nash: Our world is unbelievably globalized. It's just the craziest thing. with so many, I dunno, factories making so many specialized things and the networks and efficiencies of scale, it's absolutely unbelievable.

[00:14:30] Advanced Technologies and Automation

[00:14:30] Saman Farid: Yeah. I love that. I don't know if you've, have you ever heard of this example that, like, nobody in the world today knows how to make a pencil?

Let alone a more complicated product, like a car or an airplane.

I think we are in a world where we're inevitably interdependent, right? I think anybody who spent time in manufacturing knows that, and, you can't just, improve one part of the, of the supply chain and say, oh, yeah, we, solved that bottleneck. that's not how it works, right?

you have to put, provide a solution that all those, few hundred suppliers and all of their suppliers can use if you really wanna see sustainable benefit.

[00:15:35] Audrow Nash: Yep. Okay. So when you do metal fabrication, what parts are you involved with? Are you just moving the metal here and there? Are you putting it into bending machines? Are you cutting it? What are the specific tasks that your robots are helping with?

Because it sounds like there's quite a bit of like logistics related things that it's you're helping with, but I wonder in some of these other domains.

[00:16:01] Saman Farid: So in CPG, a lot of it, a lot of it is packaging and logistics. in metal fabrication, the most common use cases are machine tending. and so that could be, tending a CNC lathe, a CNC milling machine, a press, which is doing, the bending, motions, welding, and inspection.

these are all kind of very common use cases. for example, one of our deployments, they're making, metal parts that end up in, in a Chrysler vehicle. and, they have to take thousands a day of these individual components. Cut them to the right length, put them into a CNC machine, let the CNC machine run a certain operation, then they take out the part and then put it into a lathe, have it do a secondary operation, and then take it out.

Deburr it, inspect it, and then put it into a box for the next step of the manufacturing process. Historically, that used to be a person that would stand there all day and kind of pick something up, put it over there, take it out, put it there. There's this very repetitive process that was also fraught with a lot of error.

and, and it was also very low utilization because imagine that process. You actually have, six or seven machines that are tied up,

[00:17:29] Audrow Nash: Yep. They're not being used in parallel.

[00:17:31] Saman Farid: Yeah. And, yeah, they're not being used in parallel and you're very limited by the schedule of that, of the, of when you have labor available.

And in typical American manufacturing facilities, especially in the metal fabrication world, they run one shift a day. So one shift a day means they're running about 175 hours a month, which is, depending on how you calculate it, 20 to 30% of the available hours in a given

[00:18:06] Audrow Nash: You're saying for an individual, like the math you did was 40 times four and a half to get 175 or something like that, and that's one person would work 175 hours

[00:18:16] Saman Farid: One, one

[00:18:17] Audrow Nash: kind of thing, but

[00:18:18] Saman Farid: because a lot of these

[00:18:19] Audrow Nash: shifts or something

[00:18:21] Saman Farid: but yes,

[00:18:22] Audrow Nash: it was labor, if labor could sustain it.

[00:18:25] Saman Farid: Exactly right. There's not enough labor available. And so most American manufacturing facilities, especially most metal fabrication facilities in the US, do run one shift a day. So what that means is there's just massive underutilization.
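
To make that utilization figure concrete, here's a quick back-of-the-envelope check in Python. The 8-hour shift and 22 working days per month are my own assumptions for illustration, not numbers from the conversation.

```python
# Rough single-shift utilization check (assumed shift length and workdays).
HOURS_PER_SHIFT = 8          # assumption: one 8-hour shift per day
WORKDAYS_PER_MONTH = 22      # assumption: ~22 working days per month
HOURS_PER_MONTH = 24 * 30.4  # ~730 available hours in an average month

staffed_hours = HOURS_PER_SHIFT * WORKDAYS_PER_MONTH   # ~176 hours, close to the 175 quoted
utilization = staffed_hours / HOURS_PER_MONTH          # ~0.24

print(f"Staffed hours per month: {staffed_hours:.0f}")
print(f"Utilization: {utilization:.0%} (idle {1 - utilization:.0%} of the time)")
```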

75% of the time these facilities are sitting idle. All of those machines that I mentioned, the CNC lathe, the mill, the inspection process, and the deburring machine, all of those are sitting idle 75% of the

[00:18:57] Audrow Nash: Most of the time. Yeah. That's wild.

[00:19:00] Saman Farid: when we put that robot in there. What happened was, step one, we helped fill an empty headcount that they couldn't hire somebody for.

So, a win already. But step two, which was even better, was then they realized, actually, I can run this robot 24/7. So they would concentrate their labor on all the upstream processes to prepare the materials. They would load up the robot before they went to bed, or left for the day.

They would load up a bunch of blanks. They would go home, and they'd come back in the morning and the robot had run another 10 to 15 hours of additional work. So now all the machines got higher utilization. the entire facility, ends up being much more productive with the same amount of labor.

and their part quality improved. They're, they're able to win more business with their customers because now they actually have more throughput. So there's all of these kind of secondary and tertiary effects that come from. being able to automate,

[00:20:04] Audrow Nash: very much. Do you? So the way I'm imagining it is, okay, we're understaffed because we don't have many people. So then what we can do, like there's not enough people for all the labor, all the, jobs that we need. So now in this machine tending example, you have a person who is moving it from one machine to the next and probably, I wonder if they're doing like.

So you start on machine one, it does the process. You move the part to machine two. Do you start another part on machine one and just keep them moving, so every machine is as fully utilized as possible? Or does the person complete the part then start over? 'Cause I'm imagining how robots could be more efficient, but

[00:20:48] Saman Farid: so you're supposed to do what you just described, which is load the second machine. Load the first

[00:20:55] Audrow Nash: As soon as the first one.

[00:20:56] Saman Farid: Yeah. And have them

[00:20:57] Audrow Nash: So everything is working as much as possible. So you have one in operation, then two, then three, then four, and then you have the max number of machines. Say you have seven machines, you have seven parts now in circulation as they're moving through.

[00:21:09] Saman Farid: But the in practice, the typical American manufacturing facility has a 200% per year, labor turnover rate, meaning people,

[00:21:21] Audrow Nash: as extreme as other industries, but that's pretty bad.

[00:21:24] Saman Farid: Yeah, for every headcount they're hiring twice a year. which means that the person who's standing, in front of that machine, probably was only working for the last two or three months, in that, at the most.

And most likely they're at the beginning of their journey. So what that means is, to train somebody to do what you're describing, which is, oh, let me make sure that every system is at max utilization, is very hard, and most of the time they end up being sequential. And there are just so many things about that process that can be done much better when a robot can actually consistently meet that performance throughput.

[00:22:07] Audrow Nash: you, so I'm imagining, okay. You have a robot, that robot knows roughly how long every machine that it's tending takes, and it has a few of them that it needs to do in series. You can probably do pretty intelligent scheduling. Is this something that you guys are doing? So you go basically, okay.

I just started this one. I just started that one before that. I know when those are gonna be done. I'm gonna just you'd have one arm moving around doing one step at a time. pretty

[00:22:36] Saman Farid: All of that ultimately boils down to what is your average cycle time on every part. And so that's something we track religiously, and then it's automatically improved. We actually surface all of that kind of real-time production data to our customer

that shows

[00:22:54] Audrow Nash: they love

[00:22:54] Saman Farid: exactly what your kind of part, rate was on every different component, in your process. and, being able to optimize it, is. Like one of the, one of the key

[00:23:09] Audrow Nash: Oh, killer.

[00:23:10] Saman Farid: What step do you do when? Do you deburr before you load or unload, right?

there's all these things that you need to figure out, in terms of what the sequence of operations should be. and, yeah, absolutely. Like you said, with a robot, like you can create a recipe, right? And then you can make sure that you hit it every single time.

The other thing that, that benefits from this, right? is part quality, right? Part quality goes up very significantly because, it's very common for operators who are standing in front of those machines to load a part in, an imperfect way. and they either tighten it too much or too little, or, they didn't realize that the tool needs to be changed or whatever it might be.

There are a lot of different things that you need to pay attention to. And if those don't all happen, you end up with subpar parts. And worst of all, you often don't realize you're ending up with subpar parts until it's way too late, right? You've created 2,000 parts, and then somebody checks them and realizes, oh, actually we missed something.
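
As a rough illustration of the per-part cycle-time tracking Saman describes surfacing to customers, here's a minimal sketch; the event format and component names are hypothetical, not Formic's actual data model.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical production events: (component_id, cycle_time_seconds)
events = [
    ("bracket-A", 14.2), ("bracket-A", 13.8), ("bracket-A", 15.1),
    ("shaft-B", 22.5), ("shaft-B", 21.9),
]

cycles = defaultdict(list)
for component, seconds in events:
    cycles[component].append(seconds)

# Surface average cycle time and the implied parts-per-minute for each component.
for component, times in cycles.items():
    avg = mean(times)
    print(f"{component}: avg cycle {avg:.1f}s -> {60 / avg:.1f} parts/min")
```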

[00:24:06] Audrow Nash: yeah. 'cause you're just doing it the same way over and over again. with an error in your process for this kind of thing. But yeah, in a robot you could just tell it the exact way to do it and then it would just keep repeating that, which would be very nice.

[00:24:21] Saman Farid: That's right.

[00:24:22] Audrow Nash: so with the cycle time, do you have, so it's really wonderful that the robots can work 24 7 as opposed to just one shift, a day.

And it's crazy because people aren't as efficient 'cause they're getting trained. 'cause the turnover is so intense. Do you have any idea how the cycle time of your systems, and I'm sure this is case by case, like application by application, but do you, have a range of how your cycle time compares to, person cycle time on various tasks?

[00:24:57] Saman Farid: Yeah. What we're seeing today is, on average, a 30 to 40% improvement in cycle time.

[00:25:05] Audrow Nash: Wow.

[00:25:06] Saman Farid: but, that improvement is generally the smaller one of the two. The main, the, bigger improvement is the increase utilization, right? the fact that you can just run it for more hours in the day, because, like you, what you're doing there is, you're getting kind of 200% benefit or 300% benefit if you go from

[00:25:30] Audrow Nash: Oh yeah, totally.

[00:25:31] Saman Farid: shifts or three shifts.

[00:25:33] Audrow Nash: Yeah, I'm imagining 130% of the products are being produced, and then you're also doing it for three times as much time. So that's, I don't know, 400% roughly, which is bonkers. That's really awesome as an improvement.
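
For reference, here's the rough math as a tiny sketch; the figures are just the ones quoted above.

```python
# Back-of-the-envelope combined gain: ~30% faster cycles and ~3x the operating hours.
cycle_speedup = 1.3    # 30 to 40% cycle-time improvement quoted earlier
shift_multiplier = 3   # one shift a day vs. running around the clock

print(f"{cycle_speedup * shift_multiplier:.1f}x throughput")  # ~3.9x, i.e. roughly 400%
```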

[00:25:52] Saman Farid: Yeah. And,

typically right, the constraints once you do that are, upstream and downstream. Because you, you

[00:26:03] Audrow Nash: You hit some other bottleneck. 'cause this part's highly optimized. I see.

[00:26:07] Saman Farid: yeah, Exactly.

[00:26:08] Audrow Nash: But then you can probably, grow out to those if the types of tasks are well suited for what you're doing. 'cause it sounds like, I'm imagining the path of you guys, maybe you started in logistics and then grew into these manufacturing tasks.

'cause a lot of it's just, assembling things in a sense. like assembling your

[00:26:30] Saman Farid: We, actually were very focused on manufacturing from the beginning. didn't, we tried not to do too much on the logistics world, right? But I think

[00:26:39] Audrow Nash: because there are many companies doing it.

[00:26:41] Saman Farid: There's already a lot of automation there. And also, in the logistics industry, the adoption rate of robotics is generally also quite a lot higher.

and we felt if we were gonna have a bigger impact, we wanted to focus on the industry where, like just robots make such a bigger difference, right? the labor availability in a warehouse is actually much higher than the labor availability in a manufacturing facility because these are typically much harsher environments.

[00:27:13] Audrow Nash: Oh,

[00:27:13] Saman Farid: yeah. Yeah.

[00:27:15] Audrow Nash: very interesting. But, so the first set of things in the consumer packaged goods sounded more like logistics tasks. Did you grow into that?

[00:27:26] Saman Farid: yeah,

[00:27:27] Audrow Nash: you, end up doing those tasks?

[00:27:28] Saman Farid: yeah. So it's not even, those tasks are not strictly logistics in the sense that we're doing,

[00:27:33] Audrow Nash: All right. I'm just thinking palletizing relating.

[00:27:37] Saman Farid: So palletizing

[00:27:38] Audrow Nash: think of that very much as logistics

[00:27:41] Saman Farid: Yeah. Yeah. we, yeah, I think by that definition, that would be a logistics task, although, like a lot of the other processes that we do in manufacturing facilities, there's, case packing, and palletizing.

There are also different inspection applications. There's depalletizing for raw materials coming in. But you're right, generally what we call those are material handling tasks. Because when I think of logistics, generally I'm thinking about ASRS systems, or typically what they're doing is moving pallets around from

[00:28:19] Audrow Nash: I don't know what the, ASRS.

[00:28:21] Saman Farid: automated storage and retrieval system. So those are like the kind of giant shelving units where, you say, Hey, I want, a toothbrush, and then, the robot will go up to the right shelf and pull it out.

[00:28:33] Audrow Nash: Oh yeah. Okay.

[00:28:35] How Formic Collects Site Data

[00:28:35] Audrow Nash: So what are the, core technologies that you, before we get to the configuration side, what are the core technologies in your robot operation, that you guys have, like core capabilities that you end up using in various places across all of your facilities?

[00:28:55] Saman Farid: Yeah. I'll break it down a little bit by phase. so in the upfront phase, a lot of, the software that we put to work is in automating the, gathering of mission parameters, right? So if I walk into a factory tomorrow and, they say, Hey, I wanna automate this task. the immediate next thing that has to happen is I need to ask them about 300 to 500 questions, right. It's what's the thickness of the concrete here? What's the ceiling height? what's the power that you have available? Do you have compressed air? what,

[00:29:30] Audrow Nash: Do they always know those?

[00:29:33] Saman Farid: and no. So that's the other, that's the other

[00:29:36] Audrow Nash: thickness,

[00:29:36] Saman Farid: Yeah, you,

[00:29:38] Audrow Nash: they're renting the space. I, don't imagine they know that part of the building, but Yeah.

[00:29:43] Saman Farid: yeah, exactly. there's the, reality is like, of the 300 questions that I need to ask, probably 70% of them they don't know. and there's, it's like even, simple things, right? what are your box sizes and weights and, what are your typical rates that you need to hit?

What is the cycle time that your human is currently hitting and how do I need, how much do I need to improve that in order for the robot to be able to do better?

[00:30:07] Audrow Nash: Oh my gosh.

[00:30:08] Saman Farid: just, all of those, are, just very, difficult to know because the, your production is just changing so fast. So you have a production plan somewhere on somebody's laptop, right?

But in reality, by the time that you show up on that site, all that information is outdated, and it's an act of God to go and gather all of that information. We have found that, for robot automation in general, it doesn't matter what kind of robot you're trying to deploy,

the first thing is you need to know what the robot needs to accomplish. and so gathering, gathering all that data, is something that technology can assist very significantly. So we use a combination of, like lidar scanners to get all the dimensional data about the facility, within, two minutes.

we get, we use, 2D cameras to observe some of the existing processes and automatically figure out what are all those manual steps that need to happen on a typical day.

[00:31:11] Audrow Nash: when you say 2D camera, you just mean like a normal camera, not a depth camera. Okay.

[00:31:16] Saman Farid: so we put in, a couple of cameras. We observe the process, over a certain period of time. and we gather most of the application data that we need around, like what is the actual operation? What are, the steps that need to happen? what are the rates and timing and things like that.

And then we've created a bunch of different tooling for data ingestion, where we can quickly connect to their ERP or CRM systems, or whether it's an Excel file or other kinds of scribbles that are on a piece of paper. Whatever kind of structured or unstructured data they may or may not have.

We feed that into our engine, which basically ingests all of that information and builds kind of a mission plan, right? What does this robot need to do? What are the constraints? And how are we able to accomplish that? There's a lot of energy and effort that went into building that.
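
To give a feel for what "mission parameters" could look like as structured data, here's a minimal sketch; every field name is a guess for illustration, not Formic's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class MissionParameters:
    """Hypothetical subset of the site/application data gathered up front."""
    task: str                       # e.g. "case packing" or "CNC machine tending"
    parts_per_minute: float         # target rate the cell must hit
    payload_kg: float               # heaviest item the robot must handle
    ceiling_height_m: float         # e.g. from the lidar scan
    floor_concrete_thickness_mm: float | None = None  # often unknown on day one
    compressed_air_available: bool = False
    unknowns: list[str] = field(default_factory=list)  # questions still open

plan = MissionParameters(
    task="case packing",
    parts_per_minute=9.0,
    payload_kg=12.0,
    ceiling_height_m=6.1,
    unknowns=["floor_concrete_thickness_mm"],
)
print(plan.unknowns)  # surface what still has to be asked on site
```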

[00:32:04] Audrow Nash: Yeah. if you had to guess for this part of it, for the gathering the data part, is that like of, say your man hours or, person hours that have gone in, would you say it's like a fairly substantial part? is it like 50% of the whole effort of what you guys have done or how would you, what would you think?

[00:32:27] Saman Farid: From an R&D perspective, I would say probably about 30% of the effort that our R&D team has expended has been on building this set of tooling. Yeah, it's not insignificant, but the hard part about building this kind of data ingestion pipeline is two things.

one is actually building it right, which is hard, but really the really, hard thing is knowing what are the right things that you need to ask.

[00:32:57] Audrow Nash: percent.

[00:32:58] Saman Farid: what are the qualification points for every type of task? and that requires a lot of end-to-end experience, right? You can't design those in a vacuum.

You need to deploy a bunch of systems and then realize I should have asked this question, right early on. and that would've really saved me a bunch of time or design effort or whatever it might be. just like a, simple thing, right? what is your ceiling height, right? if you don't ask that question, you can very much end up stuck.

The effort that is expended by the kind of R&D team to build it is one part, but really the effort that is expended to,

[00:33:37] Audrow Nash: The iteration.

[00:33:38] Saman Farid: yeah, take that knowledge that like a seasoned engineer would have in their head and turn it into, turn it into kind of software that does it automatically.

that's really where the majority of the work is.

[00:33:51] Audrow Nash: Yep. And so with that part included, so the r and d for like how it works, that's been 30% for the iteration. And, having those really powerful heuristics of what kind of information and what things we should even ask, that's probably an ongoing thing I would imagine. But I'd imagine that's been a big effort too.

[00:34:15] Saman Farid: It is, yeah. it's a continuous effort. It's very hard for me to estimate kinda like what percentage of our time and energy went into that, because it permeates into everything that

[00:34:22] Audrow Nash: It's

[00:34:23] Saman Farid: Yeah. as every time we deploy a system, we look back at did we get the things that we needed?

What are the, challenges that we encountered in the deployment, and how do we use that to improve our kind of upfront screening process?

[00:34:35] Audrow Nash: Yeah. So you have the lidar, which gives you a good, like you, you were getting a 3D map of the environment, and then you have this data ingestion part. is there anything else that you have in the initial setup for this kind of thing?

[00:34:50] Saman Farid: yeah. And so, a lot of it's just pure software based right? ingesting text and PDFs and whatever files that

[00:34:58] Audrow Nash: Yeah. And the ingester has a lot of things in it, I imagine, like it's a complex, big combination of a bunch of software, but,

[00:35:08] Configuring a Workcell with Data Gathered

[00:35:08] Saman Farid: So then, like all of those kinda mission parameters get fed to system number two, right? System number two is really where, I think there's much more, like difficulty, right?

So system one is technically less difficult, but operationally very complex. Section two of our software tools is what we call FAST, Formic Automation Software Tools, which is basically what allows us to rapidly configure a robot work cell. And what that means is we take all that information and we say, okay, what is the solution here?

for this set of mission parameters, what type of robot do I need? Do I need a big robot arm or a small robot arm, or a collaborative robot or a gantry robot? Do I need, safety fencing or do I need area scanners? Do I need this kind of conveyor or that kind of conveyor? What is the height? What is the, depth, what is the cycle rate?

What kind of end of arm tools do I need? That used to be a process that was highly manual, where a person would basically spend months, going through all of those iterations, and simulating all of them to figure out whether or not you meet the demand. and what we've built, is really basically an automated

system configuration tool, right? So we take those mission parameters, we feed them into our simulator. We iteratively cycle through different, robot designs,

[00:36:29] Audrow Nash: Oh, wow.

[00:36:31] Saman Farid: choose the right system for that task, as, as well as all the peripherals associated with it. and then we run that robot in simulation to validate what the cycle time is gonna be, right?

Because, the cycle time is often a function of capability. So let's say, I only need to hit four cycles per minute. I might be fine using a collaborative robot. and that collaborative robot, means that I don't need to have area scanners, I don't need to have safety fencing.

and a lot of things get much simpler. but that if that cycle rate goes from four parts per minute to 12 parts per minute or 20 parts per minute, Your physical footprint may stay the same, right? But all of a sudden you like, there's these cascading implications into all the different, all the different components that go into that work cell.

And as that starts to happen, you encounter a trade-off, right? You say, okay, the customer told me they only need to hit five parts per minute, but should we design it for seven, or should we design it for four? We have to gather all this information. And quoting and pricing and lead times are all highly interdependent on all those design decisions.
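
As a hedged sketch (not Formic's actual FAST tooling) of how a target rate can cascade into the rest of the cell design, you could imagine something like this; the rate threshold and costs are invented for illustration.

```python
# Hedged sketch: how a target rate cascades into cell-level choices (illustrative only).
def sketch_cell(target_parts_per_minute: float) -> dict:
    # Assumption: above ~6 parts/min a collaborative robot can no longer keep up,
    # which forces an industrial arm plus the safety peripherals that go with it.
    if target_parts_per_minute <= 6:
        return {
            "robot": "collaborative",
            "safety_fencing": False,
            "area_scanners": False,
            "rough_cost_usd": 100_000,
        }
    return {
        "robot": "industrial",
        "safety_fencing": True,
        "area_scanners": True,
        "rough_cost_usd": 150_000,
    }

for rate in (4, 12, 20):
    print(rate, "parts/min ->", sketch_cell(rate))
```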

[00:37:51] Audrow Nash: Oh, a hundred

percent.

[00:37:52] Saman Farid: And at the end of the day, you can't make those choices in a vacuum. And the way that it happens today typically is, every robot vendor has some system, right? They're like, I only make these systems, right? And so if you were a manufacturer and you go to them and you say, hey, I want a robot that loads and unloads a CNC machine, let's say, they say, I have this, right?

And that may or may not be the right thing, and you have no way of knowing. And the supplier also is unlikely to know, because they're just trying to sell you the piece of equipment that they have. And this is where the kind of power of our business model comes in, right?

It's because we are committed to the performance of that system for the entirety of its life. When we deploy a robot and we say we're gonna hit 10 parts per minute, we are committed to hitting 10 parts per minute from the day that it arrives until the day that you're done with your contract.

and so that requires, a whole different level of design and, we are then incentivized to make the right design choices, right? what is the thing that's gonna, meet that, target consistently.

[00:39:06] Audrow Nash: Yeah. It's, we talked about this in our first call, like the Alex Hormozi idea of, they'd have to be stupid to not do it kind of thing. So you're like, we're gonna make it so that we are hitting your targets from day one for this kind of thing, and we're gonna guarantee that, and fill these labor shortage gaps and make it so you have higher utilization. This is all awesome

[00:39:31] Saman Farid: And it, requires putting a lot more on the line right. Than, the typical robot supplier who says, I'll sell you the thing. You figure out how to live with it. we are set up where we take a lot of risk on, If we deploy a robot and it's not getting used, that's bad for us, right?

And we do a lot, post deployment to make sure that the customer is continuing to get the most value outta that system.

[00:39:56] Audrow Nash: Yeah. I also imagine, because you are providing metrics and everything, like if you're coming in on a customer and their supply, their manufacturing process is constantly changing due to changes in demand or whatever it is, seasonality, anything. and they're not tracking metrics very well, like you are coming in and you're adding a lot of value there.

as you supply the metrics, help them figure out where they have bottlenecks, help 'em optimize other parts of their process.

[00:40:26] Saman Farid: Yeah.

[00:40:27] Audrow Nash: So that, oh, go ahead.

[00:40:30] Programming the Robot with Generative AI

[00:40:30] Saman Farid: I was gonna say, so you know, this, kind of automated robot configuration process, it, the thing that's related to that is also automatically generating the robot program, right? to get the to, because to run the robot in simulation, you need to have a fully functioning robot program, that does that task.

and so that's another area where we spend a lot of energy is automatically generating the program for the robot so that we don't have to spend a lot of manual time kind of configuring to do the task. that, that's a kind of where, generative AI comes in, to really help, speed up the robot programming process.

[00:41:09] Audrow Nash: What? So I'm imagining with generative AI, my current model of it is that it's not that good at very fine details for things. So if I wanted it to write the full program in C++, that'd probably be hard.

[00:41:25] Saman Farid: Yeah.

[00:41:26] Audrow Nash: but it can probably pick action blocks very well for assembling your program, to accomplish some goal.

How are you guys using generative AI? Is it like higher-level behavioral things, or is it pretty low-level, or what kind of things are you doing there?

[00:41:44] Saman Farid: Yeah. so it's a mix of the two. So there's some selection of action blocks, but there's also, some modification of those action blocks based on, the components that you select, right? So if you use a vacuum tool versus a clamp tool, or if you use, a vacuum tool that has one zone versus three zones like that's a design choice that you need to make.

and then all of a sudden that impacts your programming because if you're doing, if you're, let's just, sorry, I'm, going into detail, right? But to give some

[00:42:13] Audrow Nash: no, I wanna hear it.

[00:42:14] Saman Farid: Right? If you have a three-zone tool, in order to hit, let's say, a nine parts per minute rate, you might

have a robot cycle time of three cycles per minute, but you triple pick, right? If you pick three at a time, you can hit nine parts per minute. If you're picking three at a time,

[00:42:33] Audrow Nash: Just do three. Yeah.

[00:42:35] Saman Farid: And what that means is, one, you have to validate that the robot can hit the weight and all the things associated with, picking up three at a time.

You need to have an end-of-arm tool that supports that. You need to have an indexing system upstream that supplies three parts at a time for the robot to be able to pick three parts at a time at a fast enough rate. And then you need to have your programming configured for that, right?

Because if you're picking up three parts at a time, you may not wanna drop all those three parts in the same way, right? You might pick up three

[00:43:03] Audrow Nash: Oh,

[00:43:04] Saman Farid: and you need to drop them separately. So then you need a vacuum tool that has three zones. Anyway, maybe I'm going too far into the weeds, but

[00:43:12] Audrow Nash: No, This is great.

[00:43:13] Saman Farid: What I'm trying to demonstrate here is that it's not just a matter of drag-and-drop the action block, right?

Pick up a part, drop a part, right? It's not as simple as that. The reality is, once you've made those design decisions, you have to modify all those action blocks and you need to see what that cycle time looks like in the real world, and then you may need to come back and change some of those design decisions again.

Because if you pick up three at a time, but your indexing system is so slow that you end up having a low cycle time, you may be better off picking two parts at a time or one part at a time. So these things don't scale linearly. And I think this is where, and this is going back a little bit into my career as a VC, what I found is that a lot of robotics companies were very focused on, how do I make the robot pick the thing up and drop the thing off as efficiently as possible?
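
To make the multi-pick arithmetic above concrete, here's a tiny sketch; the idea of the upstream indexer capping throughput is taken from the explanation, and the numbers are illustrative.

```python
# Effective throughput when picking several parts per robot cycle.
def parts_per_minute(robot_cycles_per_min: float, parts_per_pick: int,
                     indexer_parts_per_min: float) -> float:
    # Assumption: the upstream indexer caps how many parts can be presented per minute.
    return min(robot_cycles_per_min * parts_per_pick, indexer_parts_per_min)

print(parts_per_minute(3, 3, indexer_parts_per_min=12))  # 9.0: triple pick hits the target
print(parts_per_minute(3, 3, indexer_parts_per_min=5))   # 5.0: slow indexing wastes the gain
```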

How can I make it so smart that I don't need to tell it anything, I just need to pick up the thing and put it down. but the reality is that doesn't actually solve for all these things that we're talking about. how do you index the part? Where do you drop them? How do you drop them? what is your kind of upstream and downstream constraints?

What robot do you select to do that task? All of those are design decisions that need to happen. and what we found is like a lot of robotic startups, feel like that's the dirty work, right? They're like, I don't wanna get involved in, that. I just wanna provide a software, API and I want everybody else to figure out all those things.

and, like the reality is like that means that nobody's gonna be able to adopt, right? Because, that's, really where the rubber meets the road.

[00:44:54] Audrow Nash: Yeah, very much. Yeah. I've seen a whole bunch of robotics companies that make a product and then drop it off and, that, or they have to do all of the custom work of fitting it to the warehouse or fitting it to the manufacturing facility or whatever. Every time with every customer, and it makes it, it's a severe limit on their scaling ability.

so it, like what you guys have worked on, and I guess this probably goes back to your VC experience, is it seems like you have gone right to the hardest part, which is the configuration so that you can scale a lot easier.

[00:45:38] Saman Farid: Yeah. Yeah. using reinforcement learning to do the pick and place part is table stakes today. I've gotten pitched on that, probably a hundred times in the last two years, right? and I continue to get new people coming to me every day and saying, oh, I'm like, a really, smart, Stanford graduate and I have this really great way to pick up things and put them down with the robot and, I'm gonna save you a whole bunch of money on your programming costs.

And then the reality is that's not, true at all, right? 'cause programming the robot to pick the thing up and put it down is maybe 2% of my deployment, frustration, right? 98% of it is all these other

[00:46:15] Audrow Nash: Ha. Yeah, for sure. And especially as you want to grow, like the point of a startup is growth, I think. And if you are just, you're effectively, I've been thinking of a lot of companies as like making a distinction, where it's, you have startups and those are focused on growth, and you have contractors that are focused on, they may get revenue and they may gradually be able to get more efficient, but they are very often, doing things that don't scale very well for a long time.

And they may call themselves a startup, but they very often are effectively just consulting around the problem they're solving. and so you guys, it seems you're very much growth focused for this kind of thing,

[00:46:59] Saman Farid: yeah.

[00:47:00] Audrow Nash: the, oh, go ahead.

[00:47:03] Saman Farid: Oh, I, was gonna, I was gonna move on to the next, next block of software that we built, but, if you have

[00:47:09] Audrow Nash: I still have more questions for this. Yeah. So you mentioned reinforcement learning as a way of doing this. I was wondering, going back to the generative AI right before the reinforcement learning. So the way that I'm imagining it, and maybe this isn't quite correct, so largely you're selecting actions, but you're also modifying those actions.

That's really cool. is it like you have small parts of it that you let it so there like this is a larger action and we're saying, okay, I'm scoping you, you can do this one thing. You're writing this input output

[00:47:48] Saman Farid: that's exactly right. Yeah. We basically give it a template of what we want it to build. And then we say, here are the inputs that are different, from kinda that, that phase one, right? as we gather all this data, we generated these kind of requirements. Here are the requirements.

Here's the template I want you to fill. Here are the things that we need to figure out. and it fills 'em out.
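
A minimal sketch of the "fill a scoped template" idea: the model is constrained to a fixed set of fields rather than writing free-form robot code. The template fields and the `generate` placeholder are hypothetical, not Formic's.

```python
import json

# Hedged sketch of constrained program generation: the model fills a fixed template.
# `generate` is a placeholder for whatever LLM endpoint is in use (e.g. a hosted Llama
# model); the field names are made up for illustration.
TEMPLATE = {
    "pick_zone_count": None,     # e.g. 1 or 3 vacuum zones
    "parts_per_pick": None,
    "place_pattern": None,       # e.g. "single" or "row_of_3"
    "approach_height_mm": None,
}

def generate(prompt: str) -> str:
    raise NotImplementedError("call your LLM of choice here")

def fill_template(requirements: dict) -> dict:
    prompt = (
        "Fill every field of this JSON template using the requirements. "
        "Return JSON only.\n"
        f"Template: {json.dumps(TEMPLATE)}\nRequirements: {json.dumps(requirements)}"
    )
    filled = json.loads(generate(prompt))
    missing = [key for key in TEMPLATE if filled.get(key) is None]
    if missing:
        raise ValueError(f"Model left fields unfilled: {missing}")
    return filled
```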

[00:48:08] Audrow Nash: That makes a lot more sense to me. 'Cause the other one would be pretty finicky, I would imagine, in terms of it works sometimes, but it wouldn't work a lot of the time. But confining it would be much better. What are you guys using, are you using like Facebook's LLM, or Meta's LLM, or how, I'd be curious about your specific generative AI, I don't know, program or service you're using for this kind of thing.

For this, I would imagine that it would probably be Meta's Llama model.

[00:48:43] Saman Farid: Llama. Yeah, we use Llama with some, so

[00:48:49] Audrow Nash: Modifications for

[00:48:51] Saman Farid: and, a bunch of tooling around it.

[00:48:54] Audrow Nash: Hell yeah. It's cool. It's a neat thing. Yeah, I was imagining as a company you don't want to use, like, Claude directly or ChatGPT directly for this kind of thing. Having your own control of it would probably be good, not having, like, ChatGPT release a new version and it breaks all your stuff.

Yeah. Or just some crappy update or something

[00:49:16] Saman Farid: Yeah, But the, the good thing is right, the, quality of these, multimodal models is, getting so much better, so fast that like we can start to ingest more and more types of things and skip more and more of the steps.

Yeah.

[00:49:31] Audrow Nash: And then, just so I understand your FAST automation tool a bit more, the way I'm imagining it is you have your ingested data, which gives you a structure in a sense, like you're inferring some sort of requirements from your ingested data. And then from that, given constraints of things you've observed, you're going to try to figure out what's the best way to meet the customer's goals at the lowest price for this kind of thing.

[00:50:09] AI in the Loop: Revolutionizing Robotics Simulation

[00:50:09] Saman Farid: Yep.

[00:50:09] Audrow Nash: And then to do that, what you do is you, maybe you use LLMs again and they're selecting like base components or something like this, and these are like this robot here, this kind of thing. and then you are running that in simulation with a good way of establishing metrics for it and seeing how it does or how, does all this work?

[00:50:35] Saman Farid: So you could think of it a little bit as, instead of human in the loop, we have AI in the loop, right? So the core of what we're doing here lives in the simulator. Our simulation tool takes that configuration information and then generates and runs the full work cell based on what it has available.

and then the ai, what it does is it looks at the performance data, right? It says, look, cycle time is too low. Change this, or, space constraints are being exceeded. change that, right? and, and it goes through this process where it iterates through a couple of different variants, and usually what it spits out is three or four candidate configuration files, right?

and, that's where the human gets involved, right? Which it says, Hey, we have kinda option A, which is, a hundred thousand dollars of cost, using a collaborative robot, and we're able to hit four parts per minute. and option B, it's, $150,000 of cost. It's an industrial robot.

It takes a bigger footprint and it can hit eight parts per minute. Option C is a gantry robot, and it can do blah, blah, blah, and it can hit 12 parts per minute, whatever it might be. It'll generate those options for us. And then we do have humans in the loop, right?

At that point, who look at those configuration files and make the kind of business trade-offs related to it, right? It's, okay, if I want to hit the hourly rate that saves money for this customer, and we want a system that's relatively redeployable,

if that product changes at that customer's site, which of these, option A, B, or C, is gonna be the better fit?

Because

[00:52:20] Audrow Nash: like that you do that. You give them options

[00:52:23] Saman Farid: Yeah. because

[00:52:24] Audrow Nash: let them fine tune

[00:52:25] Saman Farid: yeah. The, it's not static, right? the, product requirements are not static. What you might have today, is something that only needs to hit four parts per minute.

but three months from now, you may win a new order and you might have to, increase that, to 12 parts per minute. And you don't wanna have to rip out that whole system and put in a new one. So you need to kind of design for a couple of, potential changes.
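
Putting the last few answers together, here's a hedged sketch of the AI-in-the-loop pattern: iterate configurations against simulated performance, then hand a short list of candidates to a human for the business trade-off. All numbers and the selection rule are invented for illustration, not Formic's actual logic.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    robot: str
    cost_usd: int
    parts_per_minute: float
    footprint_m2: float

def simulate(candidate: Candidate) -> Candidate:
    """Stand-in for running the configured work cell in the simulator."""
    return candidate  # pretend the quoted rate is what the simulation measured

def propose_options(target_rate: float) -> list[Candidate]:
    pool = [
        Candidate("A", "collaborative", 100_000, 4.0, 6.0),
        Candidate("B", "industrial",    150_000, 8.0, 9.0),
        Candidate("C", "gantry",        220_000, 12.0, 12.0),
    ]
    feasible = [simulate(c) for c in pool if c.parts_per_minute >= target_rate]
    # Surface a few options rather than one "best" answer; a human weighs cost
    # against headroom for future rate increases and redeployability.
    return sorted(feasible, key=lambda c: c.cost_usd)[:3]

for option in propose_options(target_rate=4.0):
    print(option)
```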

[00:52:51] Audrow Nash: That's super cool.

[00:52:54] Exploring the Software Stack: Tools and Technologies

[00:52:54] Audrow Nash: Okay, so how do you, can you tell me a bit about your software stack? are you using ROS, are you using Isaac Sim? Are you using, do you make your own stuff? What?

[00:53:05] Saman Farid: Not using ROS. We are using, yeah, Isaac Sim for our hardware-in-the-loop simulation processes. We also use another tool called RoboDK, which, yeah, RoboDK is a robot simulator. There's a lot

[00:53:27] Audrow Nash: give you a bunch of off the shelf robots for this kind of thing? Was

[00:53:31] Saman Farid: have the, one of the big benefits is they have a huge, template library, that we could use.

So they have all the configuration files for most of the off-the-shelf robot arms that we might use. And, 'cause a simulation

[00:53:51] Audrow Nash: Oh, is dk is that developer kit? it's like SDK, but a

[00:53:55] Saman Farid: what it stands for. I don't know what it stands for. Maybe,

[00:53:58] Audrow Nash: ' cause Robot developer kit.

[00:54:00] Saman Farid: Could

[00:54:01] Audrow Nash: okay. Just guessing

[00:54:02] Saman Farid: Yeah, it could be,

[00:54:04] Audrow Nash: based on your description. That's what it sounds like, but, okay,

[00:54:08] Saman Farid: yeah,

[00:54:09] Audrow Nash: So you use that, you have Isaac Sim for simulation

[00:54:11] Saman Farid: built a bunch of

[00:54:12] Audrow Nash: for robot configuration.

[00:54:13] Saman Farid: Yeah. We built a bunch of plugins that go on top of RoboDK that kind of automate a lot of the steps for it.
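
For flavor, here's the kind of tiny script such plugins might build on, using RoboDK's Python API; treat this as a sketch based on the publicly documented API (a station with at least one robot is assumed, and the joint angles are made up).

```python
# Minimal RoboDK API usage (sketch): list robots in the open station and run a joint move.
# Assumes the standard `robodk` Python package and RoboDK running with a station loaded.
from robodk.robolink import Robolink, ITEM_TYPE_ROBOT

RDK = Robolink()  # connects to the running RoboDK instance

robots = RDK.ItemList(ITEM_TYPE_ROBOT)
for robot in robots:
    print("Found robot:", robot.Name())

robot = robots[0]
home = [0, -90, 90, 0, 90, 0]  # made-up joint angles for a 6-axis arm (degrees)
robot.MoveJ(home)               # joint move executed in simulation
```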

[00:54:20] Audrow Nash: Okay. Hell yeah. And then, okay, so that's the FAST part of this, your automation tool. What goes on top of that?

[00:54:30] From Simulation to Reality: Building and Validating Robots


[00:54:30] Saman Farid: the, so the next phase is, build validation, right? it's one thing to have the perfect robot designed in simulation. then, physically building it often, means that you, end up with a bunch of differences, right? we, some of 'em we built in-house, but most of the time we use third parties, who build, the hardware for us based on our configuration, right?

So we say we need this robot arm and this conveyor and these sensors. And as that system is getting built, you need to validate everything.

[00:55:07] Audrow Nash: Yep.

[00:55:08] Saman Farid: If even one of the sets of wires is installed backwards, right, which happens all the time, all of a sudden the performance of that system is wrong, right?

If this sensor is connected to that IO and it's the wrong one, right, because it expects the other sensor to be connected to that IO, your performance all just goes out the window. And so we've built a bunch of tools for that. We send an industrial PC to every system builder.

And as they build the system,

[00:55:42] Audrow Nash: What's that?

[00:55:45] Saman Farid: An industrial PC, which is an industrial computer.

[00:55:50] Audrow Nash: Oh, I see. Yeah.

[00:55:51] Saman Farid: the industrial pc, sits in the control box, connects to the PLC, the robot programming and, sorry, the robot controller, and all of the other peripherals.

And it validates that everything was done correctly. It has, essentially, unit tests, right? In software it's very common to do unit tests. We basically do unit tests for hardware as well. So we'll have the robot do a certain movement, block a certain sensor, and then we'll say, did we get the reading that we expected from that sensor or not?

Or we will have the robot automatically trip some of the safety,

[00:56:31] Audrow Nash: so cool.

[00:56:35] Saman Farid: see if that triggers the response that we were expecting or not. and so we're able to validate that, the built system

[00:56:43] Audrow Nash: is what you

[00:56:44] Saman Farid: performs in the same way exactly as the simulated system.
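
Here's a rough sketch of what a "unit test for hardware" could look like on that industrial PC; the I/O and motion helpers are hypothetical placeholders for whatever PLC and robot-controller interface is actually in use, and would need to be wired up before these tests mean anything.

```python
# Hedged sketch of hardware unit tests run from the validation PC.
# `read_input`, `command_robot_move`, `trigger_safety_stop`, and `robot_is_stopped`
# are hypothetical stand-ins for the real PLC / robot-controller interface.

def read_input(io_name: str) -> bool: ...
def command_robot_move(block_name: str) -> None: ...
def trigger_safety_stop() -> None: ...
def robot_is_stopped() -> bool: ...

def test_part_present_sensor():
    """Move the arm so it blocks the part-present sensor; expect that input to go high."""
    command_robot_move("block_part_present_sensor")
    assert read_input("part_present") is True, "wiring or IO mapping is likely swapped"

def test_safety_stop_response():
    """Trip a safety input and confirm the robot actually halts."""
    trigger_safety_stop()
    assert robot_is_stopped(), "safety interlock did not stop the robot"
```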

[00:56:47] Audrow Nash: How do you, so I'm imagining this is all again, so not fully, I mean it's probably somewhat generated, but largely templated for this kind of thing where you say, okay, we added a camera. Let's make a movement that blocks it for this kind of thing.

That's super cool. I really like that

[00:57:06] Saman Farid: our deployment project manager will have a kind of, a validation checklist for every system, right? They'll say, Hey, here are the 23 things that we would check if we were on site. what percentage of those can be done? Can we do automatically?

[00:57:19] Audrow Nash: That's awesome. I really like, because these are things that are. Time consuming for most companies, and I really like that you've spent the time to automate them. And I also feel like, I'm curious because when did you, you said you guys have been running for three years or something, right?

did you say

[00:57:38] Saman Farid: Three and a half. Yeah. End of 2020 is when we started. Yeah.

[00:57:41] Audrow Nash: End of 2020. How much did you pivot with generative AI? 'Cause OpenAI came out, I don't remember when that was. Like 2022, or end of 2022, or am I wrong?

[00:57:55] Saman Farid: I don't remember exactly when OpenAI came out, but we didn't,

[00:57:59] Audrow Nash: Did you pivot hard, or did it just assist in what you were trying to do already?

[00:58:05] Saman Farid: These were all things that were on the roadmap from the beginning. They just became much more capable, right? As LLMs started to get much better, all of a sudden these tools started to become more capable. Although a lot of the things that we're doing don't have anything to do with LLMs, right?

Like the simulation piece is not really related to LLMs. The robot performance validation set is not really related to LLMs. I think AI has a part to play for sure, but a lot of these things have to draw from a foundation of a bunch of different things that people have developed over the last 20 years.

yeah,

[00:58:44] Audrow Nash: Yeah, very much. It is quite cool. And I guess you could have always had a human in the loop in some specific parts, and it wouldn't have been that much of a slowdown, most likely. But you get better efficiency using these kinds of tools, even if a person then curates it or says, ah, these results suck, run it again, or whatever it might be.

[00:59:07] LLMs for Data Ingestion

[00:59:07] Audrow Nash: How does your data ingestion part work? So if they have scraps of paper and things like this, how do you get that initial data about their setup? I understand the LIDAR scan, but understanding a lot of the other things about them, how does that all work? That sounds like a thing that LLMs would be pretty good at.

[00:59:27] Saman Farid: Yeah, that is something that helps a lot. So usually what happens is our salespeople who go on site know all the things that they're supposed to gather, and they just take pictures and videos of all the things that they can, right? So if the person who's working on the site is doing a certain set of motions, they'll take a picture of whatever they've written down. If you have boxes that have serial numbers and weights and sizes written on them, they'll take a picture of that.

That whole set of things gets loaded into our configuration tool, right? And that's what generates the mission parameters. It can come in a variety of different formats. It can even come in the form of audio recordings. Like, our salespeople use Chorus, which is a tool that records meetings.

Usually in those meetings, a lot of the information that we need to gather is asked and answered, right? So that also gets fed into the same model for that project. So every project basically has a set of context data that it's given, and then the LLM is responsible for filling out this form.

[01:00:39] Audrow Nash: So you have a data structure, effectively, and it's filling it out.

[01:00:42] Saman Farid: And then surfacing the things that it doesn't have. It says, hey, nobody asked about this thing that I need.

[01:00:47] Audrow Nash: Oh, ooh.

[01:00:49] Saman Farid: Can somebody go back to the customer and ask about it?

[01:00:51] Audrow Nash: How do you do that? Because that's a very interesting thing, 'cause LLMs are not that good at knowing what they don't know, and having it say, I don't have this information, is very interesting to me.

[01:01:03] Saman Farid: A lot of that just comes with the right kind of prompting, right? It's, do you know this or not?

[01:01:07] Audrow Nash: Ah ha

[01:01:08] Saman Farid: did you, can you find

[01:01:10] Audrow Nash: Did you see this or not? You say yes or no. I see. Yeah, since ChatGPT I have been looking for how to use LLMs in some way to make my podcast posting more efficient. And it's very funny, 'cause I feel like in some sense I'm trying all the things right as they're available.

And so it's a very good excuse to become quite familiar with them, but it's very funny how tough it is to reduce some hallucinations. But I guess just asking, did you see this, and if so, what is it? It can probably answer, no, I did not see it, pretty reliably, and then you can ask it to check itself, that kind of thing.

[01:01:55] Saman Farid: Yeah. Yeah.
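A minimal sketch of that kind of form filling with explicit unknowns, using a generic text-completion function and made-up field names; it is not Formic's actual configuration tool:

```python
import json

# Illustrative mission-parameter fields; the real schema would differ.
FIELDS = ["part_weight_kg", "part_dimensions_mm", "target_cycle_time_s", "boxes_per_pallet"]

PROMPT = """You are filling out a deployment questionnaire from site notes, photo
descriptions, and meeting transcripts. For each field, give the value only if it was
explicitly stated; otherwise answer "UNKNOWN". Do not guess.
Return JSON with exactly these keys: {fields}.

Context:
{context}
"""

def fill_form(llm_complete, context_docs):
    """llm_complete is any text-in, text-out completion function."""
    prompt = PROMPT.format(fields=FIELDS, context="\n---\n".join(context_docs))
    data = json.loads(llm_complete(prompt))
    missing = [k for k in FIELDS if data.get(k) in (None, "UNKNOWN")]
    return data, missing  # 'missing' becomes the list of questions to take back to the customer
```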

[01:01:56] Audrow Nash: The new Llama model, I can access it through Perplexity. I haven't actually run it locally. And Claude 3.5 Sonnet is really good. Like, they're genuinely useful, which is just super cool, 'cause nothing was useful until Claude Opus for me.

[01:02:13] Saman Farid: Yeah. My perspective on a lot of these LLMs is that the large language models themselves are never gonna be perfect, right? The thing that matters is whether you have a good filtering mechanism for the output that comes out of them, right? You need to have rule-based things a lot of the time that check the output of the AI model.

Because oftentimes you know what your expected output should be, it's one of these few options or whatever, right? And the LLM is not very good at sticking to the right answer, but you can check that with a rule-based system. So I feel like you need to sandwich the LLM.

Upstream, you need to give it information in a very specific way. That's the prompt, right? So you constrain the input, and then on the output you need to sandwich it with another rule-based filtering mechanism, right? It does whatever it's gonna do in its black box, but you have a screen for what actually comes out of the other side.

So you constrain the problem a little bit better, because

[01:03:20] Audrow Nash: both ends.

[01:03:21] Saman Farid: Yeah. Most of the time, I'm not gonna ask an LLM to write a poem for me, right? Not in the context of Formic. So I know that I can limit the kinds of outputs that are gonna come out, right?
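A minimal sketch of that input/output sandwich, with a hypothetical allowlist of gripper types; the point is the rule-based screen on both ends, not any specific Formic API:

```python
# Constrain the input with a narrow prompt, then screen the output against a
# rule-based allowlist before anything downstream consumes it.

ALLOWED_GRIPPERS = {"vacuum", "two_finger", "magnetic"}  # illustrative options only

def classify_gripper(llm_complete, part_description: str) -> str:
    prompt = (
        "Pick exactly one gripper type for this part from "
        f"{sorted(ALLOWED_GRIPPERS)}. Answer with the single word only.\n"
        f"Part: {part_description}"
    )
    answer = llm_complete(prompt).strip().lower()
    if answer not in ALLOWED_GRIPPERS:  # rule-based screen on the output side
        raise ValueError(f"LLM returned {answer!r}; flag for human review")
    return answer
```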

[01:03:34] Audrow Nash: Yep.

Yeah,

Very true. It's an interesting time with all this.

[01:03:41] Saman Farid: Yeah.

[01:03:41] Audrow Nash: Okay.

[01:03:43] Customer Support and Maintenance: Ensuring Uptime

[01:03:43] Audrow Nash: So you have your RoboDK and the simulation, and you're performing validation on the actual hardware to make sure that it's working as you would expect. What do you do from there? Do you install it at the customer's site, or how do we proceed from there?

[01:04:03] Saman Farid: Yeah. The next step is usually a full acceptance test, which I described parts of. We do the acceptance testing at the system builder, whoever's building the system for us. We run a full acceptance test. If it passes, we ship it to the customer site, we install it, and then we run a secondary acceptance test in their environment, right?

Because there are usually differences, for a variety of reasons. We do that acceptance testing, which usually happens in about half a day. And then we also generate a bunch of peripheral information. Usually that looks like training handbooks for the operators who are there.

[01:04:48] Audrow Nash: Cool.

[01:04:49] Saman Farid: Because you have a bunch of people who are interacting with that machine, you need really clear instructions for how to interact with the robot. If you've never been around a robot, how do you start it? How do you stop it? What do you do on this error? What do the different lights mean?

And then we also put QR codes on all of the components, and

[01:05:08] Audrow Nash: Oh, that's

[01:05:10] Saman Farid: The operators who are on site can scan a part, and then it notifies us, right? And we can log in via telepresence and ask that operator questions. We can also surface little video files that say, hey, here are all the things that you can do to resolve this issue, right?

If a suction cup is blocked, check and see if there's debris in there. If there isn't, hit this reset button, or call us, or we'll come on site, whatever it might be. We generate those procedures upfront. And then we have a bunch of software that we call Colony, which is basically our robot management tool set, which allows us to remotely manage and monitor all of these systems that we've deployed.

we every system
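A rough sketch of how the QR-scan flow just described could work, with purely hypothetical IDs, payloads, and callbacks:

```python
# Hypothetical QR-scan support flow: the scanned payload identifies the deployment and
# component, which keys into pre-generated troubleshooting videos and notifies support.

TROUBLESHOOTING = {
    ("palletizer-0042", "vacuum_gripper"): [
        "videos/check_suction_cup_debris.mp4",
        "videos/reset_vacuum_pump.mp4",
    ],
}

def handle_qr_scan(payload: str, notify_support, show_on_hmi):
    deployment_id, component = payload.split(":", 1)   # e.g. "palletizer-0042:vacuum_gripper"
    notify_support(deployment_id, component)           # support can then join via telepresence
    for video in TROUBLESHOOTING.get((deployment_id, component), []):
        show_on_hmi(video)                             # play the relevant clip on the cell's screen
```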

[01:06:02] Audrow Nash: Did you build that? That's internal, like it's a tool you guys have developed. Okay.

[01:06:06] Saman Farid: Yeah, we built it internally. We tried not to have to. We evaluated other ones that were out there,

[01:06:13] Audrow Nash: didn't meet your needs.

[01:06:14] Saman Farid: They didn't meet exactly the things that we needed. Yeah, exactly. Because our use case is a little bit complex, right?

One is, we need to not just monitor the robot, we need to monitor all the peripherals, right? So all the servos and PLCs and conveyance, all of that needs to be monitored as well. There's complexity there. Number two, we need video data from multiple cameras on that work cell at all times.

Three, we need to be able to intervene when there's an issue. About 60 or 70% of the time, if there is an error on the robot, we can resolve it remotely without having to come on site, and that saves a bunch of time and cost.
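A minimal sketch of the kind of fleet check a tool like this implies, with made-up telemetry fields and thresholds; it is not Colony's actual design:

```python
# Hypothetical per-cell telemetry check covering the robot plus its peripherals.
def check_cell(telemetry: dict) -> list[str]:
    """Return alerts for anything outside expected bounds."""
    alerts = []
    if telemetry["robot_state"] == "faulted":
        alerts.append("robot fault: try a remote reset before dispatching a technician")
    if telemetry["robot_state"] == "running" and telemetry["conveyor_speed_mps"] == 0:
        alerts.append("conveyor stopped while robot running: check PLC and drive")
    if telemetry["vacuum_kpa"] < 60:  # illustrative threshold
        alerts.append("low vacuum pressure: possible blocked or worn suction cup")
    return alerts

def monitor(fleet_telemetry_stream, page_on_call):
    # fleet_telemetry_stream yields (cell_id, telemetry) readings from deployed systems.
    for cell_id, telemetry in fleet_telemetry_stream:
        for alert in check_cell(telemetry):
            page_on_call(cell_id, alert)  # most of these end up resolved remotely
```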

[01:06:59] Audrow Nash: oh, for sure. Yeah.

[01:07:01] Saman Farid: And we just had all of these relatively unique needs for our robots,

[01:07:07] Audrow Nash: didn't exist that well.

[01:07:08] Saman Farid: Yeah.

[01:07:11] Audrow Nash: Very cool. When someone has to intervene, do you know Plus One Robotics? To me, one of the big differentiating things about them is that they have a very good workflow around human in the loop for when their robots get stuck. So they're trying to pick up a box that I taped together to ship back an Amazon package, and it's total chaos and it's falling apart. And so then they have a human say where you can pick it up, or whether it should be discarded or whatever.

Or not discarded, but reprocessed or something. Do you guys have a similar human-in-the-loop system, or how do you handle these kinds of things?

[01:08:03] Saman Farid: Yeah. So for handling edge cases, we do have humans in the loop. We have our service engineers who are monitoring all the systems 24/7. If there's an edge case that needs human attention, it pings one of them. They log in and they can remotely support. The amount of things that a remote person can do is quite limited, though, so we do have to design for it upfront, right?

We need to put in place procedures for what happens if different kinds of errors are tripped. To your example about tape, for a palletizer, which is one of the simplest system types that we deploy, the most common failure mode is that the box has not been taped properly.

So the robot picks up the box, but if there's no

[01:08:56] Audrow Nash: flaps up, or

[01:08:57] Saman Farid: Yeah. Or if there's no tape on the bottom, all the contents just spill out. So that's a common occurrence. And it's not the robot's fault per se, but because we're committed to overall uptime and performance, we still need to detect that situation and avoid it before anything else.

So we have a vision model that's trained on tape detection and flap-open detection. If it detects that the tape is missing or the flaps are open, it either pauses the system, or, if there's a place for it to push the box to the side and keep running, then it will do that.

Every system type has a bunch of different failure modes you need to account for and build into the system.
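A rough sketch of that divert-or-pause logic for a palletizing cell; the `detect_box_defects` vision call and the `cell` interface are hypothetical stand-ins:

```python
# Hypothetical per-box failure-mode handling for a palletizing cell.
def handle_box(image, cell, detect_box_defects):
    defects = detect_box_defects(image)  # e.g. {"missing_tape", "flaps_open"} from a vision model
    if not defects:
        cell.palletize()
        return
    if cell.has_reject_lane:
        cell.divert_to_reject_lane()     # keep the line running, flag the box for an operator
        cell.log_event("box rejected", sorted(defects))
    else:
        cell.pause()                     # no safe place to put it, so stop and alert
        cell.notify_operator(f"Box defects detected: {sorted(defects)}")
```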

[01:09:41] Audrow Nash: Now, let's see. Going back a little bit, when you said you build a technical document for the workers so that they have exact information on the system you've set up, I imagine that's fairly similar to a number of the things we've talked about, where it's largely template driven, and then you also have custom text using LLMs, that kind of thing.

Is it the same kind of thing?

[01:10:20] Saman Farid: Yes, there's a little bit of it that's automatically generated, although there's a human heavily involved in that. But I'll say that the challenge is, in a real production environment, it's very unlikely that somebody's gonna stop what they're doing and go read a handbook.

[01:10:37] Audrow Nash: Yeah, but you probably want quick reference points for this kind of

[01:10:41] Saman Farid: so the way we generally do that is with video. so we

[01:10:46] Audrow Nash: Oh,

[01:10:46] Saman Farid: have 10 to 20 different common failure modes that we've accounted for. And we just pop up a little video on the screen of that robot that says, hey, here's what we think might be happening.

Here are the steps that you take, and hit this button if it's not resolved, and it'll call us. And then we can

[01:11:12] Audrow Nash: That's so cool. So you like pull up the YouTube video they need to see, but it's internal probably. That's super cool. So yeah, you're not focusing on the documentation because they're not gonna use that to solve things. But you have uptime requirements and so you have a very streamlined process for them, which often is, this is a video of exactly what you need.

And through your data analysis, through collecting data, you understand a lot of the common things that may go wrong, and you probably have some ability to introspect the robot to understand what may go wrong. And then you pull up the correct video, they watch it, they clear it out, or they call you guys if they cannot figure it out.

[01:11:52] Saman Farid: The most common sequence of events is that the first thing people do when the system goes down is call us, right? We have a 1-800 number. They call us, one of our technicians answers, asks them a few questions, and says, okay, here's the issue.

And we can remotely trigger the video to start playing on the screen. So we say, go to the

screen, look at this video, it's gonna tell you what to do.

[01:12:13] Audrow Nash: When are you gonna do LLM voice for your technician, to just direct them to the video?

[01:12:20] Saman Farid: I think we're a little bit of a ways away from that still, but hopefully one day.

[01:12:24] Audrow Nash: Yeah. Yeah. Hopefully one day.

[01:12:26] Scaling and Future Challenges in Robotics

[01:12:26] Audrow Nash: I wonder, so one thing that is interesting to me about you guys so far is that it seems like you're doing your technical support for your customers in-house. Several other companies that I know of have scaled and maybe are a bit further along, and they seem to outsource technical support for this kind of thing.

And I'm just wondering about, I guess your view on technical support and is it something that should always be in-house or how do you think about it?

[01:13:05] Saman Farid: Yeah, I'm gonna sound a little bit like an asshole saying this, but who's further along? I don't know of any other robotics startups that are in a hundred-plus manufacturing facilities.

[01:13:16] Audrow Nash: Not in manufacturing. I'm thinking

[01:13:18] Saman Farid: Right? They're typically in three to five facilities.

Maybe 10, maybe 20 if they're really pushing it. There are very few robotics startups that are actually managing a fleet of systems across a hundred-plus facilities. And I'm not saying that to push back, right? But what I'm trying to say is,

[01:13:38] Audrow Nash: yeah,

[01:13:39] Saman Farid: I think that outsourcing service and maintenance is one of those things that sounds nice, right?

A lot of people are like, yeah, I can't scale if I do it myself. And that's just a very common fallacy that I think a lot of robotics companies fall into, where they say, I wanna do the things that scale, which leads them to, I just wanna provide an API and I'll let somebody else do all the work.

And it's like, that's not the hard part. But that is the hard part, right? Doing service and maintenance is the hard part. That's the whole key, right? If you don't do that, your solution is worthless. You can't just drop it off and give it to a third party. And I think this is where, if I have any advice to give to robotics companies, it's don't think you're too good to do maintenance.

Don't think you're too good to write documentation, don't think you're too good to go custom design an end-of-arm tool. 'Cause those are the things that are necessary to drive massive adoption. Deploying systems in a hundred-plus facilities is a herculean task. There may be other companies that have more robots deployed than us, but they may have a hundred robots deployed at one facility, for example, right?

That's a totally different class of problem than having robots deployed at a hundred-plus facilities, right? Every facility has a bunch of its own idiosyncrasies. You have different people you have to interact with, different operators with smaller amounts of training, less onsite support. I think we've gone much farther on solving the post-deployment issues than anybody else on the planet today.

[01:15:12] Audrow Nash: Hell yeah. Okay. Super cool. So you view it as a core competence of the company to be able to service its robots over time. 'Cause you're right, you can't really just drop 'em off, and if the people who are supposed to maintain them are not super good, you may get some variation in quality, and that's a mixed experience for customers.

[01:15:36] Saman Farid: And I think it depends what your goal is, right? If your goal is to sell robots and get them out the door, then maybe you don't need to do what we're doing. If your goal is to have robots deployed and be useful for a long period of time, it's different, 'cause the metric that we track is not sold units.

The metric that we track is usage hours. Every month, every day, every morning, we look at the number of usage hours from yesterday on our robots. Did they actually do useful things for 10 hours a day, or one hour a day? And if your metric is that, then you have to be responsible for the service of that system.

If your metric is, did I sell things and get them out the door, then yeah, you don't need to track how they did once you've sold them.
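A minimal sketch of that usage-hours rollup from per-robot activity logs; the log format and robot names are made up for illustration:

```python
from collections import defaultdict
from datetime import date

# Each log entry is assumed to be (robot_id, day, hours_of_useful_work).
def usage_hours_by_robot(logs, day):
    totals = defaultdict(float)
    for robot_id, log_day, hours in logs:
        if log_day == day:
            totals[robot_id] += hours
    return dict(totals)

# Example: yesterday's numbers the team would look at each morning.
logs = [
    ("palletizer-0042", date(2024, 7, 1), 18.5),
    ("case-packer-0007", date(2024, 7, 1), 2.0),
]
print(usage_hours_by_robot(logs, date(2024, 7, 1)))
# {'palletizer-0042': 18.5, 'case-packer-0007': 2.0}
```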

[01:16:21] Audrow Nash: Yeah. I really like the idea of using usage hours as your proxy for success, because a lot of times with new technology, what happens is a lot of bigger companies, huge companies in manufacturing, but also agriculture and other places, will buy technology just to try it out.

And it looks promising for the startups that are getting these early trials, especially if it's a prominent company. But then it's not always a land and expand. It may be a try, or we'll just keep you at the same level to see how it develops over time, or something like that.

[01:17:05] Saman Farid: Yeah. I think the most damning thing is a lot of these robots get deployed into innovation centers, right?

[01:17:13] Audrow Nash: ha,

[01:17:14] Saman Farid: environments. And they think of it as a point of pride, where it's, oh, I got deployed in so-and-so's innovation center. Procter & Gamble is notorious for this, right?

Procter & Gamble has this giant innovation center of stuff that they will buy and show off and then never use in production. And so we have flatly refused anything like that. Our systems are only ever deployed in real production environments. We never do proofs of concept.

[01:17:42] Audrow Nash: Love it.

[01:17:42] Saman Farid: If it's not being used 20-plus hours a day, we're not interested in doing that project.

[01:17:49] Audrow Nash: So at your current stage, being in a hundred facilities, I imagine you're quite careful about who you deploy to or who you end up working with? Do you vet them for similarity, or how do you choose to scale?

Or what decisions are you making while scaling, I suppose.

[01:18:16] Saman Farid: Yeah. We talked a little bit about the task types that we are comfortable with. So the first and most important thing is to validate that the task is a fit for the things that we feel like we can hit 99% uptime on. So there's a list of those things.

[01:18:37] Audrow Nash: Is 99% what you're going for?

[01:18:40] Saman Farid: 99.98 is our,

[01:18:44] Audrow Nash: Ah, that's so exciting.

[01:18:46] Saman Farid: Over the past 12 months, our uptime on all of our deployed systems is,

[01:18:51] Audrow Nash: Oh, hell yeah.

[01:18:52] Saman Farid: 99.85, I think, is the last number that we have.

[01:18:55] Audrow Nash: 99.85. That's awesome.

[01:18:58] Saman Farid: And that's across all of our deployed systems, for the trailing 12 months.

We're tracking that religiously. That's another part where a lot of robotics companies fall short, right? It meets performance targets, and then they have a week of downtime, right? And all of a sudden that number goes out the door.

[01:19:16] Audrow Nash: Yeah.

[01:19:17] Saman Farid: sorry.

You were asking about what kinds of customers we go after. So one, we have to make sure that the tasks fit what we can accomplish. Number two is we do have a specific type of ICP that we're working with today. Typically

[01:19:35] Audrow Nash: What's ICP.

[01:19:36] Saman Farid: Ideal customer profile.

Sorry. so

[01:19:38] Working with Small and Medium-sized Manufacturers

[01:19:38] Saman Farid: Our ideal customer profile today is small to medium sized manufacturers. So between a hundred and a thousand employees in the facility is a good fit for us. We're not selling to the Fortune 50s of the world, we're selling to all of their suppliers. And we have a very specific reason for that.

Primarily it's about speed of decision making. But all that to say, we qualify from a business perspective, we qualify from a technical perspective, and then we also qualify from an operational perspective, right? How many hours a day do they run their plant today? If they increased that, would it be valuable to them or not?

Like we have to validate all the business side of things too.

[01:20:21] Audrow Nash: So you look for a good opportunity, in a sense, with the company. If whatever they're doing is already super efficient and they probably wouldn't benefit that much from this, then you don't go for them, or they wouldn't be as high a priority to deploy to. Okay. What do you think your growth curve will look like over the next year, two years, five years, these kinds of things?

[01:20:51] Saman Farid: Yeah, I think we're growing probably three to four x a year right now.

[01:20:59] Audrow Nash: That's

[01:21:00] Saman Farid: our goal is to improve

[01:21:02] Audrow Nash: Keep it doing that.

[01:21:03] Saman Farid: go,

[01:21:04] Audrow Nash: Improve that. Hell yeah.

[01:21:05] Saman Farid: Yeah, we wanna get higher than that.

[01:21:07] Audrow Nash: do you

[01:21:09] Saman Farid: The rate of growth is increasing because our sales team continues to sell new systems to new customers.

But one of the areas of a lot of growth for us is in our existing customers and the deployed systems that we have. Each of those opportunities starts to triple or quadruple as they expand that solution to multiple lines and then move upstream or downstream in that same production line to do new tasks.

[01:21:35] Audrow Nash: That's super cool. Yeah. The land and expand is a really powerful thing. It's interesting to me that you're going for smaller companies, like a hundred to a thousand people, 'cause I bet the decision making is much faster. So that's probably better for you guys.

[01:21:52] Saman Farid: Yeah,

[01:21:53] Audrow Nash: where instead of like really long sales cycles for these large companies, you get to ship much faster after you sell them.

[01:22:03] Saman Farid: the other thing is that these smaller companies don't have innovation centers, right? So there's no,

[01:22:10] Audrow Nash: Oh, you don't wanna be pigeonholed

[01:22:11] Saman Farid: No, it's less that and more that we need people who are actively concerned with production outcomes, not innovation theater.

And I think a lot of the bigger companies have teams of people that do innovation theater, where they say, oh, let's do a Silicon Valley tour, and let's have a facility where we demo all this stuff and show off to our executives, and we have an earnings call next week and we're gonna talk about AI.

Most of our customers don't do any of that stuff. It's, I produced 200 parts yesterday, I need to produce 210 parts today, otherwise we're out of business. And so we're focused on customers that are much more practical in their worldview.

[01:22:52] Audrow Nash: Oh, hell yeah. That's great. That's so funny, the innovation hubs or innovation facilities or whatever it is that they have. So silly. But yeah, they're just looking like they're being innovative, not really being that innovative.

[01:23:11] Saman Farid: Yeah.

[01:23:12] Audrow Nash: So if you grow three times this year, you'll have, do you think you'll have 300?

facilities deployed in a year, or is it,

[01:23:21] Saman Farid: we may

[01:23:22] Audrow Nash: might have three times the number of robots

[01:23:24] Saman Farid: Yeah. I think it'll be

[01:23:25] Audrow Nash: land and

[01:23:26] Saman Farid: Exactly. Yeah. We may not be in three times the number of facilities, but we should have roughly three times the number of robots.

Yeah.

[01:23:35] Audrow Nash: Let's see. Looking forward, what are some of the hard challenges that you guys are still gonna be going through? Like, where are the risks for you guys?

[01:23:50] Saman Farid: I think that there are a lot of areas where we could be doing better. The first area is the kinds of tasks that we're trying to automate. As we expand that list, the level of complexity goes up more and more, right? Each incremental task requires us to change everything in our process, right? What are the questions that we ask upfront? What are the site evaluation things that need to be validated? What are the kinds of robots that are available for that task? What are the post-deployment considerations? What are the training videos that we need to create, et cetera?

So every new task has a bunch of work associated with it. We're getting better and faster at doing that. But our customers expect us to be able to solve all problems. When we show up and we deploy one robot, typically the immediate next thing that happens is, oh, while you're here, can you do this?

And that, and the other thing. And it's not quite that simple yet, but one of our goals is to be able to solve as many problems as possible for our existing customers.

[01:24:58] Audrow Nash: How do you stay focused with that? Because it's a hard thing. 'Cause I know a lot of times when people talk to customers, those customers don't really have a good model of what robotics can do, and so they may suggest outlandish things. And I suppose you probably wouldn't take on any of the super difficult robotics things, which might even be specific to just them.

So super hard and very low value. But how do you prioritize the things that customers may want?

[01:25:31] Saman Farid: Yeah. So we do a census every once in a while on all the things that people have asked us to do that we've said no to. And we're

[01:25:44] Audrow Nash: you look for low hanging fruit

[01:25:45] Saman Farid: Yeah. We say, okay, if we could provide this type of case packer, here are 20 customers that have already asked for it, and here are 10 more

[01:25:54] Audrow Nash: Oh.

[01:25:55] Saman Farid: we could offer it to.

And so this is a worthwhile opportunity for us.
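A tiny sketch of that kind of census, tallying declined requests by task type to see which new capability would unlock the most existing demand; the request log is invented for illustration:

```python
from collections import Counter

# Hypothetical log of declined requests: (customer, requested_task_type).
declined = [
    ("Acme Foods", "case_packing"),
    ("Baker Metals", "machine_tending"),
    ("Crown Plastics", "case_packing"),
    ("Acme Foods", "depalletizing"),
]

# Count distinct customers per task type to rank which capability to build next.
demand = Counter(task for _, task in set(declined))
print(demand.most_common())
# e.g. [('case_packing', 2), ('machine_tending', 1), ('depalletizing', 1)]
```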

[01:26:00] Audrow Nash: Cool. I really like that way of prioritizing and keeping track of the things you can expand to next.

[01:26:10] Saman Farid: Yeah. Because what we're not trying to do is go and sell to new customer types. The focus is, who are we currently successfully selling to, and what are the additional needs that they have that we haven't been able to meet? And so this is where the site scan, for example, comes in really handy.

'Cause we have a 3D LIDAR scan of every facility that we've walked. And we don't just scan that one part, right? We scan the whole facility. So we can go back in time and say, what were they doing upstream of this robot, or downstream?

We have that full scan on file.

[01:26:48] Audrow Nash: Yeah, that's nice, 'cause then you don't have to get the data multiple times, and that makes things a bit more efficient. Okay. Hell yeah. Any other points of risk to you? Improving your new-task process, I suppose, is something, but anything else that you view as a large challenge in the future?

[01:27:13] Saman Farid: I would say another big one is, like you mentioned earlier, scaling up the service and maintenance footprint. That clearly has a lot of operational complexity associated with it, right? We need to have technicians across the country, and outside the country at some point. We need to have spare parts available.

We need to have density in different regions to support those maintenance technicians. We need to have better remote error resolution tools. Each of those things has a lot of operational questions associated with it. Luckily we have a fantastic COO who's much better at those operational questions than I am,

and we're making great progress towards them.

[01:28:01] Audrow Nash: Hell yeah.

[01:28:04] What are Most Robotics Companies Doing Wrong?

[01:28:04] Audrow Nash: So I have the feeling that you have some interesting opinions about what a bunch of robotics companies are doing wrong, in terms of how they go to market or what they're prioritizing. One that we've already covered is focusing too much on getting the robot to do the thing and not enough on how to configure the robot for different customers.

But if you could reach the robotics community and tell them to focus on something or do something differently, what kind of advice do you have in general?

[01:28:46] Saman Farid: I think one of them you nailed, right? Which is, we need to make sure that the people who are designing new robots to do new tasks are very focused on the practical, immediately necessary tasks, right? I think a lot of people get obsessed with the problem.

They're like, what if we had a robot that could pick up a fabric and do these very complicated motions? And there's actually not as big of a market as they might think for that. And so this is the challenge of engineers being engineers: we love to solve technical problems.

And we love to solve really hard engineering problems. But a lot of the time in robotics, that's not where the bottleneck is, right? The bottleneck is in all the boring stuff that we talked about. The other thing I would say is that a lot of robotics companies don't think about this uptime metric that you and I talked about, right?

We have a target around 99.9% uptime. That is something that far too few robotics companies think about on a daily basis. So they'll ship products with all kinds of bugs, and they'll realize that the robot goes down all the time in the field, and it's infuriating.

But the reality is, you have to solve all of the little mechanical things: oh, this screw rattles and then eventually falls out, or my suction cup breaks after 2,000 cycles, or the facility air pressure fluctuates more than I expected and my vacuum tool suddenly loses pressure and my system goes down every once in a while.

These are all very common occurrences, and if you want to have a robust solution that's actually useful for a manufacturer or any kind of end user, you need to account for all that stuff. You can't come back and complain and say, you provided me varying air pressure and so my system was down.

It just doesn't fly in a manufacturing facility. You need to be bulletproof.

[01:31:02] Audrow Nash: Hell yeah.

[01:31:04] Thoughts on the US Manufacturing Sector

[01:31:04] Audrow Nash: And then this is a little off topic, or maybe not exactly, not robots, but what is your read on the US manufacturing system overall? How can people help or get involved? Where are we headed, and how can we make it better?

[01:31:28] Saman Farid: Yeah. I think that the American manufacturing world is facing a crucible moment, right? Right now it's do or die. I think American manufacturing has stagnated for the last 20 years, and it's gotten less and less capable. And that is a function of fewer and fewer skilled people coming into the industry, less and less labor available to do production, and the constant offshoring that's been happening, right?

People are moving production to other regions, because that's where they can meet the performance specs that they need to. I think that the manufacturing industry needs, number one, skilled people to come and join it, right? In whatever fashion, whether that's engineers, whether that's operations people, whether that's deployment capabilities.

There's a beauty and a romance to making things that I think is just not a part of our culture that much today, and I think that needs to change. So I'm very enthusiastic about any efforts to improve that, like your podcast. Number two, I think utilization needs to be top of mind. There's this fallacy that we hear a lot in American manufacturing, which is, let's let other people do the low value stuff and we'll just do the high value stuff, right? Let's let other people make all the car parts and we'll make the aircraft parts. Let's let other people make all the plastic parts and we'll make the high-end medical devices, right?

We only want the high margin things, right? There's this sense of superiority. The problem is, a manufacturing industry does not exist if you can't make the low end stuff too, right? The people who are good at making the low end stuff are very likely gonna be the people who are good at making the high end stuff.

If you look at China's evolution: today, you cannot make an iPhone in the US, no matter how much money you put towards that problem. You cannot make

the most advanced chips in the US, no matter how much money you put towards the problem. It's not because we don't have great engineers, but if you don't have the whole ecosystem of low-end manufacturing, mid-end manufacturing, and high-end manufacturing, you end up missing all of the building blocks for that high-end manufacturing.

So I think it's this logical fallacy that we've told ourselves in America, that we're focusing on the high-end, high-value stuff. It's bullshit. If you can't make a cheap component, what makes you think you can make an expensive component? If you can't make a simple component, what makes you think you can make a complex component?

The American manufacturing industry just needs to be able to support the full spectrum of things. And I'm very enthusiastic about things that are helping small and medium sized manufacturers, 'cause that's really where the majority of manufacturing happens in the US today.


[01:34:38] Audrow Nash: Yeah. Now, if our listeners and watchers take away only one thing from this episode, what do you want it to be?

[01:34:48] Saman Farid: I would say the number one thing that people who are interested in American manufacturing need to spend a lot of time thinking about is how to increase the utilization of our existing manufacturing base. We currently have in America 300,000-plus manufacturing facilities that are churning out all kinds of products on a daily basis.

And the average utilization of these facilities is so low. So I would say, whether it's an aspiring entrepreneur, a business owner, or a manufacturer, to the extent that we can get more utilization from our existing base, there's so much potential there.

And I think that really is the path to the promised land, right? Of having a really strong, robust manufacturing base that creates a world of abundance. It comes from: we already have a lot of these assets, let's get the most out of them.

[01:35:45] Audrow Nash: Hell yeah. Love it. All right. I think we'll end with that.

[01:35:51] Saman Farid: Awesome.

[01:35:52] Audrow Nash: Great having you.

[01:35:53] Saman Farid: Thanks so much for having me, Audrow.

[01:35:55] Audrow Nash: All right. Bye everyone.

[01:35:57] Audrow Nash: What an insightful conversation with Saman Farid from Formic. It's fascinating to see how they're revolutionizing the accessibility of robotics in American manufacturing. A few things really stood out to me.

First, their focus on increasing utilization of existing manufacturing facilities. It's eye-opening to learn that many American factories are running far below their capacity.

Second, their commitment to solving real-world problems. By deploying robots only in actual production environments and tracking usage hours, they're ensuring their technology makes a tangible impact.

And finally, their emphasis on the importance of maintaining and servicing robots after deployment. It's not just about selling robots, it's about keeping them running efficiently long term.

As we wrap up, I wanna leave you with a question to ponder. How might increased automation and better utilization of existing facilities change the landscape of manufacturing in your local community?

Thanks for listening and I'll catch you in the next episode.