A day in the life for an AR-enabled plant operator in 2028 – what will it look like?

Today, plant floors range from massive steel plants with operators’ faces dark from grime, to electronics factories that are cleaner than an operating room. From food to fabric, from car parts to rolling mills, the range is vast.

Still, let’s look at one theoretical day in the life of an AR-enabled operator, just a few years into the future. We’ll call him Jeff.

How it used to be for Jeff, way back in 2018

7:56 Monday morning, Jeff arrives for the start of his shift. Now what?

Over his career, Jeff has received work instructions verbally, through scribbled notes, on a printed list, and via a spreadsheet.

More recently it has been through a screen. More recently still, as the plant has increasingly automated, many screens.

For Jeff the work instructions were typically specific – e.g. open valves, start motors – but for others at different plants it might be more task-oriented – e.g. make 3000 pounds of this, then 1000 pounds of something else.

Along with the increasing number of screens, Jeff and his co-workers have experienced an increasing number of mice, touchpads and trackballs – more and more to look at and monitor. Way back in 2018, it wasn’t uncommon to see an operator in a control room monitoring as many as 14 or 15 monitors.

Throughout his day, Jeff moves his eyes, hands, attention and activity from one screen to the next to the next, connecting the dots to perform the work.

Jeff’s plant makes toothpaste, so the environment is relatively clean, but for less pristine industries, such as steel, operators typically wear a pair of work-grimed gloves to touch-pad their way across a set of dirt-smeared screens. As currently designed, in many environments, computer screens and heavy manufacturing don’t mesh too well.

Start of Jeff’s day, just a few years from now

Now the start of Jeff’s day is quite different.

Jeff’s safety glasses have a high-resolution screen in the upper corner of one lens, which gives him guidance and instruction all day long. Rather than look to several screens for briefing and instructions, he has a single, integrated interface. He can always see it in the corner of his vision as he does his work, but it does not stop him from focusing elsewhere, or from executing any work-related action.

[While many AR scenarios presume operators carrying a tablet with instructions and overlays, the need to actually carry it is a problem, tying up at least one hand. Plus, there’s the Pokemon Go problem – walking into pals or pillars while staring at a screen. Sub-optimal on the manufacturing floor.]

Jeff no longer has to transfer his attention from screen to screen, machine to machine. His work instructions are always there, always up to date – clear and bright, right in front of his eyes.

Next steps

Now Jeff’s glasses have their own mic; he talks to his glasses and the glasses talk back. Data becomes conversation. And not just the click-response of old desktop days – actual conversation.

Better yet, the glasses start to take ownership, responsibility. Now Jeff and the glasses work together to get the job done:

Glasses [greeting Jeff at the start of his day]: Hey Jeff, what’s up?

Jeff: Just starting my shift.

Glasses: What equipment are you running?

Jeff: Packaging Lines 3 and 4.

Glasses: Okay, here’s what’s been going on in the past two hours in your area, what you need to know […]

Jeff: Okay, Glasses, first question. Do I need to change any machine settings right now? Also, can you order me some more cartons for Line 3?

Glasses: Yes. You need to change the torque setting on Line 4 to 8.6. Also, note that this run finishes in 20 minutes, and you’ll need to … [Later, a self-guided vehicle will deliver the Line 3 cartons to Jeff’s station].
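To make the exchange above a little more concrete, here is a minimal sketch of how the glasses’ assistant might answer Jeff’s settings question: query the plant’s MES or historian for recommended setting changes on his lines, then turn the result into a spoken reply. The endpoint URL, field names and line IDs are assumptions for illustration only, not a real product API.

```python
# Hypothetical sketch: turning "Do I need to change any machine settings?"
# into a plant-data query and a conversational answer. The MES endpoint,
# JSON fields and line IDs are illustrative assumptions.
import requests

MES_URL = "https://plant-mes.example.com/api/v1"  # assumed endpoint

def pending_setting_changes(operator_lines):
    """Collect any recommended machine-setting changes for the operator's lines."""
    changes = []
    for line_id in operator_lines:
        resp = requests.get(
            f"{MES_URL}/lines/{line_id}/recommended-settings", timeout=5
        )
        resp.raise_for_status()
        for item in resp.json().get("changes", []):
            changes.append(
                f"Change {item['setting']} on {line_id} to {item['target_value']}."
            )
    return changes

def answer(operator_lines):
    """Compose the reply the glasses would speak aloud."""
    changes = pending_setting_changes(operator_lines)
    if not changes:
        return "No. All settings on your lines are within spec."
    return "Yes. " + " ".join(changes)

if __name__ == "__main__":
    print(answer(["line-3", "line-4"]))  # e.g. "Yes. Change torque on line-4 to 8.6."
```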

A bit further into the future

Now Glasses gets a super-lightweight camera [note: all the technology for what we describe in this blog already exists].

Jeff: R2D2 has just delivered the Line 3 cartons – here’s the barcode, Glasses [taking a pic with the lightweight camera] – is this the right one?

Glasses: Yes. It’s from the new supplier.

Jeff: [calling supervisor Len while also giving Len the camera feed] Len, have I loaded the cartons the right way? They’re from a new provider … they don’t seem to be as thick?

Len: Jeff, see that screw on the left and up a bit? Just tighten it a couple of notches – that’ll increase the tension to compensate for the thinner cartons – and you should be ready to go.
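The carton check Jeff performs above boils down to a simple lookup: read the barcode from the camera frame, find the material it identifies, and compare it against what the schedule expects for that line. The sketch below shows one way that logic could work; the item codes are made up, and the in-memory tables stand in for what would really be an ERP/MES query.

```python
# Hypothetical sketch of the carton check: verify a scanned barcode against
# the material scheduled for the current run. Codes and tables are made up;
# in practice these lookups would hit the plant's ERP/MES.
MATERIAL_MASTER = {
    "0012345678905": {"item": "Carton, 75 ml tube", "supplier": "NewSupplier Inc."},
}

SCHEDULED_MATERIAL = {
    "line-3": "0012345678905",  # carton item scheduled for the current run
}

def verify_carton(line_id, scanned_barcode):
    """Tell the operator whether the delivered cartons match the scheduled item."""
    expected = SCHEDULED_MATERIAL.get(line_id)
    material = MATERIAL_MASTER.get(scanned_barcode)
    if material is None:
        return "I don't recognize that barcode. Hold it a little closer."
    if scanned_barcode == expected:
        return f"Yes. It's the right carton, from {material['supplier']}."
    return "No. That item isn't scheduled for this line. Check with your supervisor."

print(verify_carton("line-3", "0012345678905"))
```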

What’s holding us back? Are we there yet?

There are a number of issues to be resolved before the above scenario can actually occur. In no particular order, the obstacles to overcome are: prescription glasses that can still do the job, batteries that can last a 12-hour shift, and integrated voice recognition that runs locally in a very small footprint.

The conversation is coming

While our plants have been talking to us for a while, as Charles Horth (then Factora CEO, now Board Chair) described in this post in Industry Week, it’s taken manufacturers a while to understand what they were saying. And we’re still in the process of working out how to talk back.

But that is the vision we see at Factora as the future of AR in manufacturing. The conversation is coming, and it’s just around the corner.