Joe White, chief products & solutions officer, Zebra Technologies, discussed a five-year plan for the company’s expansive industrial workflow portfolio. It spans software, innovations in robotics, machine vision, automation and digital decision-making.

Q&A: Joe White on Advancing Mobile Computing and Digitizing and Automating Operations

Dec. 10, 2024
Joe White has three decades of product development experience. He talks about Zebra Technologies’ adjacent market expansion and the role of AI in modern workflows.

Best known for its rugged label-making and tracking technology such as barcode readers and mobile scanning devices, Zebra Technologies has an expansive industrial workflow portfolio that spans software, innovations in robotics, machine vision, automation and digital decision-making.

If one asks Joe White about the company’s go-to-market platform, he unwaveringly highlights three pillars: asset visibility (giving a digital voice to assets that are operating through the supply chain), connected frontline workers (recognizing the needs of the people) and intelligent automation (automating workflows).

These three areas are the culmination of a 55-year history, dating back to the days when the company operated as Data Specialties Incorporated, a manufacturer of high-speed electromechanical products. A string of acquisitions has followed since, and the company’s ecosystem today extends to 10,000 partners across more than 100 countries, White pointed out when interviewed at Zebra Technologies’ R&D facility in Mississauga, Ontario.

Deeply Embedded Knowledge

As chief products and solutions officer, White leads the adoption of new enterprise technologies, overseeing the strategy, investments and development of Zebra Technologies’ expansive portfolio.

He comes by his role honestly. Though he graduated with a bachelor’s degree in accounting from the University of Maryland, a deep personal interest steered him toward a technology-driven career. Through determination and grit, White honed his technical expertise by attending night school to master programming and by securing roles that fostered growth along the way. He taught himself to program while working as an accounting system analyst at a law binder company. Then, when the Internet was still in its nascent stages, he seized an opportunity at Digex, one of the first web hosting companies in the world, effectively paving the way for his journey into the evolving digital landscape.

READ MORE: How Zebra Technologies Uses Machine Vision to Transform Production Automation

“I studied to become a network engineer and to be able to program routers and networks,” White said. “That was fun, because nobody knew that.”

A move to an internet company that deployed Wi-Fi and internet infrastructure at hotels across the globe served as a bridge to Matrics, whose founders were “really smart people from DARPA” and who built some of the first RFID technology that eventually went to Walmart. “That company was eventually acquired by Symbol Technologies in 2004 and grew up through Motorola,” White recalled. “They said, ‘Hey, Joe, we need help building the silicon for RFID tags.’ I said, ‘I don’t know anything about silicon, but I’m book smart, so I’ll go figure it out.’”

Whether he sought out each opportunity or it found him, the succession of roles laid the foundation for the responsibility he now holds at Zebra Technologies.

Here, White talks to Machine Design about executing the company’s five-year plan, lays out how he approaches an adjacent market strategy, and explains where he stands on the use of AI in Zebra Technologies’ portfolio.

Machine Design: With all the acquisitions over the years, what is the glue that holds the various technologies together?

Joe White: Those three pillars are our five-year plan. That’s how we’re looking at making investments at a portfolio level. When you talk about product portfolio and how I allocate capital and how I invest in the portfolio, we have our core portfolio, which tends to be the anchor of what Zebra does. And by core portfolio, I mean printing, barcode scanning and mobile computing tend to be the anchors. That’s what we’ve been known for over the past 55 years. So that tends to be the anchor point.

But you’ll see we’ve looked at how to leverage that core portfolio, and how to expand—near and adjacent. I look at it through the lens of, “how do I deliver more value to our customers?” And by doing so, “how do I deliver technology and capability out of the portfolio?”

So, when you look at an adjacent market, think about tablets. We were the world’s leader in rugged handheld mobile computers with over 50% share of the global market. And what we heard from our customers around the 2018 timeframe was, “Not only do I want to collect information at the edge, I also want to process it at the edge.”

And where a handheld might be more about collecting data, with a tablet form factor it becomes: How do I give a manufacturing operator real-time visibility into what their machines are doing, and give them that data on enough screen real estate that they can look at the operations and understand what’s working well and what’s not working? So, tablets became a good example of that near adjacency.

We extended that product line into a larger tablet form factor and that’s what makes us No. 1 in tablets today. We’ve grown that portfolio pretty dramatically, and we’re delivering that value for the customer of, “I get a single platform. I only do one security patch across all the different modalities that Zebra delivers. I get OS updates at the same time. Single API when I write my apps. I can deploy on them.” Or, for our partner ecosystem, they can put the right device in the right worker’s hand to drive the productivity they’re after. So, that’s a good example of an adjacency.

Another adjacency would be supplies and smart sensors. We’ve had a long-running supplies business, but we’re looking at expanding it into new categories of sensors and smart supplies. And, not just being an asset tracker, but also giving the condition, the temperature, the environment that you’re seeing in that world. RFID fits in there. How do I place RFID technology and deliver real-time track and trace through my supply chain? Those would be good adjacencies that we’ve invested in over the years.

READ MORE: High-Speed Camera and Barcode Reading Technology Showcased by Cognex Corp.

The third, if you think about the concentric horizons I look at, is the expansion businesses. You see a lot around machine vision, robotic automation and software for the connected frontline worker.

Those are opportunities where we can deliver expanded value to our customers and grow. That’s why, when I look at intelligent automation, it’s really a growth pillar for the longer term. But that’s how we think about portfolio investments as a whole.

MD: I wonder if you can expand on your “adjacent market” approach by walking me through one product and how it is being applied in warehousing and distribution, manufacturing, throughput or healthcare verticals?

JW: Historically, we’ve always played in the supply chain side of manufacturing. So, as I receive components coming in the dock door, they’re typically labeled with Zebra labels. They’re scanned with either a mobile computer or a handheld scanner. And then they go into the factory line and the assembly. If you look at how we’ve added technology through that journey, how do I get replenishment into my factory lines? Well, I could use an AMR (autonomous mobile robot) to be able to take goods to the assembly line and prevent people from walking away from factory lines in real time. This is how we’re deploying some of the robot technology.

When you think about how to get real-time quality inspection: We had an investor day a couple of months ago, and I presented a good example of the journey of a sneaker through its lifecycle, all the way from factory floor to being delivered to your doorstep. And Zebra interacts with that product about 30 times during that journey. It really starts in that factory environment of receiving product but also doing 3D inspection to make sure it’s the right product being shipped to the right person. It could be RFID tech for track-and-trace visibility, not only to the delivery point but back through the return point if it ever got returned. So, you can see how in the factory environment we touch the product multiple times as it goes through manufacturing, warehousing and distribution.

We’ve had a long history in warehouse distribution. If you look at our first mobile computer product, the MC9000—which is probably the most iconic rugged mobile computer in the industry—it started in warehouse distribution. And, in fact, it really was a combination of technologies. Consider Wi-Fi technology. We created the Wi-Fi symbol back in the day and gave it to the Wi-Fi Alliance, which owns the critical patents, to enable the mobile computing market and create a market where you can mobilize compute power at the edge.

This enables receipt of goods in a warehouse environment. You can scan a pallet and know all the products coming into the dock door and where to put them away in the operation. And then we do vehicle-mount computers that sit on the forklifts and take the goods from the dock door to the put-away location and actually place them in a rack.
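For illustration, here is a toy sketch of that receiving step: a pallet scan resolves to the products it carries, and each product maps to a put-away location that a vehicle-mount computer could display to the forklift operator. All identifiers and data are invented for the example and do not represent Zebra’s software.

```python
# Toy sketch of the receiving step described above: a pallet scan resolves to
# the products it carries, and each product maps to a put-away rack location
# that a vehicle-mount computer could display to the forklift operator.
# All identifiers and data are invented for illustration.
PALLET_CONTENTS = {"PLT-00042": ["SKU-1001", "SKU-2040"]}
PUTAWAY_SLOTS = {"SKU-1001": "Rack A-03, Level 2", "SKU-2040": "Rack C-11, Level 1"}

def receive_pallet(pallet_barcode: str) -> dict[str, str]:
    """Return a put-away plan: each product on the pallet and where it goes."""
    skus = PALLET_CONTENTS.get(pallet_barcode, [])
    return {sku: PUTAWAY_SLOTS.get(sku, "staging area") for sku in skus}

print(receive_pallet("PLT-00042"))
# {'SKU-1001': 'Rack A-03, Level 2', 'SKU-2040': 'Rack C-11, Level 1'}
```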

We do picking and fulfillment in the warehouse environment. Our wearable technologies—we had the first wearable computer in the market, and our WT6400 is our latest one. That product with ring scanners is used universally across warehouses for picking and fulfillment operations, where if you have an order for company XYZ, it enables a worker to grab each item from all the different aisles and put it together on a pallet to ship to them. And that’s often done using our technologies.

And then there’s some of the latest work: We’re also using our machine vision capability and technologies to dimension those pallets and to understand what’s being shipped out the door. Then, after you’ve received it, you’re now staging it. You might be cross-docking it. You might be taking it out to the dock door to ship to an end customer, or to a healthcare environment, where you’re delivering products and goods.

In a healthcare environment, this is actually a good growth category for us overall, as we mobilize nurse clinicians to do patient-care work within a hospital. So, as a patient, when you check into a hospital, what is the first thing they do for you? They give you a wristband. That’s a Zebra-printed wristband. It enables track-and-trace visibility through your entire journey at that hospital. When a nurse comes to your bedside to administer medications, what does she do? She takes our HC50 product, scans your wristband and verifies that you’re the right patient. She scans the medicine and verifies that the right medicine at the right dosage gets administered to the patient. That’s what we do around patient care.
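To make that bedside check concrete, here is a minimal sketch of the scan-and-verify logic White describes: match a wristband scan and a medication scan against the order before anything is administered. The data model and function names are hypothetical illustrations, not Zebra’s actual software.

```python
# Minimal sketch of the bedside scan-and-verify step: scan the patient's
# wristband, scan the medication, and confirm the right drug and dose before
# administering. MedOrder and verify_administration are hypothetical names,
# not Zebra's software.
from dataclasses import dataclass

@dataclass
class MedOrder:
    patient_id: str   # ID encoded on the printed wristband
    ndc_code: str     # drug identifier encoded in the medication barcode
    dose_mg: float    # ordered dose

def verify_administration(order: MedOrder,
                          wristband_scan: str,
                          med_scan: str,
                          dose_mg: float) -> list[str]:
    """Return a list of problems; an empty list means it is safe to proceed."""
    problems = []
    if wristband_scan != order.patient_id:
        problems.append("wrong patient")
    if med_scan != order.ndc_code:
        problems.append("wrong medication")
    if abs(dose_mg - order.dose_mg) > 1e-6:
        problems.append("wrong dose")
    return problems

# Example: the nurse scans wristband "PAT-1042" and medication "0071-0155-23"
order = MedOrder(patient_id="PAT-1042", ndc_code="0071-0155-23", dose_mg=500.0)
issues = verify_administration(order, "PAT-1042", "0071-0155-23", 500.0)
print("OK to administer" if not issues else f"Hold: {', '.join(issues)}")
```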

Zebra really plays a critical role, not just from manufacturing to your home but from manufacturing all the way to point of care. We play a critical role. I often tell people, “We’re the technology that you see every day, that you don’t know who it’s from.”

I presented a good example of the journey of a sneaker through its lifecycle, all the way from factory floor to being delivered to your doorstep. And Zebra interacts with that product about 30 times during that journey.

- Joe White, Chief Products and Solutions Officer, Zebra Technologies

MD: We cannot have a conversation these days without asking about artificial intelligence. You’ve got the hardware and the software, and you’ve got the deep-learning piece as well. What is AI’s place in Zebra Technologies?

JW: There’s artificial intelligence, and then there’s generative AI. The way I like to think about them, very simply, is that AI works on a very structured data set. In a manufacturing environment, you’re going to be more likely to use AI because you need 99.99% accuracy through that process. You can’t have dynamic learning algorithms learning on the fly. It’s a very structured environment. You want to repeatedly and accurately identify that part and confirm it’s within millimeter tolerance. You can’t afford to deviate over time.

That would be AI, very structured.

Generative AI is more of a deep-learning algorithm where you’re taking unstructured data and creating intelligence out of that unstructured data. When it comes to things like the connected frontline worker, generative AI can be a very powerful tool to make your newest employee operate at the level of your oldest or most senior employee, in a world where labor can be challenging and getting a trusted expert on site quickly is hard. The ability to train generative AI models that can deliver guidance and capability to your employee at the moment they’re in front of the machine that’s down is a powerful tool.

AI is something that Zebra has had for a very long time. This is not new. If you look at our machine vision business, even our scanning business, or our wireless LAN in our mobile computers, we’ve been using AI algorithms for a long time to deliver reliability and productivity in those environments, especially around quality inspection with the machine vision.

I think the buzzword today is generative AI, with OpenAI’s ChatGPT and its Microsoft partnership, Google with Gemini Nano, or Meta with Llama. These provide a whole different spectrum of capability, and they all operate a little bit differently. Where ChatGPT is, “I know all things about everything,” Llama is promoting, “Let’s train a model that represents Joe White, can behave like Joe White, think like Joe White, answer questions like Joe White.”

We don’t care about answering for all of Zebra. We just want to do that one thing. I think there’s a lot of opportunity, especially when you think about connected frontline workers and how we can do that.

Last November [2023], we announced with Qualcomm that we have our first generative AI large language models running on our mobile computers. It’s a little different from some of the other vendors who are running in the cloud, which can often be associated with high cloud costs. We’re actually running natively, locally on the mobile computer.

READ MORE: How Deep Learning Complements Machine Vision Solutions

Think of your factory environments where Wi-Fi connectivity may not be as pervasive and where there may be coverage holes. We can run autonomously without relying on outside connectivity. And so that’s been a vision we advanced going into NRF, the National Retail Federation show, where we demonstrated how to take a store associate and train him with everything he needs to know about the products that I’m selling in that environment.

But that could just as easily be a manufacturing environment, with a new OT engineer out on the floor. How do I give him all the information about the factory line so that he can start to troubleshoot in real time in the field? All that can be trained into a model. Take standard operating procedures, train the model and then, in natural language, you can ask the model: “What do I do if this machine goes down? What is the standard operating procedure? How do I report on that?”

And it will tell you in real time: This is what to do. When you think about the manufacturing environment or healthcare, these are critical moments to mobilize employees with technology and to give them the power to take action at the moment that matters.
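As an illustration of the pattern White outlines, the sketch below shows one plausible way the pieces could fit together: index a few standard operating procedures, retrieve the one relevant to a technician’s natural-language question and pass it as context to a locally running language model. The SOP text, the ask_local_model() stub and all names are hypothetical; this is not Zebra’s implementation.

```python
# Illustrative sketch of the workflow described above: index standard
# operating procedures, retrieve the relevant one for a natural-language
# question, and hand it to a locally running language model as context.
# The SOP text, ask_local_model() stub and all names are hypothetical.
SOPS = {
    "conveyor_jam": "SOP-017: Stop the line, lock out/tag out the drive, "
                    "clear the jam, inspect the belt, then restart and log the event.",
    "label_printer_fault": "SOP-031: Check media and ribbon, run a calibration, "
                           "and escalate to maintenance if the fault persists.",
}

def retrieve_sop(question: str) -> str:
    """Pick the SOP whose key terms overlap most with the question (toy scoring)."""
    q_words = set(question.lower().split())
    scored = {key: len(q_words & set(key.split("_"))) for key in SOPS}
    best = max(scored, key=scored.get)
    return SOPS[best]

def ask_local_model(prompt: str) -> str:
    # Stand-in for an on-device LLM call (a model running natively on the
    # mobile computer, as described above). Here it simply echoes the context.
    return prompt.splitlines()[-1]

question = "What do I do if the conveyor jams and the machine goes down?"
context = retrieve_sop(question)
answer = ask_local_model(f"Answer using this procedure.\nQ: {question}\n{context}")
print(answer)
```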

About the Author

Rehana Begg | Editor-in-Chief, Machine Design

As Machine Design’s content lead, Rehana Begg is tasked with elevating the voice of the design and multi-disciplinary engineer in the face of digital transformation and engineering innovation. Begg has more than 24 years of editorial experience and has spent the past decade in the trenches of industrial manufacturing, focusing on new technologies, manufacturing innovation and business. Her B2B career has taken her from corporate boardrooms to plant floors and underground mining stopes, covering everything from automation & IIoT, robotics, mechanical design and additive manufacturing to plant operations, maintenance, reliability and continuous improvement. Begg holds an MBA, a Master of Journalism degree, and a BA (Hons.) in Political Science. She is committed to lifelong learning and feeds her passion for innovation in publishing, transparent science and clear communication by attending relevant conferences and seminars/workshops. 

Follow Rehana Begg via the following social media handles:

X: @rehanabegg

LinkedIn: @rehanabegg and @MachineDesign
