
Amazon’s Smart Glasses: Revolutionizing Package Delivery

At Amazon’s recent “Delivering the Future” event in San Francisco, GeekWire’s Todd Bishop experienced the company’s new smart delivery glasses firsthand during a simulated demo. These high-tech glasses, still in the prototype phase, represent a significant shift in how Amazon delivery drivers interact with technology on their routes. Despite their advanced components and slightly bulky design, Bishop found them surprisingly comfortable and only marginally heavier than regular eyewear. The real magic began when a monochrome green text display appeared in the right lens, showing delivery information including an address and sorting code. This heads-up display occupied just a portion of his field of vision, allowing natural awareness of his surroundings while providing critical delivery details.

The functionality impressed Bishop as he worked through the simulated delivery. When he looked at packages, the glasses automatically recognized and scanned the correct labels, displaying progress with a simple checkmark. The system provided audio alerts about potential hazards (like dogs on properties) and seamlessly transitioned into navigation mode once the packages were scanned, showing a simplified map with the user’s location and delivery destinations. The process eliminated the need for handheld devices entirely – to capture proof of delivery, Bishop simply looked at the package on the doorstep and pressed a button on the small “compute puck” controller attached to his harness. While he initially worried that the display might be distracting, Amazon representatives pointed out that the alternative – constantly looking down at a handheld device – actually creates greater safety risks. Unlike consumer AR devices such as Meta Ray-Bans or Apple Vision Pro, Amazon’s glasses are purely utilitarian, designed specifically for the delivery context.
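
To make the workflow concrete, the sequence Bishop describes can be sketched as a simple state machine: scan each label, surface hazard alerts, switch to navigation, then capture proof of delivery at the door. The Python below is only an illustration of that flow under those assumptions – every type, function, and event name here is hypothetical, not Amazon’s actual software.

```python
# Illustrative sketch of the delivery flow described above, modeled as a
# small state machine. All names are hypothetical, not Amazon's code.
from dataclasses import dataclass, field
from enum import Enum, auto


class Step(Enum):
    SCAN_PACKAGES = auto()
    NAVIGATE = auto()
    PROOF_OF_DELIVERY = auto()
    COMPLETE = auto()


@dataclass
class Stop:
    address: str
    sort_code: str
    package_ids: list[str]
    hazards: list[str] = field(default_factory=list)
    scanned: set[str] = field(default_factory=set)


def advance(stop: Stop, step: Step, event: str, payload: str = "") -> Step:
    """Advance one delivery stop through the workflow based on an event."""
    if step is Step.SCAN_PACKAGES and event == "label_recognized":
        stop.scanned.add(payload)                  # HUD shows a checkmark
        if stop.scanned >= set(stop.package_ids):  # all labels scanned
            for hazard in stop.hazards:
                print(f"audio alert: {hazard}")    # e.g. dog on the property
            return Step.NAVIGATE                   # simplified map appears
    elif step is Step.NAVIGATE and event == "arrived_at_door":
        return Step.PROOF_OF_DELIVERY
    elif step is Step.PROOF_OF_DELIVERY and event == "puck_button_pressed":
        print("capturing proof-of-delivery photo")
        return Step.COMPLETE
    return step


# Example: one stop with two packages and a known hazard (placeholder data).
stop = Stop("123 Main St", "A-7", ["PKG1", "PKG2"], hazards=["dog on property"])
step = Step.SCAN_PACKAGES
events = [("label_recognized", "PKG1"), ("label_recognized", "PKG2"),
          ("arrived_at_door", ""), ("puck_button_pressed", "")]
for event, payload in events:
    step = advance(stop, step, event, payload)
print(step)  # Step.COMPLETE
```

The point of the sketch is simply that the driver never has to reach for a device: each transition is triggered by what the glasses see or by the single button on the puck.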

KC Pangan, an Amazon delivery driver in San Francisco who has been testing the glasses for two months, shared his enthusiasm for the technology. “The best thing about them is being hands-free,” Pangan told Bishop during the event. He explained that the glasses have become so natural that he barely notices wearing them, and that when he occasionally goes back to a handheld device, he finds himself thinking, “Oh, this thing again.” The hands-free design lets him maintain better situational awareness by keeping his eyes up rather than down at a device. That improves safety in several ways, including maintaining the recommended three points of contact when entering or exiting the vehicle and handling packages more easily while opening gates. According to Pangan, the glasses “do practically everything for me” – from taking photos to providing navigation guidance and showing his location relative to his van. While Amazon emphasizes safety and driver experience as the primary goals, early tests suggest efficiency benefits as well, with preliminary data showing up to 30 minutes saved per shift.

These smart glasses emerged from a brainstorming session about five years ago, according to Beryl Tomay, Amazon’s vice president of transportation. During an annual ideation meeting, her team posed a provocative question: What if delivery drivers didn’t have to interact with technology at all? “The moonshot idea we came up with was, what if there was no technology that the driver had to interact with — and they could just follow the physical process of delivering a package from the van to the doorstep?” Tomay explained. This led to experimentation with various approaches before the team settled on glasses as the optimal solution. Though the idea initially seemed implausible, early trials with delivery drivers quickly revealed its potential, with users describing the hands-free experience as “magical.” The project has already been tested with hundreds of drivers across more than a dozen Delivery Service Partners (DSPs), with plans to expand trials in November before determining a timeline for wider deployment. Notably, Amazon states that using the glasses will be entirely optional for both DSPs and their drivers, even after full rollout, and the devices will be provided at no cost.

The technical system extends beyond the glasses themselves. The photochromic lenses darken automatically in bright conditions and can be fitted with prescription inserts. Two cameras support functions like package scanning and proof-of-delivery photos, while a built-in flashlight activates automatically in dim settings. The glasses connect via a magnetic wire to a small “compute puck” worn on a heat-resistant harness, which houses AI models, manages the visual display, and includes an emergency button connecting drivers to Amazon support. A swappable battery on the opposite side balances the system and provides power for a full shift. Connectivity runs through the driver’s Amazon delivery phone via Bluetooth and through the vehicle using Amazon’s “Fleet Edge” platform. This integration allows smart features like automatically activating the display when a van is parked and disabling it when in motion – a deliberate safety measure to prevent distraction while driving. Data collected by the glasses feeds into Amazon’s “Project Wellspring,” which uses AI to improve mapping, identify safe parking spots, locate building entrances, and optimize walking routes, all while implementing privacy measures like blurring personally identifiable information.
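
The parked-only display rule is the most concrete piece of that vehicle integration, and the underlying logic is easy to illustrate. The sketch below assumes hypothetical names (it is not Amazon’s Fleet Edge API); it simply shows a heads-up display that is enabled only when the van is in park and stationary.

```python
# Illustrative sketch of the parked-only display rule described above.
# Names and fields are hypothetical; this is not Amazon's Fleet Edge code.
from dataclasses import dataclass


@dataclass
class VehicleState:
    in_park: bool      # reported gear state from the vehicle
    speed_mph: float   # reported speed from the vehicle


def display_enabled(vehicle: VehicleState, harness_connected: bool) -> bool:
    """Enable the heads-up display only when the van is safely parked."""
    if not harness_connected:           # puck and battery must be attached
        return False
    if not vehicle.in_park:             # gear must be in park
        return False
    return vehicle.speed_mph == 0.0     # and the vehicle not moving


print(display_enabled(VehicleState(in_park=True, speed_mph=0.0), True))    # True
print(display_enabled(VehicleState(in_park=False, speed_mph=25.0), True))  # False
```

However the production system implements it, the design choice is the same one the article describes: the display is a tool for the walk to the doorstep, not for the drive between stops.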

The glasses represent just one aspect of Amazon’s broader investment in delivery innovation and safety. During his visit, Bishop also experienced Amazon’s comprehensive driver training programs, including a slip-and-fall demo that taught proper walking techniques on slippery surfaces, VR training for identifying hidden hazards like pets under vehicles, and a challenging Rivian van simulator called EVOLVE (Enhanced Vehicle Operation Learning Virtual Experience). These training technologies are part of the Integrated Last Mile Driver Academy (iLMDA), currently available at 65 sites with plans to expand to over 95 delivery stations across North America by the end of 2026. While these initiatives show Amazon’s commitment to improving the delivery experience, they also raise questions about the company’s relationship with its DSPs and drivers. Amazon doesn’t directly employ these delivery personnel, instead contracting with ostensibly independent companies that handle hiring and management – an arrangement that has occasionally sparked friction and legal disputes over autonomy and accountability. As Amazon deepens its technological involvement with these partners through innovations like smart glasses, questions about who truly controls the delivery workforce may intensify. Nevertheless, if the glasses live up to their potential, they could represent a significant advancement in both safety and efficiency for the challenging work of package delivery.
