What Happened
On February 23, 2026, Uber Technologies unveiled Uber Autonomous Solutions, a comprehensive suite of services for autonomous mobility and delivery. The launch reflects Uber’s intention to expand its footprint in autonomous operations, both for passengers and goods. The offering is positioned as a global expansion of Uber’s autonomy efforts, potentially impacting how cities and logistics networks adapt to driverless technology (Uber press release via BusinessWire).
On the same day, Wearable Devices Ltd. announced the formation of ai6 Labs, a research and development arm dedicated to neural AI ecosystems that bridge human intent to digital reality. The initiative uses non‑invasive electromyography sensing and mudra‑based innovations, backed by the company’s financial strength following over $20 million in funding secured in 2025 (Wearable Devices press release via GlobeNewswire).
Also on February 23, Guide Labs published Steerling‑8B, an open‑source language model with 8 billion parameters. Unlike typical LLMs, Steerling‑8B is built with a “concept layer” that allows each generated token to be traced back to its training data, aiming to provide the interpretability and control essential for adoption in regulated or scientific contexts (Skynet Countdown).
The Details
Uber’s announcement describes Uber Autonomous Solutions as a “comprehensive suite of unique services” aimed at accelerating autonomous mobility and delivery globally. While the press release does not disclose specific vehicle partners or markets, the branding suggests integration across Uber’s existing ride‑hailing and delivery networks (Uber press release via BusinessWire).
Wearable Devices emphasized that ai6 Labs is rooted in its expertise in touchless sensing wearables and is intended to pioneer a neural AI ecosystem. The initiative is underpinned by technologies including electromyography sensors and mudra detection, with plans to translate users’ intent directly into digital actions. The press release cites the company’s funding momentum from 2025 as a foundation for the lab (Wearable Devices press release via GlobeNewswire).
Guide Labs states that Steerling‑8B uses a traceable token architecture, attributing each output token to its source training data via a “concept layer.” The model achieves roughly 90% of the performance of comparable models while requiring less training data, according to the announcement. It is presented as particularly suitable for domains that demand transparency and monitoring of AI reasoning (Skynet Countdown).
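Guide Labs has not published implementation details, so the following is only a rough sketch of what token‑level attribution through a “concept layer” might look like. All names, data structures, and the routing format are assumptions for illustration; none of this reflects Guide Labs’ actual API or architecture.

```python
# Hypothetical sketch of token-to-training-data attribution.
# Assumed structure: a concept layer maps each concept ID to the
# training sources that contributed to it.
concept_sources = {
    "c_dosage": ["pubmed:12345", "textbook:pharm-ch3"],
    "c_greeting": ["webcrawl:forum-901"],
}

# Assumed output format: each generated token is tagged with the
# concept it was routed through during generation.
generation = [
    ("Take", "c_greeting"),
    ("200mg", "c_dosage"),
    ("daily", "c_dosage"),
]

def trace(tokens, sources):
    """Return, for each token, the training sources behind its concept."""
    return {tok: sources[concept] for tok, concept in tokens}

attribution = trace(generation, concept_sources)
print(attribution["200mg"])  # training sources that informed this token
```

The appeal of such a design for regulated domains is visible even in this toy form: an auditor could ask, for any individual token in an output, which training material stood behind it.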
Background
Uber has long invested in autonomous vehicles, previously developing Uber ATG and partnering with AV developers. This launch represents a consolidation and rebranding of its autonomy strategy, moving from experimental projects to service offerings.
Wearable Devices operates in the niche of sensor‑based, AI‑enhanced wearables. The launch of ai6 Labs suggests a shift from product focus to building foundational AI ecosystems that could support future wearable interaction paradigms.
Key Facts
These developments reflect broader trends across tech and AI adoption:
- Uber’s new suite aims to integrate autonomous delivery and passenger services under one umbrella.
- Wearable Devices is leveraging over $20 million raised in 2025 to fuel ai6 Labs.
- ai6 Labs centers on non‑invasive neural interface technologies.
- Guide Labs’ Steerling‑8B has 8 billion parameters and open‑source licensing.
- The model’s traceable architecture addresses interpretability challenges.
- Steerling‑8B reportedly achieves 90% of comparable model performance with less data.
What It Means
Uber’s launch may reshape urban mobility and last‑mile logistics if autonomous systems scale effectively. Cities, regulators, and competitors will have to reassess infrastructure and safety frameworks. Yet success hinges on regulatory approval, safety validation, and cost efficiency.
For Wearable Devices, ai6 Labs signals a push toward more intuitive, intent‑driven interfaces. If effective, such technology could change how users interact with devices, particularly in accessibility and lifelogging contexts, but human factors and accuracy in intent interpretation remain open questions.
Guide Labs’ release of an interpretable LLM addresses growing concern over opaque AI decision‑making. Regulators and industries like healthcare and finance may favor models that can trace outputs to data. Yet the trade‑off between interpretability and capability remains to be tested in real‑world applications.
What Comes Next
Uber is likely to pilot Uber Autonomous Solutions in select cities. Observers will watch for early deployments, regulatory responses, and how the service integrates with Uber’s existing offerings.
Wearable Devices may follow the launch with prototype demonstrations or collaborations to showcase ai6 Labs’ capabilities. Watch for publications, wearable prototypes, or developer tools. Similarly, Guide Labs’ Steerling‑8B may attract use in regulated contexts; researchers and developers may test its traceability claims and evaluate performance trade‑offs.