Stock Talk@stocktalkweekly
The on-device inference stock I shared with our community members on Wednesday was Synaptics $SYNA
This is the thesis I shared:
In my view, one of the biggest themes of this year will be Edge Compute & On-Device Inference.
Currently, the only position in my portfolio that addresses this theme is $OSS (opened @ $4.71 on 11/26/25, before the crowd knew about it), which is a core position.
However, I would like to layer exposure here with a few more stocks.
One of those stocks is Synaptics $SYNA
You should think of an “AI-capable device” as having three big layers:
1. Sensors & inputs (camera, microphones, touch, motion sensors)
2. Local compute (the chips that run the model and decide what’s happening)
3. Connectivity (Wi‑Fi/Bluetooth/etc. to talk to phones, routers, or other devices)
SYNA has products that map to (2) and (3) directly, and it markets itself as providing an integrated edge stack (compute + connectivity + multimodal support)
SYNA’s SL2610 family is described as integrating:
- General compute (Arm CPU cores)
- A GPU for graphics and vision workloads
- An AI acceleration subsystem (Synaptics' Torq platform)
In layman’s terms: it’s designed so an OEM can build a device that sees/hears something, runs an AI model locally, and reacts, without needing a datacenter or offsite server
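That “sees/hears something, runs a model locally, reacts” loop can be sketched in a few lines. Everything below is a hypothetical stand-in (a threshold classifier instead of a real neural net), not Synaptics or Google API code:

```python
# Minimal sketch of the sense -> infer -> act loop an edge SoC is built around.
# All functions are hypothetical stand-ins, not real Synaptics/Coral APIs.

def read_sensor():
    # Stand-in for a camera/mic/motion input; returns a confidence-like value.
    return 0.87

def run_local_model(x, threshold=0.5):
    # Stand-in for the NPU-accelerated model: a trivial threshold classifier.
    return "person_detected" if x > threshold else "idle"

def act(label):
    # The device reacts locally -- no round trip to a datacenter.
    return {"person_detected": "turn_on_display", "idle": "stay_asleep"}[label]

action = act(run_local_model(read_sensor()))
print(action)  # -> turn_on_display
```

The point of the sketch: every step runs on the device itself, which is exactly the layer SL2610-class silicon targets.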
GOOGLE PARTNERSHIP
On January 2, 2025, Synaptics announced it was collaborating with Google on Edge AI for IoT to define “optimal implementation of multimodal processing for context-aware computing,” integrating Google’s MLIR-compliant ML core on Synaptics Astra hardware with open-source software/tools.
In October 2025, Synaptics announced the SL2610 line and stated that Torq delivers the first production deployment of Google’s RISC‑V-based Coral NPU with dynamic operator support, using an open-source IREE/MLIR compiler/runtime approach.
Google’s own developer documentation calls Synaptics its first strategic silicon partner for Coral NPU and says SL2610 features the first production implementation of Google’s open-source Coral NPU ML core
THE 'ASTRA' PROGRAM
Astra is not a single chip. It’s Synaptics’ AI-native IoT compute platform, bundling processor families (the SL-series of higher-power MPUs built on Arm Cortex CPUs, and the SR-series of low-power MCUs) with software, development hardware, and connectivity.
Astra is Synaptics’ effort to sell the “whole kit” that makes an IoT device smart on its own: a processor that can run AI locally, the software tools to deploy models, and the dev hardware + wireless pieces to get a product built and shipped faster.
Astra launched in April 2024, but in October 2025 they updated Astra to the next generation purpose-built for edge AI workloads:
- Synaptics announced the Astra SL2600 Series, launching with the SL2610 product line, positioned for multimodal Edge AI and a wide set of IoT endpoints (appliances, automation, charging infrastructure, healthcare, retail POS/scanners, robotics/UAVs, etc.) -- this provides very robust multi-theme exposure
- SL2610 is explicitly tied to the Torq Edge AI platform and Google’s open Coral NPU
The thesis behind on-device compute is that some inference leaves the data center and moves to endpoints. Astra is a direct lever on this idea, because it is literally designed for endpoints that:
- generate raw data (camera/mic/sensor)
- need real-time responses
- can’t depend on network reliability
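Those three constraints amount to a routing decision the device has to make at runtime: run the model locally, or ship the data to a server. A toy sketch of that decision (all names and numbers here are my own illustration, nothing from Synaptics):

```python
# Toy sketch of the local-vs-cloud inference decision an edge endpoint makes.
# local_model / cloud_model are hypothetical stand-ins, not real APIs.

def local_model(frame):
    # Runs on-device (e.g., on an NPU): no network dependency.
    return "result_local"

def cloud_model(frame):
    # Runs in a datacenter: only viable if the link is up and fast enough.
    return "result_cloud"

def infer(frame, network_up, cloud_rtt_ms, latency_budget_ms=50):
    # Real-time endpoints fall back to local inference when the network
    # is down or the round trip would blow the latency budget.
    if not network_up or cloud_rtt_ms > latency_budget_ms:
        return ("local", local_model(frame))
    return ("cloud", cloud_model(frame))

print(infer(b"frame", network_up=False, cloud_rtt_ms=0))   # -> ('local', 'result_local')
print(infer(b"frame", network_up=True, cloud_rtt_ms=20))   # -> ('cloud', 'result_cloud')
```

The more often that first branch fires (unreliable links, tight latency budgets, raw sensor data), the more silicon demand shifts toward endpoint compute like Astra.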
In my opinion, the company trades at a very reasonable valuation: 21x free cash flow, 21x trailing P/E, 15x forward P/E, and roughly 3x sales ($1.1B revenue vs. ~$3B market cap), with a PEG ratio of 0.82 (below 1, which by the usual heuristic suggests the stock is cheap relative to its growth).
Management is guiding for a 25-30% CAGR in its IoT business over the next 4 years, which is massive growth in the key segment for such an undemanding valuation.
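Quick back-of-envelope check that the quoted multiples hang together (the inputs are the figures from this post; the implied growth rate is my own arithmetic, not a company number):

```python
# Sanity-checking the multiples quoted above.
market_cap  = 3.0e9   # ~$3B market cap (from the post)
revenue     = 1.1e9   # ~$1.1B trailing sales (from the post)
trailing_pe = 21      # trailing P/E (from the post)
peg         = 0.82    # PEG ratio (from the post)

ps = market_cap / revenue            # price-to-sales multiple
implied_growth = trailing_pe / peg   # growth rate (%) implied by PEG = P/E / growth

print(round(ps, 2))              # -> 2.73  (i.e., roughly "3x sales")
print(round(implied_growth, 1))  # -> 25.6  (consistent with the 25-30% CAGR guide)
```

So the 0.82 PEG is internally consistent with a growth assumption near the bottom of management's 25-30% guidance range.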