SusHi Tech 2026: Four domains reshaping hardware and AI
SusHi Tech 2026 focuses on AI, Robotics, Resilience, and Entertainment: expect humanoid demos, autonomous-driving software sessions, cyber and climate deep dives, and creative-AI debates.
Key takeaways
- SusHi Tech 2026 highlights AI, Robotics, Resilience, and Entertainment as operational domains.
- Evaluate demos by technical maturity, integration burden, and commercial fit.
- Operationalise AI/robotics with governance, telemetry, and lifecycle plans.
- Convert resilience and creative-AI insights into measurable product KPIs.

SusHi Tech 2026 has narrowed its program to four practical domains: AI, Robotics, Resilience, and Entertainment. The agenda signals where engineering teams and product leaders should allocate attention, not as hype topics but as operational domains demanding concrete trade-offs.
Attendees will see live humanoid robot demonstrations, sessions on the software transformation inside autonomous driving, substantive coverage of cyber defense and climate tech, and frank discussions about how AI is altering the music and anime value chains. These are not loose themes; they represent touchpoints where product, safety, and commercial strategy intersect.
Why these four domains matter now
The program selection reflects a shift from idea-stage conversations to execution pressures. Each domain raises a distinct set of engineering and go-to-market challenges that every founder and CTO must convert into operational plans.
AI: beyond models, into pipelines and ownership
Sessions on AI at the event emphasize how models are becoming an ingredient, not a product. For operators, that means focusing on data pipelines, model governance, latency and cost controls, and IP ownership. If you ship a feature powered by generative models, your product roadmap must include versioning strategies, observability for hallucinations, and a repeatable retraining cadence.
- Prioritise telemetry: log prompts, outputs, and downstream signals to make model behaviour auditable.
- Define guardrails early: decide who owns content-moderation decisions and how false positives and negatives are surfaced.
- Budget for continuous evaluation: model updates are operational events, not one-off upgrades.
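The telemetry point above can be sketched concretely. The following is a minimal, hypothetical audit log for model calls: the `ModelEvent` and `TelemetryLog` names are illustrative, and a production system would ship events to a real event bus rather than hold them in memory.

```python
import json
import time
import uuid
from dataclasses import dataclass, asdict, field
from typing import Optional

@dataclass
class ModelEvent:
    """One auditable record per model call: prompt, output, and context."""
    model_version: str
    prompt: str
    output: str
    latency_ms: float
    feedback: Optional[str] = None  # downstream signal, e.g. "flagged", "accepted"
    event_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    ts: float = field(default_factory=time.time)

class TelemetryLog:
    """Append-only in-memory log; swap the list for a durable sink in production."""
    def __init__(self):
        self._events: list[ModelEvent] = []

    def record(self, event: ModelEvent) -> str:
        self._events.append(event)
        return event.event_id

    def attach_feedback(self, event_id: str, feedback: str) -> None:
        """Join a downstream signal back onto the original call."""
        for e in self._events:
            if e.event_id == event_id:
                e.feedback = feedback
                return
        raise KeyError(event_id)

    def export(self) -> str:
        """Serialise the log for offline evaluation or audit."""
        return json.dumps([asdict(e) for e in self._events], indent=2)
```

The design choice that matters is the `event_id` handed back to the caller: it lets user feedback arriving minutes or days later be joined to the exact prompt, output, and model version that produced it.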
Robotics: demos are useful, but integration wins
Live humanoid robot demos grab attention, but the operational work lies in integrating perception, motion control, and safety into customer workflows. A demo validates technical feasibility; commercial viability depends on repeatability, maintainability, and lifecycle costs.
- Assess software modularity: is the stack componentised to replace perception or control modules independently?
- Probe edge requirements: what compute, network and power constraints govern deployment?
- Evaluate maintenance strategy: who patches firmware, and how are field failures triaged?
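The modularity test in the first bullet can be made concrete. This is a hypothetical sketch, not any vendor's actual stack: if the perception and control layers sit behind stable interfaces like the `Protocol` classes below, either module can be swapped without touching the other.

```python
from typing import Protocol

class Perception(Protocol):
    """Interface for the perception layer: raw sensor frame in, detections out."""
    def detect(self, frame: bytes) -> list[str]: ...

class Controller(Protocol):
    """Interface for the control layer: detections in, action out."""
    def plan(self, detections: list[str]) -> str: ...

class RobotStack:
    """Composes perception and control behind stable interfaces, so either
    module can be replaced independently."""
    def __init__(self, perception: Perception, controller: Controller):
        self.perception = perception
        self.controller = controller

    def step(self, frame: bytes) -> str:
        return self.controller.plan(self.perception.detect(frame))

# Hypothetical stand-in modules, here only to show swappability.
class StubPerception:
    def detect(self, frame: bytes) -> list[str]:
        return ["obstacle"] if frame else []

class StopOnObstacle:
    def plan(self, detections: list[str]) -> str:
        return "stop" if "obstacle" in detections else "advance"
```

When vetting a vendor, the question to ask is whether their stack has seams like these, or whether perception, planning, and safety logic are fused into one binary that only the vendor can modify.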
Resilience: cyber defense and climate tech as product requirements
Resilience at the event combines cyber defense and climate-focused engineering. Both are increasingly product-level constraints: security and environmental risk affect design choices, partner selection, and investor due diligence.
Cyber defense
Cyber sessions focus on hardening supply chains, threat modelling and incident response. For scaling companies, the practical takeaway is to treat security as a feature with measurable SLAs and runbooks — not an afterthought.
- Create threat models for core user flows and prioritise mitigations by impact and exploitability.
- Institutionalise incident playbooks with clear ownership and drill cadence.
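The prioritisation rule in the first bullet reduces to a simple ranking. This is a minimal sketch of one common scheme (risk as impact times exploitability, both on a 1 to 5 scale); the scale and the `Threat` shape are assumptions, not a standard.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    flow: str            # the user flow this threat targets
    impact: int          # 1 (minor) .. 5 (critical)
    exploitability: int  # 1 (hard to exploit) .. 5 (trivial)

    @property
    def risk(self) -> int:
        """Simple multiplicative risk score."""
        return self.impact * self.exploitability

def prioritise(threats: list[Threat]) -> list[Threat]:
    """Highest risk first: the order in which to fund mitigations."""
    return sorted(threats, key=lambda t: t.risk, reverse=True)
```

Even a scoring this crude forces the useful conversation: it makes the team state impact and exploitability explicitly per flow, instead of debating mitigations in the abstract.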
Climate tech
Climate tech conversations are shifting from outputs to integrations: how do emissions, resilience and regulatory risk manifest in product roadmaps? Founders building climate-facing solutions must translate environmental metrics into customer KPIs and measurable ROI.
- Map climate impacts to buyer pain points — e.g., energy savings, compliance, resilience.
- Design instrumentation so customers can measure outcomes, not just inputs.
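Translating inputs into outcomes, as the second bullet asks, can be as small as one function. A hedged sketch: the KPI names and the flat tariff and emissions factors below are illustrative placeholders, not real rates.

```python
def energy_savings_kpi(baseline_kwh: float, measured_kwh: float,
                       tariff_per_kwh: float, kg_co2_per_kwh: float) -> dict:
    """Translate raw consumption into the outcomes a buyer actually tracks:
    money saved, emissions avoided, and percentage reduction."""
    saved_kwh = baseline_kwh - measured_kwh
    return {
        "saved_kwh": saved_kwh,
        "cost_saved": saved_kwh * tariff_per_kwh,
        "co2_avoided_kg": saved_kwh * kg_co2_per_kwh,
        "pct_reduction": 100.0 * saved_kwh / baseline_kwh if baseline_kwh else 0.0,
    }
```

The instrumentation work is in producing trustworthy `baseline_kwh` and `measured_kwh` values; once those exist, the customer-facing KPIs are arithmetic.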
Entertainment: AI is rewriting creative pipelines
Panels on music and anime reflect a broader reality: generative AI is changing creative workflows, rights management, and distribution economics. For operators, this raises immediate product and legal questions.
- Rework content pipelines to include model-assisted creation and human-in-the-loop checkpoints.
- Revisit licensing and metadata systems to capture provenance and attribution for AI-assisted works.
- Build UX that exposes editable provenance to creators and downstream platforms.
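The provenance and attribution point above can be sketched as a minimal record format. This is an illustrative shape, not the C2PA standard or any existing schema: each creation step gets an entry, and the `parent` hash chains an AI-assisted revision back to the work it derived from.

```python
import hashlib
import time
from typing import Optional

def provenance_record(content: bytes, creator: str, tool: str,
                      parent_hash: Optional[str] = None) -> dict:
    """One attribution entry per creation step; parent hashes chain steps."""
    return {
        "content_hash": hashlib.sha256(content).hexdigest(),
        "creator": creator,
        "tool": tool,           # e.g. "human", or the model used for this step
        "parent": parent_hash,  # content hash of the work this step derived from
        "ts": time.time(),
    }
```

A chain of such records is what licensing and metadata systems need to answer the commercial questions: which steps were human, which were model-assisted, and who gets attributed and paid.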
How to treat demos and talks as strategic reconnaissance
Events often produce noise. Convert sessions into strategic inputs by evaluating each demo or panel against three dimensions: technical maturity, integration burden, and commercial fit.
- Technical maturity: Does the demo show repeatable performance in realistic conditions, or just controlled scenarios?
- Integration burden: How much systems work is required to make the demo product-grade (APIs, security, scaling, maintenance)?
- Commercial fit: Is there a clear customer who will pay for the capability, and does pricing reflect total cost of ownership?
Use these axes to triage follow-ups. A flashy humanoid demo with brittle software scores poorly on integration burden even if underlying components are impressive. Conversely, a less glamorous sensor or model that integrates cleanly may warrant a pilot.
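The triage above can be encoded so every demo gets scored the same way. A minimal sketch under assumed conventions: each axis is scored 1 (weak) to 5 (strong), a higher integration score means a lower integration burden, and the threshold of 10 is an arbitrary starting point to tune.

```python
def triage(maturity: int, integration: int, fit: int, threshold: int = 10) -> str:
    """Score a demo on the three axes and return a follow-up decision.
    Scores are 1 (weak) .. 5 (strong); higher integration = lower burden."""
    for name, score in (("maturity", maturity),
                        ("integration", integration),
                        ("fit", fit)):
        if not 1 <= score <= 5:
            raise ValueError(f"{name} must be between 1 and 5")
    if min(maturity, integration, fit) <= 1:
        return "pass"  # one fatal weakness sinks the follow-up
    total = maturity + integration + fit
    return "pilot" if total >= threshold else "watchlist"
```

The flashy-but-brittle humanoid demo from the paragraph above would land `triage(5, 1, 4)`, which returns `"pass"` regardless of its total: the minimum-score rule captures the idea that one fatal axis outweighs impressive components.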
Practical playbook for founders and CTOs
Before the event, map which sessions address your product gaps. During the event, focus on rapid validation and partner qualification. After the event, convert insights into tests and milestones.
- Pre-event: list three unknowns you want answered — safety, cost, partner capability — and schedule conversations accordingly.
- On-site: request technical runbooks or architecture diagrams, not marketing slides; prioritise teams that can share measurable performance data.
- Post-event: create a 30–60–90 day plan for the most promising integrations (pilot scope, success metrics, engineering owner).
What this means for you
If your roadmap touches AI, robotics, resilience or entertainment, treat SusHi Tech 2026 as a diagnostic rather than a spectacle. The practical outcomes to aim for are specific: a validated integration partner, a prioritized security mitigation, a measurable climate KPI or a content-provenance workflow.
Operationalise what you learn by converting demos into technical acceptance criteria, and panels into hypothesis-driven experiments. Keep the focus on systems that scale: modular software, repeatable testing, observable behaviour, and measurable customer impact.