
Meet 2ndSight

Advanced Solutions for Independence

2ndSight is the first hardware assistant designed expressly for accessibility, empowering blind and low-vision users with real-time scene, text, and object recognition through intuitive tactile controls, at a fraction of the cost of existing assistive devices.

About 2ndSight

2ndSight was founded with the mission to deliver true agency and independence to blind and low-vision individuals through technology engineered for accessibility from day one. Our flagship wearable fuses an ultra-lightweight, proprietary vision model with dialogic large-language models, enabling real-time object, text, and scene recognition plus conversational guidance.

Designed hand-in-hand with blind beta-testers, the unit features high-contrast tactile buttons, ergonomic housing for all-day comfort, and haptic cues that cut through noisy environments. By eliminating superfluous hardware and deeply optimizing our edge-AI pipeline, we match the performance of high-end assistive devices at a fraction of their cost, putting everyday independence within reach for far more people.

Our guiding principles are simple: Accessibility First, Affordability Without Compromise, Privacy & Security, and Continuous Co-Creation driven by real user feedback. As we move toward commercial launch, 2ndSight is looking to partner with accessibility nonprofits, orientation-and-mobility specialists, and educators to turn a single device into a pathway to everyday freedom. Independence isn't a luxury; with 2ndSight, it's finally within reach.

Our Team

Aashir Khiani

Co-Founder

Aashir blends entrepreneurial grit with deep technical expertise. Before 2ndSight, he bootstrapped a Mumbai-based branding and domain studio, sharpening his eye for user-centric design and fast iteration. A self-taught AI engineer, he has built multiple AI-integrated web apps including conversational agentic services. At 2ndSight, Aashir architects our model integration, orchestrates the fusion with dialogic LLMs, and translates tester feedback into hardware-ready features, ensuring every line of code moves us closer to everyday independence for blind users.

John Perry

Co-Founder

John pairs sharp market insight with hands-on technical problem-solving. In prior fintech roles, he conducted market and consumer research, analyzing financial products and economic factors to guide strategy. At 2ndSight he runs business relations, fundraising, and day-to-day operations, while also coding the programs that link dialogic LLMs and ultra-light vision models to low-power hardware. His blend of analytical rigor and practical engineering keeps 2ndSight both grounded and technically agile.

Our Projects

Ultra-Light Vision Models

Fine-tuned TensorFlow models, optimized for tiny on-device footprints

We prune and quantize TensorFlow CNN/ViT backbones to run at <1 W on our custom embedded platform, delivering real-time object detection, classification, and scene cues with no cloud latency and strict privacy by design.
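
As an illustration of that pipeline, the sketch below shows one common way to prune and quantize a Keras vision backbone with TensorFlow and the TensorFlow Model Optimization Toolkit. It is a minimal example under stated assumptions, not our production code: the model file, input shape, fine-tuning step, and calibration images are placeholders.

# Minimal sketch: prune a Keras vision backbone, then export a fully
# integer-quantized TFLite model for low-power hardware.
# Paths, shapes, and calibration data are placeholders.
import numpy as np
import tensorflow as tf
import tensorflow_model_optimization as tfmot

base = tf.keras.models.load_model("vision_backbone.keras")  # placeholder model file

# Wrap layers with 50% magnitude pruning, then (briefly) fine-tune to recover accuracy.
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    base,
    pruning_schedule=tfmot.sparsity.keras.ConstantSparsity(0.5, begin_step=0),
)
pruned.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# pruned.fit(train_ds, epochs=2, callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])
stripped = tfmot.sparsity.keras.strip_pruning(pruned)

def representative_data():
    # Calibration samples for integer quantization; replace with real preprocessed frames.
    for _ in range(100):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

# Full-integer post-training quantization.
converter = tf.lite.TFLiteConverter.from_keras_model(stripped)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("vision_backbone_int8.tflite", "wb") as f:
    f.write(converter.convert())

The exported file runs fully in int8, the kind of artifact that can fit within a sub-watt power envelope on a small embedded accelerator.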

2ndSight Conversational Vision Aide

Wearable independence powered by on-device vision + dialogic LLMs

A discreet, voice-first assistant that fuses our lightweight vision stack with a conversational LLM. Users can ask context-aware questions ("What brand is this shampoo?") or get guided navigation, empowering blind and low-vision individuals to move, shop, and live with greater agency, hands-free and with near-zero lag.
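
To make the fusion concrete, here is a rough sketch of how per-frame output from an on-device TFLite model could be summarized and handed to a dialogic LLM alongside the user's spoken question. The model path, label list, and ask_llm() helper are hypothetical placeholders, not the product's actual interfaces.

# Illustrative sketch only; model path, labels, and ask_llm() are placeholders.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="vision_backbone_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def ask_llm(prompt: str) -> str:
    # Stand-in for whichever dialogic LLM the assistant talks to.
    return "LLM response placeholder"

def describe_frame(frame: np.ndarray, labels: list[str], top_k: int = 3) -> list[str]:
    # Run the on-device model on one camera frame and return its top-k label names.
    interpreter.set_tensor(inp["index"], frame.astype(inp["dtype"]))
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    return [labels[i] for i in np.argsort(scores)[::-1][:top_k]]

def answer(question: str, frame: np.ndarray, labels: list[str]) -> str:
    # Ground the user's spoken question in what the camera currently sees.
    seen = ", ".join(describe_frame(frame, labels))
    prompt = f"The camera sees: {seen}. The user asks: {question}. Answer briefly for audio playback."
    return ask_llm(prompt)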

Contact Us

Investors & Media

jack@2ndSight.dev

Questions & Customers

Info@2ndSight.dev