AV++® Technology

Alice® Avatars

Patent-pending life-size digital expert docents that deliver cinematic, personalized guided tours — with real-time rendering, sign language support, and infinite storylines from a single collection.


Imagine walking into an aviation gallery and choosing your guide: a pressure-suited astronaut, a g-suit-clad test pilot, a grease-pencil aircraft designer, or a rocket scientist clutching a coffee-stained notebook. These avatar docents are vividly present — breathing, blinking, and shifting weight like real people. You choose whom to explore with, tell them your age, language, and interests, and your private tour begins.

The venue owns the Body of Knowledge that drives all Avatar content — they control the narrative, the accuracy, and the depth. Because every combination of character, interest, and depth level generates a different experience, repeat visits become genuinely compelling: guests return to explore with a different guide, choose different interests, and discover content they've never encountered before.

Come back next month, select a different guide, and the same collection morphs into a brand-new narrative — no flag-waving docent, no twenty-person herd, just a one-to-one deep dive at whatever pace feels right.

On the roadmap: Alice® Avatars accompanying visitors through WonderLens™ spatial AR — combining patented and patent-pending technology so that a digital expert walks beside you through the real world, pointing at actual artifacts and explaining what you're seeing in your language, at your level.

Pop-Up Companions, Not Static Kiosks

Avatar docents don't park beside a touchscreen waiting to be poked. They phase in exactly where curiosity spikes. Linger over an artifact and your chosen expert appears at your elbow with context. Ask them to explain in more detail and the story unfolds. Stroll past without stopping and they politely stay out of sight, keeping the gallery uncluttered.

How It Works

QuickSilver® Backbone

Runs on rugged, commodity mini-PCs — think high-spec gaming rigs, not black-box appliances. Need to double GPU power for holographic shaders? Pop the lid and swap a card. Non-proprietary parts guarantee future spares.

Alice® AI Personalization

Hyper-personalizes every visit. Facial and demographic recognition identifies language preference and visitor type (quick scanner, curious browser, or deep-dive sponge), then tailors content in real time from curated Bodies of Knowledge.
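The personalization step can be pictured as a filter over the venue's Body of Knowledge. The following is a minimal, hypothetical sketch; the Entry structure, field names, and sample data are illustrative stand-ins, not Mad Systems' actual schema:

```python
from dataclasses import dataclass

@dataclass
class Entry:
    """One story unit in a venue's Body of Knowledge (illustrative)."""
    topic: str
    language: str
    depth: int  # 1 = quick scanner ... 3 = deep-dive sponge
    text: str

# Hypothetical visitor profile inferred at the exhibit.
profile = {"language": "en", "interests": {"engineering"}, "depth": 2}

# Illustrative sample entries, not real curatorial content.
knowledge = [
    Entry("engineering", "en", 1, "A quick look at the engine."),
    Entry("engineering", "en", 2, "How the turbopump feeds the combustion chamber."),
    Entry("history", "en", 2, "The program's origins after the war."),
]

def select_content(kb, profile):
    """Pick entries matching the visitor's language, interests, and depth."""
    return [
        e for e in kb
        if e.language == profile["language"]
        and e.topic in profile["interests"]
        and e.depth <= profile["depth"]
    ]

for entry in select_content(knowledge, profile):
    print(entry.text)
```

The key point is that the venue's curated entries, not the AI, are the source of every line the avatar delivers; the profile only decides which entries surface.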

CheshireCat® Recognition

Fully private, ~0.5 second recognition with no internet dependency. Recognizes returning visitors, tracks family groups, and enables personalized avatar responses without any data leaving the venue.

Lory® Accessibility

Flawless speech synthesis, real-time captions, sign-language avatars, and ListenAssist™ audio streamed to Bluetooth-equipped hearing aids — no trenching floors for loop coils.

Recognizing Avatars

Alice® Avatars don't just present — they recognize. Combined with CheshireCat® recognition technology, avatar docents know who they're talking to. A returning visitor gets continuity: "Welcome back — last time we stopped at the lunar module. Ready to continue?" A family group gets age-appropriate content for the kids and expert-level detail for the parents. A visitor who switches from the pilot to the rocket scientist mid-visit gets a seamless handoff: the new guide knows what you've already seen.
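A mid-visit handoff like the one described above can be sketched as a shared, venue-local session record that every guide reads from. Everything here (the record fields, the handoff function) is a hypothetical illustration of the idea, not the production implementation:

```python
# Hypothetical per-visitor session record, kept on the venue's own nodes.
visit = {
    "visitor_token": "anon-4821",          # anonymized, venue-local ID
    "seen": ["wind tunnel", "lunar module"],
    "interests": {"engineering"},
}

def handoff(visit, new_guide):
    """The incoming guide greets with continuity from the shared record."""
    last = visit["seen"][-1] if visit["seen"] else None
    if last:
        return (f"{new_guide} here. I see you just visited the {last}. "
                "Shall we pick up from there?")
    return f"{new_guide} here. Where would you like to start?"

print(handoff(visit, "rocket scientist"))
```

Because the record travels with the visitor token rather than with any one avatar, switching from the pilot to the rocket scientist costs nothing: the new persona simply reads the same state.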

This recognition is fully private. No data leaves the venue. No cloud dependency. No personal identity is stored. CheshireCat® operates with privacy as the default — the venue receives the answer ("returning visitor, interested in engineering"), never the document.
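The "answer, never the document" boundary can be sketched in a few lines. In this hypothetical illustration a hash stands in for the real recognizer's embedding match (actual face embeddings vary between captures and would need nearest-neighbour matching); the point is only that raw biometric data never crosses the function boundary:

```python
import hashlib

# Venue-local store: opaque token -> visit count. No identity, no images.
_gallery = {}

def recognize(face_embedding_bytes):
    """Return an anonymized answer; never expose the embedding itself.

    The sha256 hash is a stand-in for a real on-device embedding match.
    """
    token = hashlib.sha256(face_embedding_bytes).hexdigest()[:12]
    _gallery[token] = _gallery.get(token, 0) + 1
    returning = _gallery[token] > 1
    # The raw embedding goes out of scope here: nothing biometric is
    # stored, and only the coarse answer leaves this function.
    return {"token": token, "returning_visitor": returning}

first = recognize(b"example-embedding")
second = recognize(b"example-embedding")
print(first["returning_visitor"], second["returning_visitor"])  # → False True
```

The venue's systems see only the token and the coarse attributes, which is the "answer"; the "document" (the face data itself) never leaves the recognizer.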

Sign Language Avatars

Alice® Avatars include full ASL (American Sign Language) signing capability. Deaf and hard-of-hearing visitors receive the same rich, personalized storytelling as every other guest — delivered by their chosen expert docent in fluent sign language. This isn't a subtitle overlay or a separate accessibility mode. The avatar itself signs, with proper facial grammar, co-articulation, and natural motion.

Proof at Home: Samtoo

Walk into Mad Systems HQ and you meet Samtoo, our Virtual Receptionist — a live Alice® Avatar. Samtoo greets staff and visitors by name, checks schedules, cracks tailored jokes, and naturally poses for selfies when asked. Need Korean for tomorrow's delegation? The language pack was uploaded during lunch, with no downtime and no vendor service call.

Why Everyone Wins

Guests

Endless perspectives from one building. Families return to hear different experts. Every visit is different. No crowds, no waiting — a private tour without the private tour price.

Curators & Educators

Full editorial control. Update the knowledge base overnight. Spin up limited-time avatars for events. Analytics show which objects spark questions and where curiosity drops off.

Leadership & Finance

Future-proof commodity PC infrastructure. Modular upgrades. Revenue upside through repeat visits, premium ticket tiers, and brand-sponsored personas.

Accessibility

ASL signing avatars, multi-language support (100+ languages), Bluetooth hearing aid streaming, and age-appropriate content delivery — all from one system.

Vertical Markets

40+ Markets. One Technology Ecosystem.

The same AV++® ecosystem transforms each of these verticals:

🖼️ Museums
🎢 Theme Parks
🌐 Visitor Centers
💼 Corporate
🎬 Entertainment
📣 Branding
🛍️ Retail
🚢 Cruise Ships
🏨 Hospitality
🎓 Education
🏥 Healthcare
🎟️ Trade Shows
🏛️ Government
✈️ Transport Hubs
🏙️ Smart Cities
🎖️ Military & Defense
🏟️ Sports Venues
🦁 Zoos & Aquariums
🔬 Science Centers
🎰 Casinos
🕯️ Memorials
🍽️ Restaurants
📚 Libraries
⛪ Religious Facilities
🚗 Drive-Through
💪 Fitness & Wellness
[Illustration: the five-monkeys experiment]
Why Are We Doing This?

Five New Monkeys

There is a well-known parable about five monkeys. It is not really about monkeys. It is about what happens when a group keeps enforcing a rule long after the original reason for that rule has disappeared. In AV and Location-Based Entertainment, that pattern shows up all the time.

Five monkeys are placed in a habitat with a ladder in the middle. At the top of the ladder hangs a bunch of bananas. Naturally, one of the monkeys climbs the ladder to grab the bananas. When it does, the researchers spray the rest of the monkeys with cold water. After a few attempts, the monkeys quickly learn the connection.

Banana = Cold Shower.

Before long, whenever a monkey even tries to climb the ladder, the other monkeys pull it down and beat it up to stop it.

1. One original monkey is replaced. The new monkey tries to climb. The others attack. It learns not to climb — even though it has never been sprayed.
2. Another original is replaced. Same result. The new monkey learns the rule.
3. This continues until all five originals are gone.
None of them have ever been sprayed. None of them know why.

But none of them will climb the ladder. And if a new monkey tries, the group will still beat it up.

Why?
Because that is just the way things are done.

That is how large parts of this industry still operate.

Too many teams inherit technical assumptions from the last project, then production assumptions from the last technical stack, then budget assumptions from the last production model. Eventually those assumptions stop looking like choices and start being treated like physics.

Much of that behavior comes from designing around proprietary black box hardware. Once a system is built on closed, single-purpose devices, everything downstream begins to conform to their limitations. Workflows. Support models. Spare strategies. Upgrade paths. Before long, entire organizations are protecting the ladder — without asking whether the ladder still needs to be there.

We chose a different starting point.

We built our AV systems on non-proprietary compute nodes rather than on closed black-box endpoints. That decision changed the problem space.

Hyper-personalization became possible — behavior no longer had to be locked inside fixed-function hardware.
Spares were minimized — standardized around a smaller family of node types.
Upgrades became easier — increasing capability often means increasing compute, not replacing entire subsystems.

We did not set out to defend the old ladder. We set out to remove the reasons people thought they could not climb it.

We started our work with five new monkeys.