
Digital Freedom Alliance Virtual Assistants

🧪 Anonymous Access to Digital Freedom Virtual Assistants – Placeholder Page

Project Status: 🚧 Under Development


📄 Project Description

This project is building the infrastructure for highly anonymous access to Digital Freedom Alliance VAPA‑powered AI Virtual Assistants — a system that ensures no identifiable user data is ever linked to assistant sessions. It provides that crucial extra layer of separation for users escaping Big Tech control.

The goal is to host AI‑powered assistants trained on:

  • DFA reference docs including the 12 Principles of Digital Freedom
  • All 10 Steps of the Escape the Big Tech Digital Prison curriculum
  • Digital Freedom Cloud resources, including summaries and transcripts from Digital Freedom Tube content
  • The DFA internal knowledge base across the 5 domains

Online Assistant Architecture (Current Design)

The online DFA assistants currently run on a federated but integrated architecture:

  • Network of 5 sites
    A cluster of five WordPress sites, each with a specific role (public information, training, coaching, resources, etc.), all connected to a shared Pinecone vector database:
    • Content and embeddings are shared at the data layer.
    • Each site can contribute documents to, and retrieve context from, the same knowledge pool.
  • Shared Pinecone knowledge base
    • Stores vector embeddings for:
      • DFA reference docs
      • 10 Steps curriculum materials
      • Digital Freedom Cloud resources (including DFA Tube summaries/transcripts)
      • Selected DFA internal documents
    • Gives assistants a unified semantic view of DFA content, even though it is physically distributed across multiple sites.
  • Coaching site as the Assistant hub
    • The DFA coaching site hosts the members area where the Virtual Assistants live.
    • Logged‑in members access assistants from within this compartmentalised coaching environment.
    • From there, assistants can query the entire 5‑site network via the shared Pinecone index, while keeping user interaction isolated to the coaching site.
  • WordPress integration: AI Engine Pro (Meow Apps)
    • The coaching site uses the AI Engine Pro plugin from Meow Apps as the operational layer for:
      • Running VAPA‑based assistants and chatbots.
      • Managing chat sessions and assistant configurations.
      • Connecting to external LLMs via API.
    • VAPA prompt specifications define each DFA Assistant’s role, privacy behaviour, and workflows; AI Engine Pro provides the runtime container inside WordPress.
  • VAPA as the Assistant framework
    • Each DFA Virtual Assistant is implemented as a VAPA profile:
      • Clear role and domain
      • Strict privacy and abstraction rules
      • Explicit instructions about what data it may and may not handle
    • VAPA ensures assistants:
      • Work primarily with abstracted profiles and scenarios.
      • Avoid pulling or requiring raw identifying information.
      • Exchange only generic artefacts between assistants (e.g. Personal Risk Profile summaries, not detailed audits).
  • LLM connectivity (model‑agnostic via API)
    • The system is LLM‑agnostic: AI Engine Pro + VAPA can connect to any compatible model via API.
    • The current design is exploring OpenRouter as a routing layer:
      • Allows choosing between multiple underlying LLMs.
      • Provides flexibility to switch models or providers without changing assistant logic.
    • This preserves long‑term autonomy: assistants are not tied to a single vendor or model.

📇 Project Details

Assistant Name: Anonymous Digital Freedom Assistants (working title)
Maintainer: The Virtual Webmaster
Affiliation: Digital Freedom Alliance (DFA)
Proposed License: CC BY-NC-SA 4.0

📬 Contact




DFA Virtual Assistants and the 10‑Steps Roadmap

Introduction

The Digital Freedom Alliance (DFA) is developing a suite of privacy‑aware Virtual Assistants as practical tools to help people assess risks, design safer digital practices, and “escape to digital freedom.”

These are not “AI friends” with human‑like names. They are clearly‑defined tools built on top of the VAPA framework, designed for:

  • Strong privacy and compartmentalisation
  • Transparency and user control
  • Portability across different LLMs and platforms

All DFA Assistants and their prompt specifications will be published under a Creative Commons license and made available via the CARS/VAPA system so people can:

  • Download and run them locally, and
  • Inspect and adapt the logic to fit their own needs.

Data Files and Access Model

We cannot publish the full DFA working database (all files across the 5 domains). It is simply too large and contains too much material to reasonably ship as a single bundle.

Instead:

  • Each assistant will ship with a basic data file:
    • Core concepts, key tables, and minimum reference content to function usefully.
  • To unlock the full power of the assistants (deep reference content, rich examples, continuously updated materials), users will need a subscription to the DFA coaching service, which provides:
    • Access to the hosted assistants backed by the full DFA knowledge base.
    • Guided coaching to use them safely and effectively.

Each assistant is designed so it can operate in a degraded but still useful mode using the basic data file, and in a full‑featured mode when connected to the coaching environment.


Privacy, Compartmentalisation & Data Minimisation

Compartmentalised Assistants

Each assistant sees only what it needs:

  • Assistants work primarily with summarised / generic inputs (e.g. “high‑risk journalist”, “small community organiser”) rather than raw, identifying data.
  • For example, the Custom CAR Builder only ever sees your Personal Risk Profile report, not the detailed Digital Life Audit that produced it.
  • Assistants are designed to exchange:
    • Generic profiles
    • Aggregated risk scores
    • Abstracted scenarios
    rather than raw logs of your activities.
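The "generic artefacts only" rule above can be illustrated with a small sketch: an assistant exports a shareable summary containing nothing but abstract labels and risk levels, while the raw audit entries stay in the user's private workbook. The field names here are assumptions for illustration, not a published DFA schema.

```python
# Illustrative sketch of exchanging only generic artefacts between
# assistants: labels and risk levels are exported; URLs, usernames and
# credentials never leave the private audit.
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskProfileSummary:
    """The only artefact shared between assistants: labels + risk levels."""
    role_category: str   # e.g. "small community organiser"
    compartments: dict   # e.g. {"EMAIL-1": "medium", "SOC-2": "high"}

def export_summary(audit: dict) -> RiskProfileSummary:
    """Keep only the generic label and risk level of each audit entry;
    everything else stays in the user's own workbook."""
    return RiskProfileSummary(
        role_category=audit["role_category"],
        compartments={label: entry["risk"]
                      for label, entry in audit["entries"].items()},
    )

private_audit = {
    "role_category": "small community organiser",
    "entries": {
        "EMAIL-1": {"risk": "medium", "address": "kept-private"},
        "SOC-2":   {"risk": "high",   "url": "kept-private"},
    },
}
summary = export_summary(private_audit)
```

Note that the exported object is structurally incapable of carrying the private fields: compartmentalisation enforced by the shape of the artefact, not by policy alone.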

“Paranoia Level” / Privacy Settings

All DFA Assistants will expose explicit privacy controls:

  • Ability to set your “level of paranoia” or privacy strictness.
  • Expanded versions of VAPA’s Privacy Adviser Mode to:
    • Warn you when you are about to overshare
    • Suggest safer ways to phrase or abstract what you are describing
    • Offer example scenarios instead of requiring your real details

Working with Categories, Not Identities

You do not need to provide specific, identifying data for the assistants to be useful.

  • You can describe:
    • Roles (e.g. “union organiser in hostile environment”)
    • Risk types (e.g. “high risk of surveillance by employer”)
    • Concerns (e.g. “fear of doxxing”)
  • The assistants will:
    • Use generic labels and placeholders
    • Provide scenarios and examples
    • Clearly label advice as general when you choose to be vague

There is always a trade‑off: more specificity → more accurate guidance, but you stay in control of how precise you want to be.

DFA Privacy Coaching Site & Access Model

A separate website exists for the DFA Privacy Coaching service, with a membership area providing access to all coaching assistants.

Key design points:

  • The only data stored for the WordPress members area:
    • Username
    • Password
    • Coaching level / access tier
  • The only resources in the members area are DFA Virtual Assistants, each in compartmentalised rooms/spaces.
  • You do not need to reveal your username to the assistants:
    • Being logged in with a given access level is sufficient.
    • Assistants have no way to link your real‑world identity to your username or any identifying info.
  • The only potentially identifying data is whatever you choose to type in your conversations.
    • Assistants are privacy‑aware and will:
      • Warn about oversharing
      • Offer abstraction options
    • But you are always in control of what you disclose.

Learning to interact safely and privately with these assistants is itself a core component of the DFA training.

Pay‑What‑You‑Want (PWYW) for Local Bundles

For downloadable/local versions of the DFA Assistants:

  • We will use a Pay‑What‑You‑Want model.
  • People can:
    • Download assistant bundles (prompt specs + basic data files) for free
    • Contribute financially if they are able and willing

Online (Hosted) vs Downloadable (Local) Behaviour

These points apply to all DFA Assistants.

Hosted / Coaching Environment

  • Hosted DFA assistants:
    • Run in private rooms (e.g. separate Matrix rooms / private contexts).
    • Have access to the full DFA working database across the 5 domains (subject to licensing and safe‑use constraints).
    • Benefit from:
      • Continuous updates to knowledge and patterns
      • Cross‑linking between assistants
      • Direct integration with the DFA coaching workflow
  • Compartmentalisation is preserved:
    • Different assistants may share generic outputs (e.g. your risk profile summary), but not raw sensitive data from your inputs.

Downloadable / Local Use

  • Local DFA assistants:
    • Are distributed as VAPA/CARS‑compatible bundles:
      • Assistant spec (prompts, logic, workflows)
      • Basic data file with essential reference content
    • Do not have access to the full DFA working database.
  • Users:
    • Provide their own data (stored in tools they control, like Joplin).
    • Can stack multiple assistants locally (via VAPA slots) if they choose.
  • The local environment is ideal for:
    • Privacy‑maximalists
    • People who want total offline control
    • Those comfortable managing their own data and backups

The same assistant design powers both modes. The difference is in how much data the assistant can draw on and whether it is embedded in the coaching environment.


How the Assistants Support the 10‑Steps

The DFA Virtual Assistants support the 10 Steps to Escape the Big Tech Digital Prison.
They do not replace the training.

A coaching client should always work with:

  • The 10‑Steps training (Workbook + Playtime)
  • The Escape to Digital Freedom Coach (human + Coach assistant)
  • The specialised assistants at the right points in the journey

List of currently planned DFA Virtual Assistants

#1: Escape to Digital Freedom Coach

Main Step:

  • Step 1 – The Escape Plan

Secondary (supporting) role:

  • Steps 2–10 (never the primary assistant outside Step 1, but always available in a supporting role)

Role in the roadmap

  • Introduces the Race to Escape the Big Tech Digital Prison and the 10 Steps.
  • Helps the client understand:
    • Where they are starting from.
    • Which Step they are working on now.
    • Which assistant to use next.
  • Keeps the client anchored in the training and Workbook, not just “tool use”.
  • Acts as a central guide and coordinator:
    • Explains the 10‑Steps training and core DFA concepts.
    • Helps you decide which specific assistants to use next.
    • Keeps you oriented in the overall journey.

How a coach might use it

  • At the beginning:
    • Use the Coach assistant to walk through The Escape Plan and clarify goals, constraints, and expectations.
  • During the program:
    • Ask the Coach assistant: “Given this client’s current Step and outputs, what should we focus on next?”
    • Get pointers to the correct specialist assistant and relevant Workbook sections.

Online (hosted) notes

  • May expose additional “course navigation” features:
    • Progress markers
    • Links into exercises and worksheets
    • Cross‑references to other assistants

Downloadable / local

  • Includes:
    • 10‑Steps training core content (or summaries)
    • Key DFA documents and patterns
  • Helps you recreate the core trajectory of the program even if you’re working primarily offline with your own tools.

The Coach is the hub that helps you move through the program step by step.


#2: Digital Life Audit Assistant

Primary Step:

  • Step 2 – Give Big Tech the Mushroom Treatment

Secondary usage:

  • Step 3 – Take Control of Your Digital Life (ongoing audit refinement)
  • Step 5 – Reduce Your Digital Footprint (identify what to trim or harden)

Purpose
To guide you through a structured Digital Life Audit while keeping sensitive data in your own private system (e.g. Joplin).

Role in the roadmap

  • Step 2 is about Digital Life Separation and starting to identify distinct “parts” of the client’s digital world.
  • The Digital Life Audit Assistant:
    • Guides the client to map their Digital Life: accounts, devices, platforms, roles, channels, dependencies.
    • Helps group activities into separable Destinations and emerging CARS candidates.
    • Produces a structured view that later feeds the Driver Risk Assessment and CAR design.

Key features

  • Step‑by‑step guidance:
    • Walks you through identifying accounts, services, devices, roles and activities.
    • You record all specific, sensitive details only in your private “Escape to Digital Freedom Workbook” (e.g. Joplin).
  • Risk scoring guidance:
    • Helps you assign a risk score to each part of your Digital Life.
    • Prompts you to consider threat models (e.g. state actors, corporate surveillance, doxxing, harassment).
  • Compartmentalised output:
    • The assistant keeps only generic placeholders and labels that you recognise, such as:
      • “EMAIL‑1: primary email account”
      • “SOC‑2: high‑risk social media account”
    • The actual usernames, URLs and credentials remain only in your private workbook.
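The placeholder-labelling scheme ("EMAIL‑1", "SOC‑2", …) can be sketched as a simple per-category counter. The class name and exact label format are illustrative assumptions; the point is that the assistant only ever sees the generated label and a generic description, never the real account details.

```python
# Sketch of compartmentalised labelling: sequential generic labels per
# category, so the assistant works with recognisable placeholders while
# usernames, URLs and credentials stay in the private workbook.
from collections import defaultdict

class CompartmentLabeller:
    """Hands out sequential generic labels per category."""
    def __init__(self):
        self._counters = defaultdict(int)

    def label(self, category: str, description: str) -> str:
        self._counters[category] += 1
        return f"{category.upper()}-{self._counters[category]}: {description}"

labeller = CompartmentLabeller()
a = labeller.label("email", "primary email account")
b = labeller.label("soc", "high-risk social media account")
c = labeller.label("soc", "low-risk hobby account")
```

The user keeps a private mapping from each label to the real account in their workbook (e.g. in Joplin); only the left-hand side of that mapping is ever shown to an assistant.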

How a coach might use it

  • Soon after Step 1:
    • Run one or more sessions where the client works with the Audit Assistant while filling their private Workbook.
  • Emphasise:
    • The client keeps detailed lists and credentials; the assistant sees generic structure and labels, not secrets.
    • This map becomes the base layer for all later risk work and CAR building.

Result:
You finish with a clear, structured view of your Digital Life and risk scores, while sensitive details never leave your control.


#3: Personal Risk Profile Assistant

Primary Step:

  • Step 3 – Take Control of Your Digital Life (Driver Risk Assessment component)

Secondary usage:

  • Step 5 – Reduce Your Digital Footprint
  • Step 6 – Walk Out from Under the Digital Surveillance Umbrella
  • Step 7 – Identify and Mitigate Threats
  • Periodic re‑assessment as life and threats change.

Purpose
To convert your Digital Life Audit into a generic Personal Risk Profile that other assistants can safely use.

Role in the roadmap

  • Step 3 includes the Driver Risk Assessment: understanding risks to the person (the “driver”) across their compartments.
  • The Personal Risk Profile Assistant:
    • Uses the outputs of the Digital Life Audit (Step 2).
    • Guides the client through a structured Driver Risk Assessment:
      • Assigns risk levels to each part/role.
      • Identifies cross‑contamination risk between areas.
    • Produces a generic Personal Risk Profile:
      • Safe to reuse across assistants (no direct identifiers).
      • Becomes the reference for decisions in later Steps.

What it does

  • Inputs:
    • Compartmentalised output from the Digital Life Audit Assistant (no raw credentials, no identifying accounts).
  • Analyses:
    • Role‑based risks (e.g. activist, whistleblower, vulnerable worker, public figure).
    • Cross‑cutting risks (e.g. one high‑risk area that endangers many others).
  • Generates:
    • A generic Personal Risk Profile with:
      • Aggregate scores
      • Risk categories (low/medium/high/critical)
      • Scenario‑based notes (e.g. “high risk if employer or government targets you”)
    • Designed as a clean, depersonalised input to other assistants.
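The aggregate scores and low/medium/high/critical categories described above can be illustrated with a toy aggregation. The rule chosen here (take the maximum, since one high-risk compartment can endanger the others) is an assumption for illustration, not the actual DFA scoring method.

```python
# Toy illustration of turning per-compartment risk levels into an
# aggregate score, a category, and a list of the riskiest compartments.
LEVELS = ["low", "medium", "high", "critical"]

def aggregate_risk(compartments: dict) -> dict:
    """compartments: generic label -> risk level string."""
    scores = {label: LEVELS.index(level)
              for label, level in compartments.items()}
    worst = max(scores.values())
    return {
        "aggregate_score": worst,            # 0 (low) .. 3 (critical)
        "category": LEVELS[worst],
        # compartments driving the overall rating -- candidates for
        # cross-contamination checks and priority mitigation
        "hotspots": sorted(l for l, s in scores.items() if s == worst),
    }

profile = aggregate_risk({"EMAIL-1": "medium", "SOC-2": "high", "DEV-1": "low"})
```

Because the input is already the depersonalised label-to-level mapping, the resulting profile is safe to hand to other assistants.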

Guidance

  • Highlights that high‑risk aspects of your life can affect even seemingly “low‑risk” activities.
  • Helps you:
    • Identify and prioritise risk areas
    • Map each aspect of your digital life to a potential CAR (Compartmentalised Autonomous Rig)

How a coach might use it

  • After a reasonably complete audit:
    • Run the Personal Risk Profile Assistant to generate or update the Driver Risk Assessment.
  • When circumstances change:
    • Rerun or update the profile and then revisit CARS / platforms based on new risk levels.

Primary function:
To help you decide which CARS you need to build.


#4: Push the Overton Window Assistant

Primary Step:

  • Step 5 – Reduce Your Digital Footprint

Secondary usage:

  • Step 8 – Find Alternatives to Big Tech (messaging around migration and public statements)
  • Step 10 – Join The Race and Escape (activism, outreach, public communication)

Purpose
To support participants in the “Open / Push the Overton Window” Challenge with safe, strategic communication.

Role in the roadmap

  • Step 5 is about reducing your footprint and changing how the DRIVER behaves and communicates, so trails don’t expose or link CARS.
  • The Overton Assistant:
    • Helps clients reconsider:
      • What they say,
      • Where they say it,
      • How they say it.
    • Supports the Tap / Push / Big Push framing from the Overton Challenge:
      • Saying what they mean,
      • In forms less likely to trigger censorship, automated moderation, or long‑term risk.
    • Is especially important for:
      • Activists, campaigners, whistleblowers,
      • Anyone whose speech is central to their threat model.

Core functions

  • Explains:
    • The Overton Window concept
    • The specific DFA Challenge rules and goals
  • Helps you:
    • Clarify what you really want to say.
    • Understand where the window is now (what is sayable / shareable in your context).
    • Design a strategy for nudging the window rather than smashing it.

Message creation

  • Guides you to craft:
    • Core messages and narratives
    • Safer variations that:
      • Communicate your real concerns and values
      • Avoid likely censorship triggers or automated filters
  • Supports:
    • “Say what you need to say without breaking the window”
    • Rewriting and layering messages (e.g. public vs semi‑private vs “inside” versions).

Context awareness

  • Connects to curated resources on:
    • Current censorship patterns
    • “Banned words” and likely flags
    • Case studies from the Overton Window Challenge and related activism

How a coach might use it

  • At Step 5:
    • Use it to review the client’s existing visible footprint (where appropriate): posts, blogs, campaigns, profiles.
    • Design safer patterns of communication that reduce linkability and avoid unnecessary exposure.
  • Later:
    • Reuse it whenever the client prepares sensitive public content, outreach, or campaigns.

The assistant is both a strategy coach and a message‑crafting tool for Overton Challenge participants.


#5: CARS Threat Analysis Tool

Primary Steps:

  • Step 4 – Secure Your Digital Life
  • Step 6 – Walk Out from Under the Digital Surveillance Umbrella
  • Step 7 – Identify and Mitigate Threats

Secondary usage:

  • Step 8 – Find Alternatives to Big Tech (checking alternatives against threat models)
  • Step 9 – Using CARS to Escape (checking final CARS before rollout)

Purpose
To evaluate specific CARS (technology stacks and workflows) you intend to use, and identify threats to you in your context.

Role in the roadmap

  • Step 4 focuses on securing the newly mapped assets and environments.
  • Step 6 focuses on evasive manoeuvres and staying under surveillance radar.
  • Step 7 focuses on Digital Risk & Threat Analysis and targeted mitigations.

The CARS Threat Analysis Tool:

  • Uses:
    • The Digital Life Audit (Step 2)
    • The Personal Risk Profile (Step 3)
    • Trust information (from the Trust Ranking Assistant, where available)
  • Helps:
    • Identify key threat vectors against specific CARS or segments of the digital life.
    • Distinguish between mass surveillance and targeted threats.
    • Suggest focused mitigations appropriate to each CAR or Destination.

Focus

  • Evaluates an entire CAR (stack + workflow), not just individual tools.
  • Considers:
    • Intended use of the CAR
    • Your Personal Risk Profile (optional input)
    • Trust Ranking Reports of the components

Capabilities

  • Compares your CAR design against:
    • Known threats, vulnerabilities, and failure points.
    • Your risk profile and intended use.
  • Helps you answer:
    • Are there obvious weaknesses in this CAR?
    • Do any of these weaknesses specifically affect me?
    • Is this CAR suitable for my level of risk, or should I choose another / modify this one?
  • Inputs may include:
    • Trust Ranking Reports
    • Your Personal Risk Profile
    • Generic descriptions of the CAR stack (no need for account credentials or identifiers)

Think of it as a “roadworthiness check” (RWC) for each CAR before you drive it.

How a coach might use it

  • After Step 3:
    • Use the tool at Step 4 to prioritise what needs locking down first.
  • At Steps 6–7:
    • Use it to drive scenario‑based threat and mitigation discussions.
  • Before and after migrations or major changes:
    • Run a quick threat check on updated CARS or new architectures.

#6: Custom CAR Builder Assistant

Primary Step:

  • Step 9 – Using CARS to Escape

Secondary usage:

  • Step 7 – Identify and Mitigate Threats (fine‑tuning CARS during risk workflows)
  • Step 8 – Find Alternatives to Big Tech (building CARS around chosen alternatives)

Purpose
To help you design, choose, or adapt a CAR that fits your Driver / Destination pair and your risk profile.

Role in the roadmap

  • By Step 9, the client has:
    • Mapped their Digital Life (Step 2),
    • Assessed risk (Step 3),
    • Secured key areas (Step 4),
    • Reduced footprint (Step 5),
    • Implemented evasive measures (Step 6),
    • Run threat analyses (Step 7),
    • Identified alternatives (Step 8).

The Custom CAR Builder Assistant:

  • Takes:
    • DRIVER / DESTINATION pairs,
    • The Personal Risk Profile,
    • Threat information from Step 7.
  • Then:
    • Selects or adapts suitable CAR patterns,
    • Designs concrete CARS for each critical Destination,
    • Outputs practical specifications the client can actually implement.

What it does

  • Inputs:
    • Your Driver/Destination description (e.g. “union organiser → securely coordinate with members”)
    • Your Personal Risk Profile (generic report)
  • Decision process:
    • Determines what kind of CAR you need.
    • Searches the existing CAR Library for:
      • A suitable CAR “off the shelf”
      • A CAR that can be easily modified
    • If necessary, designs a new CAR from scratch.
  • Over time:
    • As the Library grows, increasingly assembles new CARS by combining and tweaking existing ones.

Privacy & compartmentalisation

  • Assistants run in separate sessions; there is no hidden linkage of raw inputs between them.
  • The Builder only sees what it needs (e.g. your profile summary, not your detailed audit).

How a coach might use it

  • At Step 9:
    • Work through key Destinations and build the “fleet” of CARS.
  • When changes occur:
    • Return to the Builder to adjust or add CARS as threats, tools, or roles evolve.

Outcome

  • A recommended CAR design with:
    • Stack components
    • Configuration guidance
    • Notes on trade‑offs between usability and safety
  • Optionally, a list of alternative CARS for different paranoia or convenience levels.

#7: Trust Ranking Report Assistant

Primary Step:

  • Step 7 – Identify and Mitigate Threats

Secondary usage:

  • Step 4 – Secure Your Digital Life (early trust checks when hardening)
  • Step 8 – Find Alternatives to Big Tech (evaluating candidate replacements)

Purpose
To help you rank the trustworthiness of platforms, tools, organisations and other digital actors, and generate structured “Trust Ranking Reports”.

Role in the roadmap

  • At Step 7, the client needs to understand who they’re trusting with what:
    • Platforms, vendors, infrastructure providers, organisations, even states.
  • The Trust Ranking Assistant:
    • Provides a structured way to evaluate and rank trust for key entities.
    • Produces outputs that can plug into:
      • CARS Threat Analysis,
      • Big Tech Alternatives decisions,
      • Custom CAR Builder choices.

Behaviour (high level)

  • Guides you in identifying:
    • Key entities (platforms, vendors, infrastructure, states, etc.)
    • Relevant trust factors (governance, jurisdiction, track record, business model, known abuses)
  • Uses transparent criteria to:
    • Score or rank entities
    • Flag entities that are especially risky for certain roles or use‑cases
  • Outputs:
    • General trust rankings for later use by other tools (e.g. CARS Threat Analysis Tool)
    • Explanations you can understand and challenge (not “black box scores”)
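The "transparent criteria, not black-box scores" requirement can be sketched as a weighted scoring function that returns its full per-factor breakdown alongside the total. The factor names and weights here are illustrative assumptions, not DFA's actual criteria.

```python
# Sketch of a transparent trust ranking: every factor has a visible weight,
# and the report includes the per-factor breakdown so users can understand
# and challenge the result.
FACTORS = {            # weight per trust factor (weights sum to 1.0)
    "governance": 0.3,
    "jurisdiction": 0.2,
    "track_record": 0.3,
    "business_model": 0.2,
}

def trust_report(entity: str, scores: dict) -> dict:
    """scores: factor -> 0..10 rating supplied by the user."""
    breakdown = {f: round(scores[f] * w, 2) for f, w in FACTORS.items()}
    return {
        "entity": entity,
        "total": round(sum(breakdown.values()), 2),  # 0..10 scale
        "breakdown": breakdown,                      # challengeable, not opaque
    }

report = trust_report("ExampleMail", {
    "governance": 8, "jurisdiction": 6, "track_record": 9, "business_model": 7,
})
```

A user who disagrees with a ranking can see exactly which factor and weight produced it, and adjust either, which is the property that distinguishes this from an opaque score.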

How a coach might use it

  • Around Steps 4 and 7:
    • Run rankings on current providers and likely alternatives.
  • During Step 8:
    • Use it to compare alternative tools/platforms on trust grounds, not just features.

Local vs hosted:

  • Local: works from basic trust‑factor templates you can fill in.
  • Hosted: can draw on curated DFA reference tables, examples and case studies.

#8: Big Tech Alternatives Tool

Primary Step:

  • Step 8 – Find Alternatives to Big Tech

Secondary usage:

  • Step 4 & 7: Light exploratory use when considering what’s possible.
  • Step 9: Choosing tools to implement the CARS.

Purpose
To help you find and evaluate alternatives to Big Tech platforms and tools.

Role in the roadmap

  • Step 8 is where the client starts moving off misaligned services and into better‑suited alternatives.
  • The Big Tech Alternatives Tool:
    • Helps specify:
      • Functional requirements for each role or Destination.
      • Constraints and threat model from the Personal Risk Profile.
    • Suggests concrete alternatives that fit:
      • The client’s needs,
      • Their risk tolerance and trust rankings.
    • Guides:
      • Side‑by‑side trials,
      • Backup and migration planning,
      • Minimising lock‑in and exposure while switching.

Scope

  • Primarily a stand‑alone evaluation tool:
    • Focuses on functionality and safety of an individual tool (email, chat, storage, etc.)
    • Does not require your risk profile or CAR context — but can optionally use them.

What it does

  • Prompts you through criteria:
    • Core functionality
    • Threat model (who are you worried about?)
    • Jurisdiction and governance
    • Interoperability with other tools
    • Data portability & export options
  • Suggests:
    • Potential alternatives
    • Safer adoption and migration strategies
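The criteria-driven shortlisting described above can be sketched as a filter over a candidate list. The candidate tools, field names, and criteria here are invented purely for illustration; they are not DFA recommendations.

```python
# Toy sketch of shortlisting alternatives against the user's criteria:
# required functionality, data portability, and acceptable jurisdictions.
CANDIDATES = [
    {"name": "MailA", "category": "email", "jurisdiction": "EU", "export": True},
    {"name": "MailB", "category": "email", "jurisdiction": "US", "export": False},
    {"name": "ChatC", "category": "chat",  "jurisdiction": "CH", "export": True},
]

def shortlist(category, require_export=True, allowed_jurisdictions=None):
    """Return candidate names matching the user's criteria."""
    result = []
    for tool in CANDIDATES:
        if tool["category"] != category:
            continue
        if require_export and not tool["export"]:
            continue  # data portability treated as a hard requirement
        if allowed_jurisdictions and tool["jurisdiction"] not in allowed_jurisdictions:
            continue
        result.append(tool["name"])
    return result

picks = shortlist("email", require_export=True,
                  allowed_jurisdictions={"EU", "CH"})
```

A risk-profile-aware version would simply add further filters (e.g. minimum trust ranking) to the same loop, which is why the tool can work stand-alone but improve when given the profile.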

Practical guidance

  • Strategies for safe testing:
    • Running tools in parallel
    • Backing up and migrating data carefully
    • Minimising lock‑in and exposure during transition

You can optionally feed it your Personal Risk Profile so it can highlight options aligned with your risk level.

How a coach might use it

  • Mainly at Step 8:
    • Plan specific migrations (messaging, email, storage, collaboration, etc.).
  • Earlier:
    • Use selectively to demonstrate that meaningful alternatives exist, reducing emotional resistance.

Example Coaching Workflow (Aligned to the 10 Steps)

Step 1 – The Escape Plan

  • Human coach + Escape to Digital Freedom Coach assistant introduce:
    • The Race,
    • The 10 Steps,
    • High‑level mission and roadblocks.
  • Clarify client priorities and constraints.

Step 2 – Give Big Tech the Mushroom Treatment

  • Client works with the Digital Life Audit Assistant to:
    • Map accounts, roles, devices, platforms, Destinations.
  • Output: a structured, compartmentalised view of their Digital Life.

Step 3 – Take Control of Your Digital Life

  • Client uses the Personal Risk Profile Assistant to perform a Driver Risk Assessment, using the audit as input.
  • Output: a reusable Personal Risk Profile, aligned to Destinations and roles.

Step 4 – Secure Your Digital Life

  • Coach and client:
    • Use CARS Threat Analysis Tool to prioritise what to secure.
    • Optionally use Trust Ranking Report Assistant to spot badly‑trusted platforms.
  • Implementation work (MFA, encryption, backups) happens guided by training; assistants inform where to focus.

Step 5 – Reduce Your Digital Footprint

  • Client works with Push the Overton Window Assistant:
    • To reshape how they present themselves and share information.
    • To reduce unnecessary trails linking compartments.
  • Personal Risk Profile Assistant is reused here to keep reductions aligned with risk.

Step 6 – Walk Out from Under the Digital Surveillance Umbrella

  • CARS Threat Analysis Tool is used for surveillance‑focused scenarios:
    • Mass vs targeted, network‑level vs account‑level.
  • Outputs help the client decide where evasive manoeuvres are worthwhile and how far to go.

Step 7 – Identify and Mitigate Threats

  • CARS Threat Analysis Tool and Trust Ranking Report Assistant are central:
    • Detailed risk & threat workflows for key CARS and Destinations.
  • Personal Risk Profile Assistant feeds underlying driver risk into these analyses.

Step 8 – Find Alternatives to Big Tech

  • Big Tech Alternatives Tool becomes primary:
    • Suggests and compares alternatives consistent with the risk profile and trust rankings.
  • Trust Ranking Report Assistant refines choices.
  • CARS Threat Analysis Tool checks that chosen alternatives don’t introduce new unacceptable risks.

Step 9 – Using CARS to Escape

  • Custom CAR Builder Assistant designs the actual CARS “fleet”:
    • Uses audit, risk profile, threats, and chosen alternatives as inputs.
  • Threat Analysis and Alternatives assistants are reused as checks on proposed CARS.

Step 10 – Join The Race and Escape

  • (Primary assistant still to be defined.)
  • Escape to Digital Freedom Coach continues in a secondary role:
    • Helping integrate all previous work,
    • Supporting ongoing improvement, community involvement, and (where applicable) Overton‑aligned outreach using the existing assistants.

Summary Table: Assistants ↔ 10 Steps

| Step # | Step Title | Primary Assistant(s) | Secondary Assistant(s) |
| --- | --- | --- | --- |
| 1 | The Escape Plan | Escape to Digital Freedom Coach | (none) |
| 2 | Give Big Tech the Mushroom Treatment | Digital Life Audit Assistant | Escape to Digital Freedom Coach |
| 3 | Take Control of Your Digital Life | Personal Risk Profile Assistant | Digital Life Audit Assistant, Escape to Digital Freedom Coach |
| 4 | Secure Your Digital Life | CARS Threat Analysis Tool | Trust Ranking Report Assistant, Escape to Digital Freedom Coach |
| 5 | Reduce Your Digital Footprint | Push the Overton Window Assistant | Personal Risk Profile Assistant, Digital Life Audit Assistant, Escape to Digital Freedom Coach |
| 6 | Walk Out from Under the Digital Surveillance Umbrella | CARS Threat Analysis Tool | Personal Risk Profile Assistant, Trust Ranking Report Assistant, Escape to Digital Freedom Coach |
| 7 | Identify and Mitigate Threats | CARS Threat Analysis Tool | Personal Risk Profile Assistant, Trust Ranking Report Assistant, Escape to Digital Freedom Coach |
| 8 | Find Alternatives to Big Tech | Big Tech Alternatives Tool | Trust Ranking Report Assistant, CARS Threat Analysis Tool, Escape to Digital Freedom Coach |
| 9 | Using CARS to Escape | Custom CAR Builder Assistant | CARS Threat Analysis Tool, Big Tech Alternatives Tool, Escape to Digital Freedom Coach |
| 10 | Join The Race and Escape | TBA | Escape to Digital Freedom Coach |

Assistants Under Consideration (Not Yet Finalised)

The following assistants are under active consideration but have not yet been fully scoped or integrated into the current roadmap. They address gaps or specialised needs that have emerged during development and early coaching work.


System Hardening & Stealth Assistant

Potential primary Steps:

  • Step 4 – Secure Your Digital Life
  • Step 6 – Walk Out from Under the Digital Surveillance Umbrella

Purpose

To provide practical, step‑by‑step guidance for hardening existing systems and implementing network‑level stealth measures.

Scope

This assistant would operate in two complementary modes:

Mode A: Account & Device Hardening

  • Practical lock‑down of:
    • Accounts (MFA, backup codes, recovery settings, app passwords)
    • Devices (OS security settings, disk encryption, local account setup)
    • Browsers (profiles, extensions, tracking protection, container tabs)
  • Focused on doing the work, not just identifying threats.
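To illustrate the "doing the work" emphasis, Mode A could walk a user through an ordered checklist rather than a threat report. A minimal sketch, with task wording drawn from the scope above; the `HardeningTask` class and `MODE_A_TASKS` list are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class HardeningTask:
    category: str  # "accounts", "devices", or "browsers"
    action: str
    done: bool = False

# Illustrative Mode A checklist drawn from the scope above.
MODE_A_TASKS = [
    HardeningTask("accounts", "Enable MFA and store backup codes offline"),
    HardeningTask("accounts", "Review recovery settings and app passwords"),
    HardeningTask("devices", "Turn on full-disk encryption"),
    HardeningTask("devices", "Switch to a local (non-cloud) account"),
    HardeningTask("browsers", "Separate profiles and enable tracking protection"),
]

def next_task(tasks):
    """Return the first incomplete task, or None when hardening is done."""
    return next((t for t in tasks if not t.done), None)
```

The point of the structure is sequencing: the assistant always knows the single next concrete action, instead of handing the user a wall of recommendations.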

Mode B: Network & Transport Stealth

  • Network‑level privacy and evasion:
    • Home router configuration, guest networks, basic segmentation
    • VPN use patterns, limitations, and provider selection
    • Tor and anonymity networks (threat‑model‑appropriate guidance)
    • DNS choices (DoH/DoT, resolver jurisdiction, split DNS)
    • Traffic obfuscation concepts for high‑risk users
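"Threat-model-appropriate guidance" implies the assistant tailors its network recommendations to the user's risk tier rather than prescribing one stack for everyone. A hypothetical sketch of that shape; the tiers and the specific recommendations are examples only, not DFA policy:

```python
def recommend_transport(threat_level: str) -> list[str]:
    """Illustrative, threat-model-appropriate network/DNS choices.

    threat_level is one of "baseline", "elevated", or "high".
    The tiers and their contents are invented for illustration.
    """
    recommendations = {
        "baseline": ["encrypted DNS (DoH or DoT)",
                     "reputable no-log VPN"],
        "elevated": ["encrypted DNS (DoH or DoT)",
                     "reputable no-log VPN",
                     "router guest network and basic segmentation"],
        "high": ["Tor for sensitive sessions",
                 "traffic obfuscation (e.g. bridges)",
                 "split DNS with jurisdiction-aware resolvers"],
    }
    if threat_level not in recommendations:
        raise ValueError(f"unknown threat level: {threat_level!r}")
    return recommendations[threat_level]
```

The design choice worth noting is the hard failure on unknown tiers: an assistant giving network-stealth advice should refuse to guess when it cannot place the user's threat model.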

Relationship to existing assistants

  • CARS Threat Analysis Tool identifies what needs hardening and why.
  • System Hardening & Stealth Assistant shows how to implement those mitigations, step by step.
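The handoff between the two assistants could be as simple as the Threat Analysis Tool emitting structured findings that the hardening assistant expands into ordered tasks. A sketch of that interface; the finding codes and the `PLAYBOOK` mapping are invented for illustration:

```python
# Hypothetical playbook: threat-analysis finding -> ordered mitigation steps.
PLAYBOOK = {
    "no_mfa": ["Enable MFA on the account",
               "Generate and store backup codes"],
    "unencrypted_disk": ["Back up data",
                         "Enable full-disk encryption"],
    "tracking_browser": ["Create a clean browser profile",
                         "Enable tracking protection and container tabs"],
}

def mitigation_plan(findings: list[str]) -> list[str]:
    """Flatten findings into one ordered, de-duplicated task list."""
    plan: list[str] = []
    for finding in findings:
        for step in PLAYBOOK.get(finding, []):
            if step not in plan:
                plan.append(step)
    return plan
```

This keeps the division of labour clean: the Threat Analysis Tool only needs to name the problem, and the Hardening & Stealth Assistant owns the how.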

Why separate from Threat Analysis?

  • Different mental model:
    • Threat Analysis: “What can go wrong?”
    • Hardening & Stealth: “Now do X, Y, Z to fix it.”
  • Keeps the Threat Analysis Tool focused on assessment, not operational instructions.

Status

Under consideration. Needs careful scoping to avoid:

  • Giving dangerous config commands to non‑technical users.
  • Becoming a “do everything” catch‑all.

Data Lifecycle / Deletion & Archiving Assistant

Potential primary Steps:

  • Step 5 – Reduce Your Digital Footprint
  • Step 7 – Identify and Mitigate Threats

Potential secondary usage:

  • Step 8 – Find Alternatives to Big Tech (migration and data portability)
  • Step 9 – Using CARS to Escape (designing CARS with clear data policies)

Purpose

To help users make informed, threat‑aware decisions about:

  • What data to keep, where, and for how long
  • What data to delete, and how to delete it safely
  • What data to archive, and with what protections
  • How to migrate and convert data between platforms and formats

Scope

  • Data minimisation:
    • Identifying unnecessary data accumulation across accounts, devices, and CARS.
    • Prompting regular reviews: “Do you still need this? What’s the risk if you keep it?”
  • Deletion strategies:
    • Effective vs performative deletion (account closure, data export, residual copies).
    • Threat‑model‑driven decisions: when deletion reduces risk vs when it’s futile.
  • Archiving:
    • What to archive, where (local, encrypted cloud, offline), and for how long.
    • Balancing accessibility, security, and long‑term risk.
  • Migration & data conversion:
    • Safely moving data between platforms during Step 8 transitions.
    • Format conversion, export/import workflows, and avoiding lock‑in.
    • Ensuring data integrity and minimising exposure during migration.
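The keep/delete/archive triage above can be expressed as a small decision rule. A sketch of one plausible rule of thumb; the `DataItem` fields and the specific thresholds are assumptions, not DFA policy:

```python
from dataclasses import dataclass

@dataclass
class DataItem:
    name: str
    sensitive: bool         # would exposure cause real harm?
    still_needed: bool      # actively used?
    historical_value: bool  # worth keeping long-term?

def lifecycle_decision(item: DataItem) -> str:
    """Illustrative keep/archive/delete triage.

    Rule of thumb sketched here: delete what you neither need nor
    value; archive (encrypted, offline) what you value but don't use;
    keep only what you actively need, with extra care if sensitive.
    """
    if item.still_needed:
        return "keep (hardened storage)" if item.sensitive else "keep"
    if item.historical_value:
        return "archive (encrypted, offline)"
    return "delete"
```

In practice the assistant would ask the "Do you still need this?" questions from the scope above to populate these flags, then explain the recommendation rather than just emit it.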

Why a separate assistant?

  • Data lifecycle decisions cut across multiple Steps and contexts.
  • Migration and archiving are practical, high‑stakes workflows that benefit from dedicated, step‑by‑step guidance.
  • Keeps other assistants (Threat Analysis, CAR Builder, Overton) focused on their core functions without overloading them with data‑handling detail.

Relationship to existing assistants

  • Takes risk profile and threat analysis outputs as inputs.
  • Feeds practical data‑handling decisions back into:
    • Step 5 footprint reduction work
    • Step 8 migration planning (Big Tech Alternatives Tool)
    • Step 9 CAR design (Custom CAR Builder)
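As a concrete instance of the Step 8 migration and format-conversion workflows this assistant would guide, here is a self-contained sketch converting an exported contacts CSV into vCard 3.0 entries. The `name`/`email` column names are assumptions; real platform exports have richer, messier schemas:

```python
import csv
import io

def csv_to_vcards(csv_text: str) -> str:
    """Convert a simple contacts CSV ('name' and 'email' columns
    assumed) into a series of vCard 3.0 entries.

    Illustrative only: a real migration would map many more fields
    and validate the output against the importing platform.
    """
    cards = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        cards.append("\n".join([
            "BEGIN:VCARD",
            "VERSION:3.0",
            f"FN:{row['name']}",
            f"EMAIL:{row['email']}",
            "END:VCARD",
        ]))
    return "\n".join(cards)
```

Converting to an open interchange format like vCard is exactly the kind of lock-in-avoiding step the scope above describes: once data is in a portable format, the destination platform stops mattering.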

Status

Under consideration. High practical value, especially for users migrating off Big Tech platforms or managing sensitive archives.


Digital Autonomy & Emerging Tech Assistant

Primary Step:

  • Step 10 – Join The Race and Escape

Potential secondary usage:

  • Step 8 / Step 9 (as optional exploration for advanced users)

Purpose

To help users explore and evaluate emerging technologies and alternative systems that increase digital autonomy beyond baseline “escape from Big Tech.”

Scope

  • Web3 & decentralised infrastructures:
    • What problems they actually solve vs marketing hype.
    • Decentralised identity (DIDs, wallets) and associated risks.
    • Federation vs full decentralisation: trade‑offs and threat models.
  • Alternative economic & coordination systems:
    • Mutual credit, co‑operatives, DAOs (with a harsh reality filter).
    • When and why these models increase autonomy vs introduce new dependencies.
  • Self‑hosting & community hosting:
    • When it makes sense for your threat model and technical capacity.
    • Shared infrastructure models that reduce Big Tech dependence without requiring everyone to be a sysadmin.
  • System‑level CARS:
    • How communities, organisations, or networks can build layered alternative systems that give members more autonomy.
    • Moving from individual escape to collective infrastructure.

Why an assistant, not just documentation?

  • The landscape moves fast; users need:
    • Current maps of what exists and what works.
    • Filtering through their Personal Risk Profile and capacity.
  • The assistant can push back on hype and help distinguish:
    • “Looks cool” vs “genuinely increases autonomy for you in your context.”

Relationship to existing assistants

  • Assumes the user has already:
    • Completed Steps 1–9 (mapped, secured, reduced footprint, adopted CARS).
  • Focuses on what comes next: building or joining better systems, not just escaping alone.

Status

Under consideration. Strong thematic fit for Step 10. Needs careful scoping to avoid:

  • Becoming a generic “tech trends” assistant.
  • Overpromising on immature or hyped technologies.

Note:
These assistants are not yet part of the core DFA assistant suite. They represent potential directions based on observed user needs and gaps in the current roadmap. Feedback from early coaching clients will determine which (if any) are developed and integrated.