Transparency

How AI Radar works

AI Radar is not a random link directory. The platform combines a relational data layer, a dedicated worker for enrichment and verification, and a curated admin workflow. This page explains the structure, scoring logic, and the role of each area.

System structure

Tools

Curated directory with summaries, category, relevance, link status, and, later, interaction data such as likes or ratings.

Models

Registry for relevant AI models including provider, status, context window, release date, and technical notes.

Model updates

Change log for new models, status changes, renames, capability changes, or deprecations.

Sources

Watchlist for YouTube channels, release notes, blogs, and directories that can produce signals or new candidates.

Submissions and discovery

Inputs from forms or discovery runs. They do not go live automatically and instead pass through review first.
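The entities above can be sketched as simple records. This is an illustrative sketch only; the field names are assumptions, not the actual PostgreSQL schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tool:
    # Illustrative fields mirroring the directory description above.
    name: str
    summary: str
    category: str
    relevance: int          # 0-100, see "Scoring and evaluation"
    link_status: str        # e.g. "healthy", "redirected"
    likes: int = 0          # interaction data, added later

@dataclass
class ModelEntry:
    # Illustrative fields mirroring the model registry description.
    name: str
    provider: str
    status: str             # e.g. "active", "deprecated"
    context_window: Optional[int] = None
    release_date: Optional[str] = None
    notes: str = ""
```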

From signal to page

1. Intake

Entries are created from seeds, curated sources, manual suggestions, and eventually automated discovery jobs.

2. Storage

PostgreSQL is the source of truth. Tools, models, updates, sources, discovery candidates, and submissions all live there.

3. Enrichment

The worker exposes the API, runs health checks, processes model updates, and feeds filtered data to the frontend pages.

4. Editorial review

Entries are reviewed, enriched, approved, or archived in the admin area. This is intentionally not just a raw database frontend but a curated review system.

5. Delivery

Homepage, tools, models, and radar all read from the same data source. That keeps rankings, filters, and detail pages consistent.
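The five stages can be sketched end to end. This is a minimal stand-in, not the platform's implementation: the in-memory Store replaces PostgreSQL, and the enrichment and review steps are stubbed.

```python
class Store:
    """In-memory stand-in for the PostgreSQL source of truth."""
    def __init__(self):
        self.rows = {}

    def save(self, entry):
        self.rows[entry["name"]] = dict(entry)

    def load(self, name):
        return self.rows[name]

def run_pipeline(candidate, store, approve):
    # 1. Intake: a candidate arrives from a seed, source, or submission.
    entry = {"name": candidate["name"], "status": "draft"}
    # 2. Storage: persist to the single source of truth.
    store.save(entry)
    # 3. Enrichment: the worker adds derived data such as link health.
    entry["link_status"] = "healthy"
    # 4. Editorial review: approve or archive; nothing goes live automatically.
    entry["status"] = "published" if approve(entry) else "archived"
    store.save(entry)
    # 5. Delivery: every surface reads the same stored record.
    return store.load(entry["name"])
```

Because delivery reads the record that review last saved, rankings, filters, and detail pages cannot drift apart.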

Scoring and evaluation

Relevance score

0 to 100

The relevance score orders tools in listings and on the homepage. It combines editorial weighting, sharpness of the summary, and the signal of whether an entry truly belongs in a production-focused AI radar.

  • Quality and clarity of the short summary
  • Appropriate category and sensible classification
  • Entry status, for example published instead of draft
  • Editorial prioritization for especially relevant tools
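A scoring function over these criteria could look like the sketch below. The weights and thresholds are illustrative assumptions, not the platform's actual formula.

```python
def relevance_score(entry):
    """Combine the listed criteria into a 0-100 score (weights assumed)."""
    score = 0
    # Quality and clarity of the short summary
    if entry.get("summary") and len(entry["summary"]) >= 40:
        score += 30
    # Appropriate category and sensible classification
    if entry.get("category"):
        score += 20
    # Entry status: published beats draft
    if entry.get("status") == "published":
        score += 30
    # Editorial prioritization for especially relevant tools
    if entry.get("editorial_pick"):
        score += 20
    return min(score, 100)
```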

Link health

technical status

Each tool link can be checked automatically. The result does not directly change the relevance score, but it is central to trust and maintenance quality.

  • healthy: target page responds normally
  • redirected: URL forwards to another destination
  • changed: content or title changed in a relevant way
  • timeout, forbidden, not_found, server_error: technical issues or blocked requests
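Mapping a raw HTTP check onto these statuses might look like this. The mapping is a sketch under assumed rules; detecting "changed" would require comparing fetched content against a stored snapshot and is omitted here.

```python
def classify_link(status_code, redirected=False, timed_out=False):
    """Map one HTTP check result onto the link-health statuses above."""
    if timed_out:
        return "timeout"
    if status_code == 403:
        return "forbidden"
    if status_code == 404:
        return "not_found"
    if status_code >= 500:
        return "server_error"
    if redirected:
        return "redirected"
    if 200 <= status_code < 300:
        return "healthy"
    # Anything else is treated as a technical issue.
    return "server_error"
```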

Trend signals

radar / popularity

The radar can weigh additional signals such as likes, freshness, and model updates so that movement in the market becomes visible alongside static data.

  • Likes and interactions
  • Publication or last-seen timestamp
  • Provider and release note changes
  • New or changed models in the registry
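One way to weigh such signals is sketched below; the weights and the 30-day freshness half-life are illustrative assumptions, not the radar's actual tuning.

```python
import time

def trend_score(entry, now=None):
    """Combine likes, freshness, and update activity into one number."""
    now = now if now is not None else time.time()
    # Likes and interactions count directly.
    score = entry.get("likes", 0) * 1.0
    # Freshness: exponential decay with an assumed 30-day half-life.
    age_days = (now - entry.get("last_seen", now)) / 86400
    score += 10.0 * 0.5 ** (age_days / 30)
    # Recent model or release-note changes add a bump per change.
    score += 5.0 * entry.get("recent_updates", 0)
    return score
```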

Important principles

An entry is not automatically important just because it is new. AI Radar prioritizes relevance, traceability, and maintenance quality over sheer volume.

Seeds and automated sources help with initialization and updates, but they do not replace editorial judgment. That is why statuses, health checks, update logs, and review surfaces exist.

The platform is built so frontend, admin, and worker all use the same data foundation. That reduces contradictions and keeps rankings, filters, and detail pages consistent.