Namazu 3.0 / Lagrangian Tensor Generating Logic

Namazu TGL
Single File Concept Demo

A one-page technical concept site for the next Namazu public build: sensor observation ingest, building-specific graph dynamics, and time-indexed tensor generation for floor-by-floor earthquake propagation analysis.
Model
TGL v1
Graph Nodes
4
Tensor Frames
300
Current Build Direction
API First + Visual Proof
FastAPI backend, canonical observation objects, building graph model, and a public-facing simulation layer.
Input
Ground + Floor Sensors
A first deployment can start with a single ground-floor (1F) or rooftop sensor, but the design supports multiple floors and building-specific response paths.
Output
Peak response pending
Summary metrics can later map to readable classes such as Good / Ordinary / Dangerous / Very Dangerous.
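A minimal sketch of that mapping, assuming a peak inter-story drift ratio as the summary metric. The metric choice and the cut points are illustrative placeholders, not calibrated engineering thresholds:

```python
def classify_peak_response(peak_drift_ratio: float) -> str:
    """Map a peak response metric to a readable class.

    Hypothetical cut points for illustration only; real thresholds
    would come from structure-type-specific calibration.
    """
    if peak_drift_ratio < 0.002:
        return "Good"
    if peak_drift_ratio < 0.005:
        return "Ordinary"
    if peak_drift_ratio < 0.010:
        return "Dangerous"
    return "Very Dangerous"
```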

How this demo is structured

This page is intentionally self-contained. It is not the final Namazu 3.0 production UI. It is a technical showcase page that can be used to align on architecture and story before the backend is finalized.

Observation ingest
Building graph
Lagrangian propagation
Tensor frames
FastAPI-ready

In this concept, each floor is treated as a node. Motion enters at the base, propagates upward through coupling edges, and a tensor-like time series is generated from displacement, velocity, and transfer intensity.
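The node-and-edge dynamics above can be sketched as a damped chain with one explicit-Euler time step. This is a toy integrator for illustration, using the mass, stiffness, damping, and coupling fields from the example building object; it is not the production Lagrangian solver:

```python
# Toy propagation step: ground acceleration drives floor 1; each higher
# floor is pulled by the coupling edge to the floor below. One explicit-
# Euler step updates velocity, then displacement, per node.
def step(disp, vel, masses, stiff, damp, coupling, ground_acc, dt=0.01):
    n = len(disp)
    acc = [0.0] * n
    for i in range(n):
        # Restoring force toward rest plus viscous damping.
        f = -stiff[i] * disp[i] - damp[i] * vel[i]
        if i == 0:
            # Base node: driven directly by ground motion.
            f += masses[i] * ground_acc
        else:
            # Upper nodes: driven through the coupling edge from below.
            f += coupling[i - 1] * (disp[i - 1] - disp[i])
        acc[i] = f / masses[i]
    new_vel = [v + a * dt for v, a in zip(vel, acc)]
    new_disp = [d + v * dt for d, v in zip(disp, new_vel)]
    return new_disp, new_vel
```

Sampling disp/vel (and the per-edge transfer term) at each step is one way to build the time-indexed tensor frames described here.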

[Lagrangian propagation visual: animated panel in the live page; sample readouts 1.8, 0.74, 0.08]

Suggested API surface

POST /v1/observations/push — Send sensor observations with timestamp, building_id, floor, sample_hz, and xyz samples.
POST /v1/buildings — Register a building graph, structure type, floors, and proxy physical parameters.
POST /v1/tgl/generate — Generate the first tensor result from stored observations and a selected building model.
GET /v1/tensors/{building_id} — Return tensor frames, summary metrics, and peak floor response values.
POST /v1/simulate/propagation — Developer endpoint for replay and visual testing before production deployment.
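A client-side sketch of assembling the /v1/observations/push body. Field names follow the canonical observation object; the fixed timestamp and the helper name are illustrative, and sample times are derived from sample_hz so the t spacing stays consistent:

```python
import json

def build_observation(event_id, building_id, sensor_id, floor, sample_hz, xyz):
    # Derive per-sample timestamps from the sampling rate.
    dt = 1.0 / sample_hz
    samples = [
        {"t": round(i * dt, 6), "ax": ax, "ay": ay, "az": az}
        for i, (ax, ay, az) in enumerate(xyz)
    ]
    return {
        "event_id": event_id,
        "timestamp": "2026-04-19T09:00:00Z",  # placeholder; use real UTC time
        "building_id": building_id,
        "sensor_id": sensor_id,
        "floor": floor,
        "source_type": "iot",
        "sample_hz": sample_hz,
        "samples": samples,
        "quality": 1.0,
    }

body = build_observation(
    "evt_001", "bld_tokyo_demo_001", "sensor_1f_base", 1, 100,
    [(0.02, -0.01, 0.99), (0.04, -0.01, 0.98)],
)
payload = json.dumps(body)  # ready to POST to /v1/observations/push
```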

Example objects

Building graph registration (POST /v1/buildings):
{
  "building_id": "bld_tokyo_demo_001",
  "name": "Tokyo Demo Building",
  "structure_type": "wood",
  "floors": 4,
  "nodes": [
    {"id": "f1", "floor": 1, "mass": 1.00, "stiffness": 1.30, "damping": 0.08},
    {"id": "f2", "floor": 2, "mass": 0.95, "stiffness": 1.15, "damping": 0.08},
    {"id": "f3", "floor": 3, "mass": 0.90, "stiffness": 0.98, "damping": 0.07},
    {"id": "f4", "floor": 4, "mass": 0.85, "stiffness": 0.86, "damping": 0.07}
  ],
  "edges": [
    {"from": "f1", "to": "f2", "coupling": 0.74},
    {"from": "f2", "to": "f3", "coupling": 0.72},
    {"from": "f3", "to": "f4", "coupling": 0.70}
  ]
}
Observation event (POST /v1/observations/push):
{
  "event_id": "evt_001",
  "timestamp": "2026-04-19T09:00:00Z",
  "building_id": "bld_tokyo_demo_001",
  "sensor_id": "sensor_1f_base",
  "floor": 1,
  "source_type": "iot",
  "sample_hz": 100,
  "samples": [
    {"t": 0.00, "ax": 0.02, "ay": -0.01, "az": 0.99},
    {"t": 0.01, "ax": 0.04, "ay": -0.01, "az": 0.98}
  ],
  "quality": 0.96
}
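One thing the edge list above supports is walking the chain from the base node upward. The sketch below multiplies edge couplings into a crude base-to-roof transfer proxy; the product-of-couplings interpretation is illustrative only, not the TGL model:

```python
def base_to_roof_coupling(building):
    # Index edges by their source node for a simple upward walk.
    by_from = {e["from"]: e for e in building["edges"]}
    # Start at the lowest floor's node.
    node = min(building["nodes"], key=lambda n: n["floor"])["id"]
    product = 1.0
    while node in by_from:
        edge = by_from[node]
        product *= edge["coupling"]
        node = edge["to"]
    return product
```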

FastAPI starter shape

from fastapi import FastAPI
from pydantic import BaseModel
from typing import List

app = FastAPI(title="Namazu TGL API", version="0.1.0")

class AxisSample(BaseModel):
    # One tri-axial acceleration sample at time t (seconds from event start).
    t: float
    ax: float
    ay: float
    az: float

class ObservationEvent(BaseModel):
    # Canonical observation object pushed by a sensor.
    event_id: str
    timestamp: str  # ISO 8601, e.g. "2026-04-19T09:00:00Z"
    building_id: str
    sensor_id: str
    floor: int
    sample_hz: int
    samples: List[AxisSample]
    quality: float = 1.0  # 0.0-1.0 sensor confidence

@app.get("/v1/health")
def health():
    return {"status": "ok", "service": "namazu-tgl-api"}

@app.post("/v1/observations/push")
def push_observation(event: ObservationEvent):
    # Demo stub: acknowledge receipt; persistence comes later.
    return {"accepted": True, "event_id": event.event_id}

@app.post("/v1/tgl/generate")
def generate_tgl(payload: dict):
    # Demo stub: queue a tensor-generation job for the selected building.
    return {
        "job_id": "tgl_demo_001",
        "building_id": payload.get("building_id"),
        "mode": "lagrangian_v1",
        "status": "queued"
    }