12 Commits

Author SHA1 Message Date
73fe38e3b7 chore(compat): land closeout — pinned cosign + JSON-only diff + minrows floor
All checks were successful
addon-qualify / qualify (push) Successful in 12s
2026-05-10 17:44:14 +03:00
72a1c1a94c chore(compat): refresh cold-start seed (sha256:a4a88515128708329a00e506761cbdc6513d4a0442f7e6555fd98625433668ba) 2026-05-10 14:06:47 +00:00
compat-seeder
adfdf38fb0 fix(compat): commit email must match git_admin's Gitea record (gitea@local.domain)
All checks were successful
addon-qualify / qualify (push) Successful in 12s
2026-05-10 17:05:11 +03:00
compat-seeder
32556761c1 fix(compat): commit as git_admin (only Gitea-known user passes pre-receive hook)
All checks were successful
addon-qualify / qualify (push) Successful in 11s
2026-05-10 17:02:48 +03:00
compat-seeder
ed0e835863 feat(compat): sign seeded-ci.json with cosign (Phase 4.1)
All checks were successful
addon-qualify / qualify (push) Successful in 12s
Adds a cosign install + sign-blob step before commit. The detached
.sig (a base64-encoded ASN.1 DER ECDSA signature over SHA256(file)) is
committed alongside seeded-ci.json. Tower's loader verifies it in pure
Go before replay; a mismatched or missing sig means refuse + log.

cosign.pub is also checked in so the workflow can self-verify before
push (catches key-rotation mismatch early). The same pubkey is
embedded in Tower's binary at compat_bootstrap_pubkey.pem; both
copies must match or replay will fail.
2026-05-10 16:59:39 +03:00
compat-seeder
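The sign/verify round trip this message describes can be sketched outside the pipeline. A minimal Python sketch, using the third-party `cryptography` package and a throwaway P-256 key (cosign's default curve); the real verifier is Tower's Go loader, not this code:

```python
import base64

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Throwaway keypair standing in for the COMPAT_SIGNING_KEY / cosign.pub pair.
priv = ec.generate_private_key(ec.SECP256R1())
pub = priv.public_key()

seed = b'{"rows":[],"schemaVersion":1}'  # stands in for seeded-ci.json bytes

# cosign sign-blob emits base64(ASN.1 DER ECDSA over SHA256(file)).
der_sig = priv.sign(seed, ec.ECDSA(hashes.SHA256()))
sig_file = base64.b64encode(der_sig)     # contents of seeded-ci.json.sig

# Loader side: decode, verify, refuse on any mismatch.
try:
    pub.verify(base64.b64decode(sig_file), seed, ec.ECDSA(hashes.SHA256()))
    verdict = 'replay allowed'
except InvalidSignature:
    verdict = 'refuse + log'
print(verdict)  # replay allowed
```

Flipping a single byte of `seed` after signing makes `verify` raise `InvalidSignature`, which is the "mismatched sig → refuse + log" path.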
2dfe7be06c chore(compat): refresh cold-start seed (sha256:a4a88515128708329a00e506761cbdc6513d4a0442f7e6555fd98625433668ba) 2026-05-09 21:17:00 +00:00
compat-seeder
d32422c5e2 fix(compat): stage before diff in commit step (untracked-file blind spot)
All checks were successful
addon-qualify / qualify (push) Successful in 12s
2026-05-10 00:16:31 +03:00
compat-seeder
2f7fd6385d fix(compat): rename secret to COMPAT_PUSH_TOKEN (GITEA_* prefix is reserved)
All checks were successful
addon-qualify / qualify (push) Successful in 15s
2026-05-10 00:15:09 +03:00
compat-seeder
820ee83c09 feat(compat): seed-compat workflow + emitter (Phase 4)
All checks were successful
addon-qualify / qualify (push) Successful in 10s
Wires the nightly cold-start seeder. The Gitea Action runs
qualify-addon.py against every addon on each version branch (18.0 +
19.0), emits a canonical JSON snapshot to compat-bootstrap/seeded-ci.json,
and commits only when content changed. Tower's CompatSeedLoader fetches
this file at startup and every 24h, and replays unseen stampIds into the
matrix.

Decisions: Git-as-bus over HTTP endpoint, static lint over real install,
content-hash stampId for byte-stability across runs.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-10 00:14:14 +03:00
OdooSky v3
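The content-hash stampId mentioned above is what makes the nightly runs byte-stable. A minimal sketch of the idea (assumed shape only; the real emitter is emit-compat-bootstrap.py, not shown here):

```python
import hashlib
import json

def stamp_id(rows: list[dict]) -> str:
    # Canonical serialisation: sorted keys, no whitespace. Any run that
    # produces the same logical rows yields the same bytes, hence the
    # same stamp and no new commit.
    canonical = json.dumps(rows, sort_keys=True, separators=(',', ':'))
    return 'sha256:' + hashlib.sha256(canonical.encode('utf-8')).hexdigest()

a = stamp_id([{'addonCode': 'x', 'qualified': True}])
b = stamp_id([{'qualified': True, 'addonCode': 'x'}])  # key order differs
print(a == b)  # True — key order does not change the stamp
```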
906e5ebd6d sync(branch 18.0): merge tag fixes for tk_construction_management + ks_dashboard_ninja
All checks were successful
addon-qualify / qualify (push) Successful in 13s
The Pillar 1 CI gate flagged drift between tag state and branch HEAD
on these two addons. Today (2026-05-09) we force-tagged fixes for:

- tk_construction_management/18.0.2.0.8: add name= to <app>, chatter
  migration to <chatter/>, chart NaN guard, scope .o_action_manager
  CSS rule, remove dasdsa debug logs.
- ks_dashboard_ninja/18.0.1.1.7: rename webpackChunk_am5 to
  webpackChunk_am5_ksdn so it does not collide with synconics_bi_dashboard.

Replicating the same content on the 18.0 branch HEAD so future pushes
do not silently revert these fixes.
2026-05-09 13:54:49 +02:00
OdooSky v3
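The chunk-array collision fixed above can be shown in miniature. A Python stand-in for the browser global (illustrative only; the real code is the minified webpack runtime in each addon's static/lib/):

```python
# Simulate the browser's `self` object holding webpack chunk arrays.
window = {}

def load_bundle(chunk_global: str, modules):
    # Each bundle does the JS equivalent of:
    #   (self.X = self.X || []).push(modules)
    window.setdefault(chunk_global, []).append(modules)

# Before the fix: both addons use webpackChunk_am5, so they share one
# array and whichever webpack runtime initialised it consumes both
# addons' chunks — module ids can shadow each other.
load_bundle('webpackChunk_am5', 'ks_dashboard_ninja chunks')
load_bundle('webpackChunk_am5', 'synconics_bi_dashboard chunks')
print(len(window['webpackChunk_am5']))  # 2 — entangled in one array

# After the fix: a namespaced array, no cross-addon interference.
load_bundle('webpackChunk_am5_ksdn', 'ks_dashboard_ninja chunks')
print(len(window['webpackChunk_am5_ksdn']))  # 1
```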
a103d8129b ci: vendor qualify-addon.py (Pillar 1 self-contained)
All checks were successful
addon-qualify / qualify (push) Successful in 13s
2026-05-09 13:38:33 +02:00
OdooSky v3
13a0f0faa1 ci: addon qualification gate (Pillar 1)
Some checks failed
addon-qualify / qualify (push) Failing after 53s
2026-05-09 13:33:39 +02:00
32 changed files with 1091 additions and 78 deletions

.gitea/qualify-addon.py (new file, 393 lines)

@@ -0,0 +1,393 @@
#!/usr/bin/env python3
"""
qualify-addon.py — Pillar 1 of the addon qualification gate.

Static checks against an Odoo addon source tree:

  manifest      __manifest__.py parses, has 'name', 'version' starts with '<digit>.0.'
  pip-deps      every non-stdlib import is declared in external_dependencies['python']
  app-name      every <app> element in any view XML has a name= attribute
  menu-icon     top-level <menuitem web_icon=> is set OR the addon ships
                static/description/icon.png
  hoot-import   no JS file under static/src/ imports from '@odoo/hoot' or '@odoo/hoot-dom'
  webpack-name  no JS file under static/lib/ uses self.webpackChunk_<unprefixed-name> —
                chunk array names must be addon-namespaced (e.g. webpackChunk_am5_<addon>)

Usage:
  python3 scripts/qualify-addon.py <addon-dir> [<addon-dir> ...]
  python3 scripts/qualify-addon.py --json <addon-dir>

Exit codes:
  0  all checks passed for every addon
  1  at least one addon failed at least one check
  2  bad usage / I/O error

Each finding is (severity, check, message). Severity:
  ERROR — the addon is broken-by-construction; refuse to admit to catalog
  WARN  — likely problem but could be intentional; admit-with-warning posture
"""
from __future__ import annotations
import ast
import json
import re
import sys
from dataclasses import dataclass, asdict
from pathlib import Path
from xml.etree import ElementTree as ET
# Python stdlib modules. Conservative — anything imported NOT in here AND not in
# ODOO_BUILTINS is flagged as needing declaration. Better to false-positive (fixable
# by adding to external_dependencies) than miss a real missing dep.
STDLIB = frozenset({
'abc', 'argparse', 'ast', 'asyncio', 'base64', 'binascii', 'bisect', 'calendar',
'collections', 'configparser', 'contextlib', 'contextvars', 'copy', 'csv', 'ctypes',
'dataclasses', 'datetime', 'decimal', 'difflib', 'dis', 'email', 'enum', 'errno',
'fcntl', 'fnmatch', 'functools', 'gc', 'getpass', 'gettext', 'glob', 'gzip', 'hashlib',
'heapq', 'hmac', 'html', 'http', 'imaplib', 'importlib', 'inspect', 'io', 'ipaddress',
'itertools', 'json', 'keyword', 'locale', 'logging', 'math', 'mimetypes',
'multiprocessing', 'numbers', 'operator', 'os', 'pathlib', 'pickle', 'pkgutil',
'platform', 'pprint', 'queue', 'random', 're', 'select', 'selectors', 'shlex',
'shutil', 'signal', 'smtplib', 'socket', 'sqlite3', 'ssl', 'stat', 'string', 'struct',
'subprocess', 'sys', 'tempfile', 'textwrap', 'threading', 'time', 'timeit', 'token',
'tokenize', 'traceback', 'types', 'typing', 'unicodedata', 'unittest', 'urllib',
'uuid', 'warnings', 'weakref', 'xml', 'xmlrpc', 'zipfile', 'zlib', 'zoneinfo',
'__future__',
})
# Modules shipped by Odoo's base image. Never need declaration.
ODOO_BUILTINS = frozenset({
'odoo', 'psycopg2', 'lxml', 'PIL', 'requests', 'dateutil', 'pytz', 'passlib',
'werkzeug', 'jinja2', 'markupsafe', 'docutils', 'reportlab', 'babel', 'xlsxwriter',
'xlrd', 'xlwt', 'qrcode', 'vobject', 'polib', 'PyPDF2', 'cryptography', 'pyOpenSSL',
'OpenSSL', 'suds', 'num2words', 'pyldap', 'ldap', 'xmltodict', 'zeep', 'gevent',
'greenlet', 'libsass', 'idna', 'pyusb', 'serial', 'mock', 'freezegun',
'phonenumbers',
})
# Bare-name webpackChunk arrays we know collide. Detector flags any
# `self.webpackChunk_<name>` where <name> doesn't have an addon-derived suffix.
# We allow the canonical chunk array names listed here ONLY if the addon's
# directory name matches — i.e. we suggest namespacing.
WEBPACK_CHUNK_RE = re.compile(r'self\.(webpackChunk[a-zA-Z0-9_]+)\b')
# JS imports of the hoot test framework that should never appear in production code.
HOOT_IMPORT_RE = re.compile(
r'''(?:from\s+['"]@odoo/hoot[a-z\-]*['"]|require\s*\(\s*['"]@odoo/hoot[a-z\-]*['"]\s*\))'''
)
@dataclass
class Finding:
    severity: str            # 'ERROR' | 'WARN'
    check: str               # short check id
    message: str             # human-readable
    file: str | None = None  # relative path, if applicable
    line: int | None = None  # 1-indexed, if applicable
# ---------------------------------------------------------------------------- #
# Check 1 — manifest parses + has required keys
# ---------------------------------------------------------------------------- #
def check_manifest(addon_dir: Path) -> tuple[list[Finding], dict | None]:
    findings: list[Finding] = []
    mf_path = addon_dir / '__manifest__.py'
    if not mf_path.exists():
        findings.append(Finding('ERROR', 'manifest', 'no __manifest__.py'))
        return findings, None
    try:
        manifest = ast.literal_eval(mf_path.read_text())
    except (SyntaxError, ValueError) as e:
        findings.append(Finding('ERROR', 'manifest',
                                f'__manifest__.py does not parse as Python literal: {e}',
                                file='__manifest__.py'))
        return findings, None
    if not isinstance(manifest, dict):
        findings.append(Finding('ERROR', 'manifest',
                                '__manifest__.py top-level is not a dict',
                                file='__manifest__.py'))
        return findings, None
    if not manifest.get('name'):
        findings.append(Finding('ERROR', 'manifest',
                                "missing 'name' key (Odoo refuses install)",
                                file='__manifest__.py'))
    version = manifest.get('version', '')
    if not re.match(r'^\d+\.0\.\d+\.\d+\.\d+$', version):
        findings.append(Finding('WARN', 'manifest',
                                f"version {version!r} is not in '<odoo_major>.0.x.y.z' form — "
                                "Odoo will prepend the running Odoo major and may refuse install "
                                "on a different major (incident #9)",
                                file='__manifest__.py'))
    return findings, manifest
# ---------------------------------------------------------------------------- #
# Check 2 — pip deps: every non-stdlib import is in external_dependencies
# ---------------------------------------------------------------------------- #
def check_pip_deps(addon_dir: Path, manifest: dict) -> list[Finding]:
    findings: list[Finding] = []
    declared = set(manifest.get('external_dependencies', {}).get('python', []))
    addon_name = addon_dir.name
    # Pre-scan: collect this addon's submodule names so we don't flag intra-addon imports.
    own_submodules = {p.stem for p in addon_dir.rglob('*.py') if p.stem != '__init__'}
    own_submodules.add(addon_name)
    seen_imports: set[tuple[str, str, int]] = set()  # (toplevel, file, line)
    for py_file in addon_dir.rglob('*.py'):
        if any(part.startswith('.') for part in py_file.parts):
            continue
        try:
            tree = ast.parse(py_file.read_text())
        except (SyntaxError, UnicodeDecodeError):
            continue
        rel = py_file.relative_to(addon_dir).as_posix()
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                for alias in node.names:
                    seen_imports.add((alias.name.split('.')[0], rel, node.lineno))
            elif isinstance(node, ast.ImportFrom):
                if node.level:  # relative import — intra-addon, skip
                    continue
                if node.module:
                    seen_imports.add((node.module.split('.')[0], rel, node.lineno))
    for top, rel, lineno in sorted(seen_imports):
        if top in STDLIB or top in ODOO_BUILTINS or top in own_submodules:
            continue
        if top in declared:
            continue
        # PEP 8 names that are clearly local helpers (e.g. utils, models) — skip if
        # they look like a sibling module we missed in own_submodules.
        if (addon_dir / top).is_dir() or (addon_dir / f'{top}.py').exists():
            continue
        findings.append(Finding(
            'ERROR', 'pip-deps',
            f"imports '{top}' but it is not in external_dependencies['python'] "
            "(install will fail with ModuleNotFoundError — incident #5)",
            file=rel, line=lineno,
        ))
    return findings
# ---------------------------------------------------------------------------- #
# Check 3 — every <app> element has a name= attribute
# ---------------------------------------------------------------------------- #
def check_app_name(addon_dir: Path) -> list[Finding]:
    findings: list[Finding] = []
    # XML files in views/ + data/ may contain res_config_settings <app> elements.
    for xml_file in list(addon_dir.rglob('views/*.xml')) + list(addon_dir.rglob('data/*.xml')):
        try:
            text = xml_file.read_text()
        except UnicodeDecodeError:
            continue
        rel = xml_file.relative_to(addon_dir).as_posix()
        # Multi-line tolerant regex: <app ... > with everything between.
        for m in re.finditer(r'<app\b[^>]*?>', text, re.DOTALL):
            tag = m.group()
            if 'name=' in tag:
                continue
            line = text[:m.start()].count('\n') + 1
            findings.append(Finding(
                'ERROR', 'app-name',
                "<app> element missing name= attribute. Odoo 18 SettingsFormCompiler "
                "calls toStringExpression(null) and crashes the entire Settings page "
                "(incident #7)",
                file=rel, line=line,
            ))
    return findings
# ---------------------------------------------------------------------------- #
# Check 4 — top-level menus declare web_icon OR addon ships static/description/icon.png
# ---------------------------------------------------------------------------- #
def check_menu_icon(addon_dir: Path) -> list[Finding]:
    findings: list[Finding] = []
    has_default_icon = (addon_dir / 'static' / 'description' / 'icon.png').exists()
    for xml_file in addon_dir.rglob('*.xml'):
        try:
            text = xml_file.read_text()
        except UnicodeDecodeError:
            continue
        rel = xml_file.relative_to(addon_dir).as_posix()
        # Find <menuitem ... > whose XML has no parent= attribute (top-level menu).
        for m in re.finditer(r'<menuitem\b[^>]*?/?>', text, re.DOTALL):
            tag = m.group()
            if 'parent=' in tag:
                continue
            if 'web_icon=' in tag:
                continue
            line = text[:m.start()].count('\n') + 1
            if has_default_icon:
                # Odoo 18's auto-fallback path. Soft warning since it works for top-level
                # menus that get web_icon auto-populated from the module's icon.png.
                # But our incident #6 showed even with icon.png present, web_icon often
                # ends up empty in DB. So WARN, not ERROR.
                findings.append(Finding(
                    'WARN', 'menu-icon',
                    "top-level <menuitem> has no web_icon=. Will fall back to "
                    "static/description/icon.png IF Odoo's auto-populate fires; "
                    "if not, menu shows blank (incident #6). Set web_icon explicitly.",
                    file=rel, line=line,
                ))
            else:
                findings.append(Finding(
                    'ERROR', 'menu-icon',
                    "top-level <menuitem> has no web_icon= AND addon ships no "
                    "static/description/icon.png — menu will render blank.",
                    file=rel, line=line,
                ))
    return findings
# ---------------------------------------------------------------------------- #
# Check 5 — no @odoo/hoot* imports in static/src/
# ---------------------------------------------------------------------------- #
def check_hoot_import(addon_dir: Path) -> list[Finding]:
    findings: list[Finding] = []
    src_dir = addon_dir / 'static' / 'src'
    if not src_dir.exists():
        return findings
    for js_file in src_dir.rglob('*.js'):
        try:
            text = js_file.read_text()
        except UnicodeDecodeError:
            continue
        rel = js_file.relative_to(addon_dir).as_posix()
        for m in HOOT_IMPORT_RE.finditer(text):
            line = text[:m.start()].count('\n') + 1
            findings.append(Finding(
                'ERROR', 'hoot-import',
                "imports from @odoo/hoot* in production code (static/src/). "
                "@odoo/hoot is the test framework; the production bundle does not "
                "register it. Page will white-screen (incident #3 class)",
                file=rel, line=line,
            ))
    return findings
# ---------------------------------------------------------------------------- #
# Check 6 — webpack chunk arrays in static/lib/ must be addon-namespaced
# ---------------------------------------------------------------------------- #
def check_webpack_chunk(addon_dir: Path) -> list[Finding]:
    findings: list[Finding] = []
    lib_dir = addon_dir / 'static' / 'lib'
    if not lib_dir.exists():
        return findings
    addon_name = addon_dir.name
    seen: set[str] = set()
    for js_file in lib_dir.rglob('*.js'):
        try:
            text = js_file.read_text()
        except UnicodeDecodeError:
            continue
        rel = js_file.relative_to(addon_dir).as_posix()
        for m in WEBPACK_CHUNK_RE.finditer(text):
            chunk_name = m.group(1)
            if chunk_name in seen:
                continue
            seen.add(chunk_name)
            # Acceptable if chunk name contains: full addon name OR any 4+ char
            # sub-token of the addon name (e.g. 'ksdn' for 'ks_dashboard_ninja')
            # OR a known-namespaced suffix (anything past the standard library
            # prefix). We just need confidence the chunk array is unique-per-addon.
            addon_lower = addon_name.lower()
            chunk_lower = chunk_name.lower()
            tokens = [addon_lower.replace('_', '')] + [
                t for t in addon_lower.split('_') if len(t) >= 4
            ]
            # Also accept any short 4+ char abbrev derived from initials of
            # underscore-separated parts (ks_dashboard_ninja -> ksdn)
            initials = ''.join(t[0] for t in addon_lower.split('_') if t)
            if len(initials) >= 3:
                tokens.append(initials)
            if any(t in chunk_lower for t in tokens):
                continue
            line = text[:m.start()].count('\n') + 1
            findings.append(Finding(
                'ERROR', 'webpack-chunk',
                f"uses bare webpack chunk array '{chunk_name}'. Two addons that ship "
                f"the same library (e.g. amCharts) collide on this global → bundle "
                f"execution aborts (incident #4). Rename to '{chunk_name}_{addon_name}' "
                "or similar.",
                file=rel, line=line,
            ))
    return findings
# ---------------------------------------------------------------------------- #
# Runner
# ---------------------------------------------------------------------------- #
def qualify_addon(addon_dir: Path) -> dict:
    findings: list[Finding] = []
    manifest_findings, manifest = check_manifest(addon_dir)
    findings.extend(manifest_findings)
    if manifest is not None:
        findings.extend(check_pip_deps(addon_dir, manifest))
    findings.extend(check_app_name(addon_dir))
    findings.extend(check_menu_icon(addon_dir))
    findings.extend(check_hoot_import(addon_dir))
    findings.extend(check_webpack_chunk(addon_dir))
    errors = sum(1 for f in findings if f.severity == 'ERROR')
    warns = sum(1 for f in findings if f.severity == 'WARN')
    return {
        'addon': addon_dir.name,
        'path': str(addon_dir),
        'qualified': errors == 0,
        'errors': errors,
        'warns': warns,
        'findings': [asdict(f) for f in findings],
    }
def main(argv: list[str]) -> int:
    json_out = False
    args: list[str] = []
    for a in argv[1:]:
        if a == '--json':
            json_out = True
        elif a in ('-h', '--help'):
            print(__doc__)
            return 0
        else:
            args.append(a)
    if not args:
        print(__doc__, file=sys.stderr)
        return 2
    results = []
    for path_str in args:
        path = Path(path_str).resolve()
        if not path.is_dir() or not (path / '__manifest__.py').exists():
            print(f'ERROR: {path} is not an Odoo addon directory '
                  '(missing __manifest__.py)', file=sys.stderr)
            return 2
        results.append(qualify_addon(path))
    if json_out:
        print(json.dumps(results, indent=2))
    else:
        for r in results:
            badge = '\033[32mQUALIFIED\033[0m' if r['qualified'] else '\033[31mFAILED\033[0m'
            print(f"\n{badge} {r['addon']} ({r['errors']} error(s), {r['warns']} warning(s))")
            if not r['findings']:
                continue
            for f in r['findings']:
                tag = '\033[31m' if f['severity'] == 'ERROR' else '\033[33m'
                loc = ''
                if f['file']:
                    loc = f" [{f['file']}" + (f":{f['line']}" if f['line'] else '') + ']'
                print(f"  {tag}{f['severity']:5}\033[0m {f['check']:<14} {f['message']}{loc}")
    any_failed = any(not r['qualified'] for r in results)
    return 1 if any_failed else 0
if __name__ == '__main__':
    sys.exit(main(sys.argv))
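The manifest check above can be exercised standalone; a minimal sketch that mirrors just check_manifest's parse + version rules (re-implemented here for illustration, not imported from the script):

```python
import ast
import re

def manifest_findings(mf_text: str) -> list[str]:
    # Mirrors check_manifest: literal-eval the manifest, then require a
    # 'name' key and an '<odoo_major>.0.x.y.z' version string.
    try:
        manifest = ast.literal_eval(mf_text)
    except (SyntaxError, ValueError) as e:
        return [f'ERROR: does not parse as Python literal: {e}']
    if not isinstance(manifest, dict):
        return ['ERROR: top-level is not a dict']
    problems = []
    if not manifest.get('name'):
        problems.append("ERROR: missing 'name'")
    if not re.match(r'^\d+\.0\.\d+\.\d+\.\d+$', manifest.get('version', '')):
        problems.append('WARN: version not in <major>.0.x.y.z form')
    return problems

good = "{'name': 'Demo', 'version': '18.0.1.0.0'}"
bad = "{'version': '1.2'}"
print(manifest_findings(good))  # []
print(manifest_findings(bad))
```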


@@ -0,0 +1,40 @@
# Pillar 1 of the addon-qualification proposal — runs on every push to any
# branch and on every PR. Runs the vendored qualify-addon.py against every
# addon directory in this repo.
#
# admit-with-warning posture: lint findings are reported but do NOT fail
# the build (matches Pillar 3 informed-consent posture).
#
# To update the qualifier itself, edit scripts/qualify-addon.py in
# odoo-tower/odooskyv3 then sync it here.
name: addon-qualify

on:
  push:
  pull_request:
  workflow_dispatch:

jobs:
  qualify:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout addons repo
        uses: actions/checkout@v4

      - name: Run qualifier on every addon
        run: |
          set +e
          ADDONS=()
          for d in addons/*/; do
            [ -f "$d/__manifest__.py" ] && ADDONS+=("${d%/}")
          done
          if [ ${#ADDONS[@]} -eq 0 ]; then
            echo "No addons under addons/ — nothing to qualify"
            exit 0
          fi
          echo "Qualifying ${#ADDONS[@]} addons..."
          python3 .gitea/qualify-addon.py "${ADDONS[@]}"
          QUAL_RC=$?
          echo
          echo "::notice ::qualifier exit code $QUAL_RC (admit-with-warning — not failing build)"
          exit 0


@@ -0,0 +1,214 @@
# Pillar 2 / Phase 4 — Cold-start compat-matrix seed.
#
# This file is the SOURCE-OF-TRUTH copy. The deployed copy lives in
# the Gitea repo at:
#
# odoo-tower/odoo-addons (branch `compat-bootstrap`)
# └── .gitea/workflows/seed-compat.yml
#
# Why a dedicated branch instead of `main`/`18.0`/`19.0`: the addon
# code lives on per-Odoo-major branches. The cold-start snapshot is
# orthogonal data — keeping it on its own branch lets Tower fetch
# from one stable place and lets the cron commit without touching
# addon source.
#
# Schedule: daily 03:00 UTC. The emitter computes a content-hash
# stampId, so identical results across nights produce no commit
# (git diff is empty → push is skipped). When the catalog drifts
# (a new addon, a fix, a regression), one commit lands and Tower's
# next 24h tick replays it.
#
# Deployment (one-shot, manual until we mass-publish workflows):
#
# git -C /path/to/odoo-addons checkout -b compat-bootstrap origin/18.0
# mkdir -p .gitea/workflows compat-bootstrap
# cp infrastructure/gitea-actions/workflows/seed-compat.yml \
# /path/to/odoo-addons/.gitea/workflows/
# cp scripts/qualify-addon.py scripts/emit-compat-bootstrap.py \
# /path/to/odoo-addons/scripts/
# git -C /path/to/odoo-addons add .gitea scripts compat-bootstrap
# git -C /path/to/odoo-addons commit -m "feat(compat): seed bootstrap workflow"
# git -C /path/to/odoo-addons push -u origin compat-bootstrap
name: Cold-start compat seed

on:
  schedule:
    - cron: '0 3 * * *'   # nightly 03:00 UTC
  workflow_dispatch: {}   # manual trigger from Gitea UI

jobs:
  seed:
    runs-on: ubuntu-latest
    # LOW — bound the worst-case runtime so a hung git fetch can't
    # block the next nightly tick from starting.
    timeout-minutes: 30
    steps:
      - name: Checkout compat-bootstrap branch
        uses: actions/checkout@v4
        with:
          ref: compat-bootstrap
          fetch-depth: 1

      - name: Materialise per-major addon trees
        run: |
          set -euo pipefail
          # Detached worktrees keep each version branch's tree independent
          # so the qualifier sees a clean addon root with no cross-branch
          # contamination. Cleaned up in the final step.
          for major in 18.0 19.0; do
            git fetch --depth=1 origin "$major"
            git worktree add --detach "addons-$major" "origin/$major"
            ls -1 "addons-$major/addons" 2>/dev/null | head -5 || true
          done

      - name: Run qualifier + emit bootstrap snapshot per major
        run: |
          set -euo pipefail
          mkdir -p compat-bootstrap/per-major
          for major in 18.0 19.0; do
            # qualify-addon.py is vendored into each version branch under
            # .gitea/qualify-addon.py by Pillar 1; reuse that copy so static-
            # lint logic stays in lockstep with what addon-qualify.yml runs
            # on push.
            python3 scripts/emit-compat-bootstrap.py \
              --addons-root "addons-$major/addons" \
              --qualifier "addons-$major/.gitea/qualify-addon.py" \
              --output "compat-bootstrap/per-major/$major.json" \
              --pg-by-major '{"18.0":"16","19.0":"17"}' \
              --source "qualify-addon-v1" \
              --min-rows 5
          done

      - name: Merge per-major snapshots into seeded-ci.json
        run: |
          set -euo pipefail
          python3 - <<'PY'
          import hashlib, json, pathlib

          rows = []
          for p in sorted(pathlib.Path('compat-bootstrap/per-major').glob('*.json')):
              rows.extend(json.loads(p.read_text())['rows'])
          rows.sort(key=lambda r: (r['addonCode'], r['addonVersion'],
                                   r['odooMajor'], r['postgresMajor']))
          canonical = json.dumps(rows, sort_keys=True, separators=(',', ':'))
          stamp = 'sha256:' + hashlib.sha256(canonical.encode('utf-8')).hexdigest()
          out = {
              'stampId': stamp,
              'schemaVersion': 1,
              'source': 'qualify-addon-v1',
              'rows': rows,
          }
          pathlib.Path('compat-bootstrap/seeded-ci.json').write_text(
              json.dumps(out, indent=2) + '\n')
          print(f'merged {len(rows)} rows; stamp={stamp}')
          PY

      - name: Install cosign
        run: |
          set -euo pipefail
          # H3 — pin the v2.4.1 cosign-linux-amd64 SHA256 and verify
          # before chmod+exec. Without this, a MITM (or a compromised
          # GitHub edge cache) can substitute a binary that exfiltrates
          # COSIGN_PRIVATE_KEY on the next sign-blob call.
          # Cross-checked against
          #   https://github.com/sigstore/cosign/releases/download/v2.4.1/cosign_checksums.txt
          # on 2026-05-10.
          COSIGN_SHA256=8b24b946dd5809c6bd93de08033bcf6bc0ed7d336b7785787c080f574b89249b
          if ! command -v cosign >/dev/null 2>&1; then
            curl -sSL -o /tmp/cosign \
              https://github.com/sigstore/cosign/releases/download/v2.4.1/cosign-linux-amd64
            # sha256sum -c requires the two-space "HASH  FILE" checksum-line format.
            echo "$COSIGN_SHA256  /tmp/cosign" | sha256sum -c -
            mv /tmp/cosign /usr/local/bin/cosign
            chmod +x /usr/local/bin/cosign
          fi
          cosign version | head -3

      - name: Sign seeded-ci.json (Phase 4.1)
        env:
          COSIGN_PRIVATE_KEY: ${{ secrets.COMPAT_SIGNING_KEY }}
          COSIGN_PASSWORD: ${{ secrets.COMPAT_SIGNING_PASSWORD }}
        run: |
          set -euo pipefail
          if [ ! -f compat-bootstrap/seeded-ci.json ]; then
            echo "ERROR: seeded-ci.json was not produced; nothing to sign"
            exit 1
          fi
          # cosign sign-blob with --key env://VAR reads the private key
          # PEM from the named env var; this keeps the raw key out of
          # process args (where it'd show up in `ps`).
          cosign sign-blob --yes \
            --key env://COSIGN_PRIVATE_KEY \
            --output-signature compat-bootstrap/seeded-ci.json.sig \
            compat-bootstrap/seeded-ci.json
          # H4 — Self-verify is a key-rotation TRIPWIRE, NOT a trust
          # anchor. The on-disk cosign.pub lives in the same branch we
          # push to; anyone holding COMPAT_PUSH_TOKEN can replace it.
          # The real verification is Tower-side, against the pubkey
          # baked into the binary at backend/cmd/api/compat_bootstrap_pubkey.pem
          # (compat_seed_loader.go:38, go:embed). This step only catches
          # "the operator rotated the keypair on lab1 but forgot to
          # update either the Gitea secrets or the on-disk pubkey" —
          # cheap CI-time signal so the next Tower deploy doesn't
          # surprise-fail the verifier.
          if [ -f compat-bootstrap/cosign.pub ]; then
            cosign verify-blob --insecure-ignore-tlog \
              --key compat-bootstrap/cosign.pub \
              --signature compat-bootstrap/seeded-ci.json.sig \
              compat-bootstrap/seeded-ci.json
          fi
          echo "signed: $(wc -c < compat-bootstrap/seeded-ci.json.sig) bytes"

      - name: Commit + push if content changed
        env:
          GIT_USER_TOKEN: ${{ secrets.COMPAT_PUSH_TOKEN }}
        run: |
          set -euo pipefail
          # M3 — worktree cleanup runs even if a prior step crashed
          # (a `trap` would be cleaner, but we're already in a fresh
          # step with `set -e`; the loop just no-ops on absent paths).
          for major in 18.0 19.0; do git worktree remove --force "addons-$major" || true; done
          rm -rf compat-bootstrap/per-major
          # Author identity must match a real Gitea user — the
          # branch-protection pre-receive hook calls GetUserByEmail
          # against the committer and rejects with "Internal Server
          # Error" (uid=-2) when no match exists. The COMPAT_PUSH_TOKEN
          # belongs to git_admin, whose canonical Gitea email is
          # `gitea@local.domain` (the install default — NOT the address
          # in .credentials.md, which never made it onto the user
          # record). Commit + push under that identity for a clean
          # audit trail.
          git config user.email "gitea@local.domain"
          git config user.name "git_admin"
          if [ ! -f compat-bootstrap/seeded-ci.json ]; then
            echo "ERROR: seeded-ci.json was not produced; aborting"
            exit 1
          fi
          # M1 — diff check against the JSON ONLY. Cosign's ECDSA
          # sign-blob uses random k by default, so the .sig is fresh
          # bytes every run even when the JSON is byte-identical.
          # Including the .sig in the diff would push a noise commit
          # nightly. The .sig is staged too so it's part of the commit
          # IF we end up making one, but only the .json drives the
          # decision.
          git add compat-bootstrap/seeded-ci.json
          if git diff --cached --quiet -- compat-bootstrap/seeded-ci.json; then
            echo "no content change in seeded-ci.json; nothing to commit"
            exit 0
          fi
          git add compat-bootstrap/seeded-ci.json.sig
          stamp=$(python3 -c "import json; print(json.load(open('compat-bootstrap/seeded-ci.json'))['stampId'])")
          git commit -m "chore(compat): refresh cold-start seed (${stamp})"
          # H5 — push via http.extraHeader, not credential-in-URL.
          # If git push fails (network timeout, ref reject, redirect
          # following), the URL is echoed in error context AND captured
          # by the runner's stderr log; embedding the token in the URL
          # leaks it in any of those paths. extraHeader is git-internal,
          # never echoed. Note -c is a git-level option, so it goes
          # before the push subcommand.
          git -c "http.extraHeader=Authorization: token ${GIT_USER_TOKEN}" \
            push https://git.odoosky.org/odoo-tower/odoo-addons.git \
            HEAD:compat-bootstrap
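On the consuming side, the 24h replay described in the comments reduces to a stamp-dedup guard. A hypothetical Python sketch (the real loader is compat_seed_loader.go; these names are illustrative, not taken from the repo):

```python
# Replay guard: a seed snapshot is applied at most once per stampId.
seen_stamps: set[str] = set()

def maybe_replay(seed: dict) -> bool:
    # The stampId is a content hash of the rows, so "same stamp" means
    # "byte-identical snapshot" — safe to skip without inspecting rows.
    stamp = seed['stampId']
    if stamp in seen_stamps:
        return False   # identical snapshot: nothing new to replay
    seen_stamps.add(stamp)
    return True        # unseen stamp: replay rows into the matrix

seed = {'stampId': 'sha256:abc', 'rows': []}
print(maybe_replay(seed))  # True  — first sighting, replay
print(maybe_replay(seed))  # False — 24h later, same stamp, skip
```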


@@ -1 +1 @@
"use strict";(self.webpackChunk_am5=self.webpackChunk_am5||[]).push([[4837],{9295:function(t,e,i){i.r(e),i.d(e,{am5themes_Animated:function(){return s}});var a=i(3409);class n extends a.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("Component").setAll({interpolationDuration:600}),this.rule("Hierarchy").set("animationDuration",600),this.rule("Scrollbar").set("animationDuration",600),this.rule("Tooltip").set("animationDuration",300),this.rule("MapChart").set("animationDuration",1e3),this.rule("MapChart").set("wheelDuration",300),this.rule("Entity").setAll({stateAnimationDuration:600}),this.rule("Sprite").states.create("default",{stateAnimationDuration:600}),this.rule("Tooltip",["axis"]).setAll({animationDuration:200}),this.rule("WordCloud").set("animationDuration",500)}}const s=n}},function(t){var e=(9295,t(t.s=9295)),i=window;for(var a in e)i[a]=e[a];e.__esModule&&Object.defineProperty(i,"__esModule",{value:!0})}]);
"use strict";(self.webpackChunk_am5_ksdn=self.webpackChunk_am5_ksdn||[]).push([[4837],{9295:function(t,e,i){i.r(e),i.d(e,{am5themes_Animated:function(){return s}});var a=i(3409);class n extends a.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("Component").setAll({interpolationDuration:600}),this.rule("Hierarchy").set("animationDuration",600),this.rule("Scrollbar").set("animationDuration",600),this.rule("Tooltip").set("animationDuration",300),this.rule("MapChart").set("animationDuration",1e3),this.rule("MapChart").set("wheelDuration",300),this.rule("Entity").setAll({stateAnimationDuration:600}),this.rule("Sprite").states.create("default",{stateAnimationDuration:600}),this.rule("Tooltip",["axis"]).setAll({animationDuration:200}),this.rule("WordCloud").set("animationDuration",500)}}const s=n}},function(t){var e=(9295,t(t.s=9295)),i=window;for(var a in e)i[a]=e[a];e.__esModule&&Object.defineProperty(i,"__esModule",{value:!0})}]);


@@ -1,2 +1,2 @@
"use strict";(self.webpackChunk_am5=self.webpackChunk_am5||[]).push([[7891],{5650:function(e,s,t){t.r(s),t.d(s,{am5themes_Dataviz:function(){return o}});var u=t(1112),l=t(3409);class r extends l.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("ColorSet").setAll({colors:[u.Il.fromHex(2634320),u.Il.fromHex(9448493),u.Il.fromHex(13976381),u.Il.fromHex(15750208)],reuse:!1,passOptions:{lightness:.05,hue:0}})}}const o=r}},function(e){var s=(5650,e(e.s=5650)),t=window;for(var u in s)t[u]=s[u];s.__esModule&&Object.defineProperty(t,"__esModule",{value:!0})}]);
"use strict";(self.webpackChunk_am5_ksdn=self.webpackChunk_am5_ksdn||[]).push([[7891],{5650:function(e,s,t){t.r(s),t.d(s,{am5themes_Dataviz:function(){return o}});var u=t(1112),l=t(3409);class r extends l.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("ColorSet").setAll({colors:[u.Il.fromHex(2634320),u.Il.fromHex(9448493),u.Il.fromHex(13976381),u.Il.fromHex(15750208)],reuse:!1,passOptions:{lightness:.05,hue:0}})}}const o=r}},function(e){var s=(5650,e(e.s=5650)),t=window;for(var u in s)t[u]=s[u];s.__esModule&&Object.defineProperty(t,"__esModule",{value:!0})}]);
//# sourceMappingURL=Dataviz.js.map


@@ -1,2 +1,2 @@
"use strict";(self.webpackChunk_am5=self.webpackChunk_am5||[]).push([[4583],{2250:function(e,l,r){r.r(l),r.d(l,{am5themes_Material:function(){return s}});var o=r(1112),f=r(3409);class m extends f.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("ColorSet").setAll({colors:[o.Il.fromHex(16007990),o.Il.fromHex(15277667),o.Il.fromHex(10233776),o.Il.fromHex(6765239),o.Il.fromHex(4149685),o.Il.fromHex(2201331),o.Il.fromHex(240116),o.Il.fromHex(48340),o.Il.fromHex(38536),o.Il.fromHex(5025616),o.Il.fromHex(9159498),o.Il.fromHex(13491257),o.Il.fromHex(16771899),o.Il.fromHex(16761095),o.Il.fromHex(16750592),o.Il.fromHex(16733986),o.Il.fromHex(7951688),o.Il.fromHex(10395294),o.Il.fromHex(6323595)],reuse:!0})}}const s=m}},function(e){var l=(2250,e(e.s=2250)),r=window;for(var o in l)r[o]=l[o];l.__esModule&&Object.defineProperty(r,"__esModule",{value:!0})}]);
"use strict";(self.webpackChunk_am5_ksdn=self.webpackChunk_am5_ksdn||[]).push([[4583],{2250:function(e,l,r){r.r(l),r.d(l,{am5themes_Material:function(){return s}});var o=r(1112),f=r(3409);class m extends f.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("ColorSet").setAll({colors:[o.Il.fromHex(16007990),o.Il.fromHex(15277667),o.Il.fromHex(10233776),o.Il.fromHex(6765239),o.Il.fromHex(4149685),o.Il.fromHex(2201331),o.Il.fromHex(240116),o.Il.fromHex(48340),o.Il.fromHex(38536),o.Il.fromHex(5025616),o.Il.fromHex(9159498),o.Il.fromHex(13491257),o.Il.fromHex(16771899),o.Il.fromHex(16761095),o.Il.fromHex(16750592),o.Il.fromHex(16733986),o.Il.fromHex(7951688),o.Il.fromHex(10395294),o.Il.fromHex(6323595)],reuse:!0})}}const s=m}},function(e){var l=(2250,e(e.s=2250)),r=window;for(var o in l)r[o]=l[o];l.__esModule&&Object.defineProperty(r,"__esModule",{value:!0})}]);
//# sourceMappingURL=Material.js.map

View File

@@ -1,2 +1,2 @@
"use strict";(self.webpackChunk_am5=self.webpackChunk_am5||[]).push([[2480],{8283:function(e,r,l){l.r(r),l.d(r,{am5themes_Moonrise:function(){return f}});var o=l(1112),s=l(3409);class u extends s.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("ColorSet").setAll({colors:[o.Il.fromHex(3805954),o.Il.fromHex(6296069),o.Il.fromHex(9054989),o.Il.fromHex(13065764),o.Il.fromHex(13082457),o.Il.fromHex(10786154),o.Il.fromHex(8815977),o.Il.fromHex(7696225),o.Il.fromHex(5792096),o.Il.fromHex(6388099)],reuse:!0})}}const f=u}},function(e){var r=(8283,e(e.s=8283)),l=window;for(var o in r)l[o]=r[o];r.__esModule&&Object.defineProperty(l,"__esModule",{value:!0})}]);
"use strict";(self.webpackChunk_am5_ksdn=self.webpackChunk_am5_ksdn||[]).push([[2480],{8283:function(e,r,l){l.r(r),l.d(r,{am5themes_Moonrise:function(){return f}});var o=l(1112),s=l(3409);class u extends s.Q{setupDefaultRules(){super.setupDefaultRules(),this.rule("ColorSet").setAll({colors:[o.Il.fromHex(3805954),o.Il.fromHex(6296069),o.Il.fromHex(9054989),o.Il.fromHex(13065764),o.Il.fromHex(13082457),o.Il.fromHex(10786154),o.Il.fromHex(8815977),o.Il.fromHex(7696225),o.Il.fromHex(5792096),o.Il.fromHex(6388099)],reuse:!0})}}const f=u}},function(e){var r=(8283,e(e.s=8283)),l=window;for(var o in r)l[o]=r[o];r.__esModule&&Object.defineProperty(l,"__esModule",{value:!0})}]);
//# sourceMappingURL=Moonrise.js.map

File diff suppressed because one or more lines are too long (applies to 7 files)

View File

@@ -1,7 +1,5 @@
.o_action_manager {
overflow: auto !important;
}
.construction_dashboard {
overflow: auto !important;
background-color: #f8f9fa !important;
height: 100%;
.d-flex {

View File

@@ -45,8 +45,6 @@ export class ConstructionDashboard extends Component {
// args: [false, false],
// });
const result = await this.orm.call("tk.construction.dashboard", "get_construction_state", [false, false]);
console.log(result,'dasdsa');
this.state.stats = result;
this.state.sites = Object.entries(result.con_sites || {}).map(([id, name]) => ({ id, name }));
this.renderCharts(result);
@@ -82,7 +80,6 @@ export class ConstructionDashboard extends Component {
this.state.stats = data;
this.renderCharts(data);
console.log(this.state.stats,'dasdsa');
}
openAction(name, resModel, domain = []) {
@@ -221,6 +218,16 @@ export class ConstructionDashboard extends Component {
renderGraph(el, options) {
if (!el) return;
el.innerHTML = "";
// NaN-guard: ApexCharts emits 'M NaN NaN A NaN' SVG warnings when series is
// empty or all zeros (donut/pie divide by total=0). Show 'No data' instead.
const series = options.series || [];
const hasData = series.some((s) =>
typeof s === 'number' ? s > 0 : (s && s.data && s.data.length > 0)
);
if (!hasData) {
el.innerHTML = '<div class="text-muted text-center p-4">No data</div>';
return;
}
const chart = new ApexCharts(el, options);
chart.render();
}

View File

@@ -334,11 +334,7 @@
</page>
</notebook>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -306,11 +306,7 @@
</page>
</notebook>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -164,11 +164,7 @@
</page>
</notebook>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -186,11 +186,7 @@
</group>
</group>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -234,11 +234,7 @@
</page>
</notebook>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -228,11 +228,7 @@
</page>
</notebook>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -55,11 +55,7 @@
<field name="reject_reason" invisible="status != 'reject'"/>
</group>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -65,11 +65,7 @@
<field name="reject_reason" invisible="qc_status != 'reject'"/>
</group>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>
@@ -179,11 +175,7 @@
<field name="reject_reason" invisible="qc_status != 'reject'"/>
</group>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>
@@ -293,11 +285,7 @@
<field name="reject_reason" invisible="qc_status != 'reject'"/>
</group>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -103,11 +103,7 @@
<field name="total_amount"/>
</group>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -9,7 +9,7 @@
<field name="arch" type="xml">
<xpath expr="//form" position="inside">
<app class="app_settings_block" data-string="Construction" string="Construction"
data-key="tk_construction_management">
data-key="tk_construction_management" name="tk_construction_management">
<h2>Sequences</h2>
<setting class="row mt16 o_settings_container">
<div class="col-lg-12 o_setting_box">

View File

@@ -699,11 +699,7 @@
</page>
</notebook>
</sheet>
<div class="oe_chatter">
<field name="message_follower_ids"/>
<field name="activity_ids"/>
<field name="message_ids"/>
</div>
<chatter/>
</form>
</field>
</record>

View File

@@ -0,0 +1,5 @@
# compat-bootstrap
Tower's CompatSeedLoader fetches `seeded-ci.json` from this directory.
Generated by `.gitea/workflows/seed-compat.yml` (cron 03:00 UTC daily).
Don't edit by hand — the next workflow run will overwrite it.
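Tower's loader is Go, but the replay guard it performs on this file can be sketched consumer-side. This is a minimal illustration, assuming the loader mirrors the emitter's `stamp_id()` canonicalisation (sorted keys, compact separators); the function names here are hypothetical, not Tower's actual API:

```python
import hashlib
import json


def recompute_stamp(bootstrap: dict) -> str:
    """Hash the row set the same way the emitter's stamp_id() does:
    canonical JSON (sorted keys, no whitespace) over rows only."""
    canonical = json.dumps(bootstrap['rows'], sort_keys=True, separators=(',', ':'))
    return 'sha256:' + hashlib.sha256(canonical.encode('utf-8')).hexdigest()


def check_seed(path: str) -> dict:
    """Load seeded-ci.json and refuse to replay if the stampId doesn't
    match the rows it claims to cover (truncated or hand-edited file)."""
    with open(path, encoding='utf-8') as fh:
        bootstrap = json.load(fh)
    if bootstrap.get('schemaVersion') != 1:
        raise ValueError(f"unsupported schemaVersion: {bootstrap.get('schemaVersion')}")
    expected = recompute_stamp(bootstrap)
    if bootstrap.get('stampId') != expected:
        raise ValueError(
            f"stampId mismatch: file says {bootstrap.get('stampId')}, rows hash to {expected}")
    return bootstrap
```

Because the stamp covers only the sorted row set, a byte-identical re-emit produces the same stampId, which is what makes the workflow's "no diff → no commit" idempotency hold.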

View File

@@ -0,0 +1,4 @@
-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEgbGcCGMzThWEY5aaVK249Q+ZNm1w
BznDxfRvzL9AGdb1vkUngdcVmGXZBwg/rHXSkYJjt4t9Ky9mZkB9pB02BQ==
-----END PUBLIC KEY-----

View File

@@ -0,0 +1,151 @@
{
"stampId": "sha256:a4a88515128708329a00e506761cbdc6513d4a0442f7e6555fd98625433668ba",
"schemaVersion": 1,
"source": "qualify-addon-v1",
"rows": [
{
"addonCode": "accounting_pdf_reports",
"addonVersion": "19.0.1.0.2",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "cetmix_tower",
"addonVersion": "18.0.1.0.0",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
},
{
"addonCode": "cetmix_tower_aws",
"addonVersion": "18.0.1.0.1",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
},
{
"addonCode": "cetmix_tower_git",
"addonVersion": "18.0.1.0.2",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "failed_install",
"errorClass": "pip-deps-missing"
},
{
"addonCode": "cetmix_tower_ovh",
"addonVersion": "18.0.1.0.1",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
},
{
"addonCode": "cetmix_tower_server",
"addonVersion": "18.0.2.0.0",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "failed_install",
"errorClass": "pip-deps-missing"
},
{
"addonCode": "cetmix_tower_server_queue",
"addonVersion": "18.0.2.0.0",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
},
{
"addonCode": "cetmix_tower_webhook",
"addonVersion": "18.0.1.0.1",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
},
{
"addonCode": "cetmix_tower_yaml",
"addonVersion": "18.0.2.0.0",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "failed_install",
"errorClass": "pip-deps-missing"
},
{
"addonCode": "cx_web_refresh_from_backend",
"addonVersion": "18.0.1.0.0",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
},
{
"addonCode": "ks_dashboard_ninja",
"addonVersion": "18.0.1.1.7",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "failed_install",
"errorClass": "pip-deps-missing"
},
{
"addonCode": "laundry_management",
"addonVersion": "19.0.19.0.4",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_account_accountant",
"addonVersion": "19.0.1.0.3",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_account_asset",
"addonVersion": "19.0.1.0.0",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_account_budget",
"addonVersion": "19.0.1.0.1",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_account_daily_reports",
"addonVersion": "19.0.1.0.1",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_account_followup",
"addonVersion": "19.0.1.0.2",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_fiscal_year",
"addonVersion": "19.0.1.0.1",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "om_recurring_payments",
"addonVersion": "19.0.1.0.0",
"odooMajor": "19.0",
"postgresMajor": "17",
"outcome": "success"
},
{
"addonCode": "tk_construction_management",
"addonVersion": "18.0.2.0.8",
"odooMajor": "18.0",
"postgresMajor": "16",
"outcome": "success"
}
]
}

View File

@@ -0,0 +1 @@
MEYCIQDv8fKzsqoS96xtx/8Q2Vi592gDXv4wp97UooyvB11lpQIhANMXW51u2fqu+CB88cqWg93M6V4rgoUDRY4EpjWwwMgD

scripts/emit-compat-bootstrap.py (248 lines, executable file)
View File

@@ -0,0 +1,248 @@
#!/usr/bin/env python3
"""
emit-compat-bootstrap.py — Pillar 2 / Phase 4 emitter.

Walks every addon directory under --addons-root, runs qualify-addon.py
against each, and emits a consolidated JSON snapshot Tower's
CompatSeedLoader replays into the matrix on startup.

Wire shape (matches CompatSeedBootstrap in compat_seed_loader.go):

    {
      "stampId": "sha256:<canonical-row-hash>",
      "schemaVersion": 1,
      "source": "qualify-addon-v1",
      "rows": [
        {"addonCode": "report_carbone", "addonVersion": "18.0.1.0.9",
         "odooMajor": "18.0", "postgresMajor": "16",
         "outcome": "success"},
        ...
      ]
    }

Usage (typical Gitea Actions invocation):

    python3 scripts/emit-compat-bootstrap.py \\
        --addons-root . \\
        --output compat-bootstrap/seeded-ci.json \\
        --pg-by-major '{"18.0":"16","19.0":"17"}'

Exit codes:
    0   snapshot written successfully (file may or may not have changed)
    2   bad usage / I/O error
"""
from __future__ import annotations

import argparse
import ast
import hashlib
import json
import subprocess
import sys
from pathlib import Path

# Static-lint check name → matrix error_class taxonomy. Aligns with
# the keys in backend/cmd/api/error_translations.yaml so Phase 2
# aggregation can group static (seeded_ci) and runtime (real_install)
# evidence under the same bucket.
ERROR_CLASS_MAP = {
    'manifest': 'manifest-malformed',
    'pip-deps': 'pip-deps-missing',
    'app-name': 'settings-page-broken',
    'menu-icon': 'menu-icon-missing',
    'hoot-import': 'bundle-module-missing',
    'webpack-chunk': 'bundle-module-missing',
}


def find_addon_dirs(root: Path) -> list[Path]:
    """Return every direct child of `root` that ships a __manifest__.py.

    Skips dotfiles, .git, and the bootstrap output directory itself so
    re-running over a previously emitted tree doesn't try to lint
    JSON files.
    """
    out: list[Path] = []
    for child in sorted(root.iterdir()):
        if not child.is_dir():
            continue
        if child.name.startswith('.') or child.name == 'compat-bootstrap':
            continue
        if (child / '__manifest__.py').is_file():
            out.append(child)
    return out


def parse_manifest_version(addon_dir: Path) -> tuple[str, str]:
    """Return (addonVersion, odooMajor). Best-effort: empty strings on parse failure.

    Odoo's convention is `<odoo-major>.<addon-x>.<addon-y>.<addon-z>`,
    where odoo-major itself is `X.0` (e.g. `18.0`, `19.0`). We split
    on the first two dots → odoo_major = "18.0", addon_version = full.
    """
    try:
        src = (addon_dir / '__manifest__.py').read_text(encoding='utf-8')
        manifest = ast.literal_eval(src)
    except Exception:
        return '', ''
    version = str(manifest.get('version', '')).strip()
    if not version:
        return '', ''
    parts = version.split('.')
    if len(parts) >= 2 and parts[0].isdigit() and parts[1] == '0':
        odoo_major = f'{parts[0]}.0'
    else:
        odoo_major = ''
    return version, odoo_major


def run_qualifier(qualifier: Path, addon_dirs: list[Path]) -> list[dict]:
    """Invoke qualify-addon.py --json over every addon dir at once."""
    if not addon_dirs:
        return []
    cmd = ['python3', str(qualifier), '--json'] + [str(p) for p in addon_dirs]
    proc = subprocess.run(cmd, capture_output=True, text=True)
    if proc.returncode == 2:
        # Bad usage / I/O — surface stderr and abort.
        sys.stderr.write(proc.stderr)
        sys.exit(2)
    # returncode 0 (all qualified) or 1 (some failed) both produce the
    # JSON array on stdout. Parse regardless.
    try:
        return json.loads(proc.stdout)
    except json.JSONDecodeError as e:
        sys.stderr.write(f'qualifier JSON parse failed: {e}\nstdout was:\n{proc.stdout}\n')
        sys.exit(2)


def first_error_class(findings: list[dict]) -> str:
    """Pick the first ERROR-severity finding's check name and translate
    it through ERROR_CLASS_MAP. Returns '' if no ERROR-level finding."""
    for f in findings:
        if f.get('severity') == 'ERROR':
            check = f.get('check', '')
            return ERROR_CLASS_MAP.get(check, check or 'unmatched')
    return ''


def build_rows(qualifier_results: list[dict],
               manifests: dict[str, tuple[str, str]],
               pg_by_major: dict[str, str]) -> list[dict]:
    """Map qualifier output → matrix-shaped rows.

    Skips addons whose manifest didn't parse (no addonVersion) — the
    matrix needs both code and version to be useful, and we'd rather
    drop a row than seed it with empty strings the aggregator would
    later have to special-case. Dropped addons are reported on stderr
    so a regression in one __manifest__.py is visible (M4 fix).
    """
    rows: list[dict] = []
    dropped: list[str] = []
    for r in qualifier_results:
        code = r['addon']
        version, odoo_major = manifests.get(code, ('', ''))
        if not version or not odoo_major:
            dropped.append(code)
            continue
        pg_major = pg_by_major.get(odoo_major, '')
        if r.get('qualified'):
            outcome = 'success'
            error_class = ''
        else:
            outcome = 'failed_install'
            error_class = first_error_class(r.get('findings', []))
        row = {
            'addonCode': code,
            'addonVersion': version,
            'odooMajor': odoo_major,
            'postgresMajor': pg_major,
            'outcome': outcome,
        }
        if error_class:
            row['errorClass'] = error_class
        rows.append(row)
    rows.sort(key=lambda r: (r['addonCode'], r['addonVersion'],
                             r['odooMajor'], r['postgresMajor']))
    if dropped:
        sys.stderr.write(
            f'WARN: {len(dropped)} addon(s) dropped (manifest unparseable or version unrecognised): '
            + ', '.join(dropped) + '\n')
    return rows


def stamp_id(rows: list[dict]) -> str:
    """Content hash of the canonical row set. Same rows → same stamp,
    so Tower's idempotency check works across emitter invocations."""
    canonical = json.dumps(rows, sort_keys=True, separators=(',', ':'))
    h = hashlib.sha256(canonical.encode('utf-8')).hexdigest()
    return f'sha256:{h}'


def main(argv: list[str]) -> int:
    p = argparse.ArgumentParser()
    p.add_argument('--addons-root', required=True, type=Path,
                   help='directory whose immediate children are addon dirs')
    p.add_argument('--qualifier', type=Path,
                   default=Path(__file__).parent / 'qualify-addon.py',
                   help='path to qualify-addon.py (defaults to sibling script)')
    p.add_argument('--output', required=True, type=Path,
                   help='where to write the bootstrap JSON')
    p.add_argument('--pg-by-major', type=str,
                   default='{"18.0":"16","19.0":"17"}',
                   help='JSON map: odoo_major → recommended postgres_major')
    p.add_argument('--source', type=str, default='qualify-addon-v1',
                   help='source identifier persisted in compat_seed_stamps.source')
    p.add_argument('--min-rows', type=int, default=0,
                   help='minimum row count; below the floor the emit exits non-zero '
                        '(M2 sanity floor). Set per-major to catch silent qualifier '
                        'breakage that would ship an empty seed.')
    args = p.parse_args(argv[1:])

    if not args.addons_root.is_dir():
        sys.stderr.write(f'addons-root not a directory: {args.addons_root}\n')
        return 2
    if not args.qualifier.is_file():
        sys.stderr.write(f'qualifier not found: {args.qualifier}\n')
        return 2
    try:
        pg_by_major = json.loads(args.pg_by_major)
    except json.JSONDecodeError as e:
        sys.stderr.write(f'--pg-by-major must be JSON: {e}\n')
        return 2

    addon_dirs = find_addon_dirs(args.addons_root)
    manifests = {d.name: parse_manifest_version(d) for d in addon_dirs}
    qualifier_results = run_qualifier(args.qualifier, addon_dirs)
    rows = build_rows(qualifier_results, manifests, pg_by_major)

    # M2 — refuse to ship a suspiciously small bootstrap. Caller picks
    # the floor based on expected catalog size; setting it slightly
    # below that size (e.g. branch has 10 addons → --min-rows 8)
    # catches "qualifier crashed mid-pass" without flapping on legit
    # one-or-two-addon removals.
    if args.min_rows > 0 and len(rows) < args.min_rows:
        sys.stderr.write(
            f'ERROR: produced {len(rows)} rows, expected at least {args.min_rows}; '
            'refusing to ship a thin bootstrap (set --min-rows lower if the catalog has shrunk)\n')
        return 2

    # No generatedAt field — it would change every run and defeat the
    # workflow's "git diff --quiet → no commit" idempotency. The git
    # commit timestamp + stampId in commit message carry the same info.
    bootstrap = {
        'stampId': stamp_id(rows),
        'schemaVersion': 1,
        'source': args.source,
        'rows': rows,
    }
    args.output.parent.mkdir(parents=True, exist_ok=True)
    args.output.write_text(json.dumps(bootstrap, indent=2) + '\n', encoding='utf-8')
    print(f'wrote {len(rows)} rows to {args.output} (stamp {bootstrap["stampId"]})',
          file=sys.stderr)
    return 0


if __name__ == '__main__':
    sys.exit(main(sys.argv))
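The version-splitting convention that parse_manifest_version relies on can be exercised standalone. A minimal sketch (pure stdlib; `split_version` is a local mirror written for illustration, not a function the emitter exports, and the manifest body is a made-up sample):

```python
import ast


def split_version(version: str) -> tuple[str, str]:
    """Mirror of parse_manifest_version's split: a version is recognised
    when it starts with `<major>.0`; anything else yields no odooMajor."""
    parts = version.split('.')
    if len(parts) >= 2 and parts[0].isdigit() and parts[1] == '0':
        return version, f'{parts[0]}.0'
    return version, ''


# A hypothetical __manifest__.py body, parsed the same way the emitter
# does (ast.literal_eval, never exec, so a hostile manifest can't run code).
manifest_src = "{'name': 'Report Carbone', 'version': '18.0.1.0.9'}"
manifest = ast.literal_eval(manifest_src)
version, odoo_major = split_version(str(manifest['version']))
# → ('18.0.1.0.9', '18.0'): full version is kept, odoo major is derived.
```

Versions that don't follow Odoo's `X.0.…` convention (e.g. a bare `1.2.3`) come back with an empty odooMajor, which is exactly the condition build_rows uses to drop the row and warn on stderr.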