Phase 3: Platform Integration - Implementation Guide¶
Overview¶
Phase 3 represents full integration into the Meridian platform ecosystem. This is reserved for data products that require customer-facing interfaces, complex workflows, or deep integration with core business systems. The investment is significant, so business justification must be clear.
When to Graduate to Phase 3¶
Graduate from Phase 2 only when:
✅ Customer-facing requirement: External users need access
✅ Complex UI needs: Streamlit limitations prevent desired UX
✅ Deep platform integration: Must integrate with Meridian workflows
✅ Enterprise requirements: Security, compliance, or scale demands
✅ Business value justifies cost: ROI supports a 3-6 month engineering investment
Warning Signs - Don't Graduate If:
❌ Internal tool working well in Phase 2
❌ UI needs can be met with Streamlit components
❌ Integration requirements are minimal
❌ No clear customer-facing need
❌ Engineering resources are constrained
Technology Stack Evolution¶
graph TB
subgraph "Phase 3: Meridian Platform Integration"
direction TB
subgraph "Frontend Layer"
NEXT[Next.js Components<br/>React + TypeScript]
UI[Custom UI Components<br/>Tailwind CSS]
AUTH[NextAuth.js<br/>Authentication]
end
subgraph "API Layer"
API[API Routes<br/>Next.js API + tRPC]
MIDDLEWARE[Middleware<br/>Auth + Rate Limiting]
VALIDATION[Zod Validation<br/>Type Safety]
end
subgraph "Data Layer"
SUPABASE[Supabase Core Schema<br/>Production Data]
RLS[Row Level Security<br/>Multi-tenant Access]
REALTIME[Realtime Subscriptions<br/>Live Updates]
end
subgraph "Infrastructure"
VERCEL[Vercel Deployment<br/>Edge Functions]
CDN[Global CDN<br/>Static Assets]
MONITORING[Monitoring<br/>DataDog/Sentry]
end
NEXT --> API
API --> SUPABASE
NEXT --> AUTH
API --> MIDDLEWARE
API --> VALIDATION
SUPABASE --> RLS
SUPABASE --> REALTIME
NEXT --> VERCEL
VERCEL --> CDN
VERCEL --> MONITORING
end
style NEXT fill:#61DAFB
style SUPABASE fill:#3ECF8E
style VERCEL fill:#000000
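The API layer in this diagram includes auth and rate-limiting middleware that the router code later in this guide does not show explicitly. Below is a minimal sketch of what that could look like as a tRPC middleware; the exported t instance, the in-memory counter, and the limits are illustrative assumptions (a shared store such as Redis would be needed across multiple instances).
// src/server/api/middleware/rateLimit.ts -- illustrative sketch, not part of the Meridian codebase
import { TRPCError } from "@trpc/server";
import { t } from "~/server/api/trpc"; // assumes the tRPC instance is exported here

// Naive in-memory limiter: fine for a sketch, not for multi-instance deployments.
const hits = new Map<string, { count: number; windowStart: number }>();
const WINDOW_MS = 60_000;
const MAX_REQUESTS = 100;

export const rateLimitedProcedure = t.procedure.use(async ({ ctx, next }) => {
  // Mirrors the context shape the marginIq router relies on (ctx.user).
  const user = ctx.user;
  if (!user) {
    throw new TRPCError({ code: "UNAUTHORIZED" });
  }

  const now = Date.now();
  const entry = hits.get(user.id) ?? { count: 0, windowStart: now };
  if (now - entry.windowStart > WINDOW_MS) {
    entry.count = 0;
    entry.windowStart = now;
  }
  entry.count += 1;
  hits.set(user.id, entry);

  if (entry.count > MAX_REQUESTS) {
    throw new TRPCError({ code: "TOO_MANY_REQUESTS", message: "Rate limit exceeded" });
  }
  return next();
});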
Architecture Migration Strategy¶
Data Schema Promotion (Analytics → Core)¶
-- 1. Create production schema structure in core
CREATE SCHEMA IF NOT EXISTS core;
-- 2. Migrate table structure with enhanced constraints
CREATE TABLE core.margin_reports (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
created_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
updated_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
-- Tenant isolation for multi-customer support
tenant_id UUID NOT NULL REFERENCES core.tenants(id),
-- Business data (enhanced)
account_number TEXT NOT NULL,
account_title TEXT,
client_id TEXT,
report_date DATE NOT NULL,
trade_date DATE,
-- Structured data (normalized from JSONB)
positions JSONB NOT NULL DEFAULT '[]',
account_totals JSONB NOT NULL DEFAULT '{}',
risk_metrics JSONB NOT NULL DEFAULT '{}',
-- Quality and processing metadata
extraction_confidence NUMERIC CHECK (extraction_confidence >= 0 AND extraction_confidence <= 1),
data_quality_score NUMERIC CHECK (data_quality_score >= 0 AND data_quality_score <= 1),
validation_status TEXT CHECK (validation_status IN ('pending', 'approved', 'rejected', 'flagged')),
processing_status TEXT CHECK (processing_status IN ('queued', 'processing', 'completed', 'failed')),
-- Audit and compliance
created_by UUID NOT NULL REFERENCES auth.users(id),
approved_by UUID REFERENCES auth.users(id),
approved_at TIMESTAMP WITH TIME ZONE,
source_file_name TEXT,
source_file_hash TEXT,
processing_metadata JSONB DEFAULT '{}',
-- Performance tracking
processing_time_ms INTEGER,
file_size_bytes INTEGER,
-- Constraints for data integrity
UNIQUE(tenant_id, account_number, report_date),
CHECK (approved_at IS NULL OR approved_by IS NOT NULL)
);
-- 3. Advanced indexing for enterprise scale
CREATE INDEX idx_margin_reports_tenant ON core.margin_reports(tenant_id);
CREATE INDEX idx_margin_reports_account ON core.margin_reports(tenant_id, account_number);
CREATE INDEX idx_margin_reports_date_range ON core.margin_reports(tenant_id, report_date);
CREATE INDEX idx_margin_reports_status ON core.margin_reports(validation_status, processing_status);
CREATE INDEX idx_margin_reports_created_by ON core.margin_reports(created_by);
-- 4. Enterprise Row Level Security
ALTER TABLE core.margin_reports ENABLE ROW LEVEL SECURITY;
-- Policy for tenant isolation
CREATE POLICY "Users can access their tenant's reports" ON core.margin_reports
FOR ALL TO authenticated
USING (
tenant_id IN (
SELECT tenant_id FROM core.user_tenants
WHERE user_id = auth.uid()
)
);
-- Policy for role-based access
CREATE POLICY "Admins can access all reports" ON core.margin_reports
FOR ALL TO authenticated
USING (
EXISTS (
SELECT 1 FROM core.user_roles ur
JOIN core.roles r ON ur.role_id = r.id
WHERE ur.user_id = auth.uid()
AND r.name = 'admin'
)
);
-- 5. Data migration from analytics schema
INSERT INTO core.margin_reports (
account_number, report_date, positions, -- analytics.extracted_data lands in core.positions
extraction_confidence, created_by, source_file_name,
tenant_id, processing_time_ms
)
SELECT
account_number,
report_date,
extracted_data,
extraction_confidence,
(SELECT id FROM auth.users WHERE email = 'system@aic.com' LIMIT 1), -- System user
source_file_name,
(SELECT id FROM core.tenants WHERE name = 'default' LIMIT 1), -- Default tenant
processing_time_ms
FROM analytics.margin_reports
WHERE validation_status = 'approved';
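Before pointing the application at core, verify that the migration moved what you expect. The sketch below compares row counts between the two schemas using the service-role client; the script name is hypothetical, and it assumes both the analytics and core schemas are exposed through the Supabase API.
// scripts/verify-migration.ts -- illustrative verification sketch
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!, // service role bypasses RLS for verification only
);

async function verifyMigration() {
  // Approved rows in the Phase 2 analytics schema are what the INSERT above migrates...
  const { count: sourceCount, error: sourceError } = await supabase
    .schema("analytics")
    .from("margin_reports")
    .select("*", { count: "exact", head: true })
    .eq("validation_status", "approved");

  // ...so the row count in core should match after the migration.
  const { count: targetCount, error: targetError } = await supabase
    .schema("core")
    .from("margin_reports")
    .select("*", { count: "exact", head: true });

  if (sourceError || targetError) throw sourceError ?? targetError;

  console.log(`analytics.margin_reports (approved): ${sourceCount}`);
  console.log(`core.margin_reports:                 ${targetCount}`);
  if (sourceCount !== targetCount) {
    throw new Error("Row counts diverge - investigate before cutting over.");
  }
}

verifyMigration().catch((err) => {
  console.error(err);
  process.exit(1);
});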
API Layer Development¶
1. tRPC Router Setup (src/server/api/routers/marginiq.ts)¶
import { z } from "zod";
import { createTRPCRouter, protectedProcedure } from "~/server/api/trpc";
import { TRPCError } from "@trpc/server";
// Input validation schemas
const marginReportInputSchema = z.object({
accountNumber: z.string().min(1).max(50),
reportDate: z.date(),
extractedData: z.record(z.unknown()),
extractionConfidence: z.number().min(0).max(1),
sourceFileName: z.string().optional(),
});
const marginReportQuerySchema = z.object({
accountNumber: z.string().optional(),
dateFrom: z.date().optional(),
dateTo: z.date().optional(),
limit: z.number().min(1).max(100).default(20),
offset: z.number().min(0).default(0),
});
export const marginIqRouter = createTRPCRouter({
// Create new margin report
create: protectedProcedure
.input(marginReportInputSchema)
.mutation(async ({ input, ctx }) => {
const { supabase, user } = ctx;
try {
// Get user's tenant
const { data: userTenant } = await supabase
.from('user_tenants')
.select('tenant_id')
.eq('user_id', user.id)
.single();
if (!userTenant) {
throw new TRPCError({
code: 'FORBIDDEN',
message: 'User not associated with any tenant',
});
}
// Insert margin report
const { data, error } = await supabase
.from('margin_reports')
.insert({
tenant_id: userTenant.tenant_id,
account_number: input.accountNumber,
report_date: input.reportDate.toISOString().split('T')[0],
positions: input.extractedData,
extraction_confidence: input.extractionConfidence,
source_file_name: input.sourceFileName,
created_by: user.id,
processing_status: 'completed',
validation_status: 'pending',
})
.select()
.single();
if (error) {
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: `Failed to create margin report: ${error.message}`,
});
}
return data;
} catch (error) {
console.error('Error creating margin report:', error);
throw error;
}
}),
// Get margin reports with filtering
getAll: protectedProcedure
.input(marginReportQuerySchema)
.query(async ({ input, ctx }) => {
const { supabase, user } = ctx;
try {
let query = supabase
.from('margin_reports')
.select(`
*,
created_by_user:auth.users!created_by(email, full_name),
approved_by_user:auth.users!approved_by(email, full_name)
`, { count: 'exact' })
.order('created_at', { ascending: false })
.range(input.offset, input.offset + input.limit - 1);
// Apply filters
if (input.accountNumber) {
query = query.eq('account_number', input.accountNumber);
}
if (input.dateFrom) {
query = query.gte('report_date', input.dateFrom.toISOString().split('T')[0]);
}
if (input.dateTo) {
query = query.lte('report_date', input.dateTo.toISOString().split('T')[0]);
}
const { data, error, count } = await query;
if (error) {
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: `Failed to fetch margin reports: ${error.message}`,
});
}
return {
reports: data || [],
total: count || 0,
hasMore: (input.offset + input.limit) < (count || 0),
};
} catch (error) {
console.error('Error fetching margin reports:', error);
throw error;
}
}),
// Get analytics dashboard data
getAnalytics: protectedProcedure
.input(z.object({
days: z.number().min(1).max(365).default(30)
}))
.query(async ({ input, ctx }) => {
const { supabase } = ctx;
const startDate = new Date();
startDate.setDate(startDate.getDate() - input.days);
try {
// Get aggregated statistics
const { data, error } = await supabase
.rpc('get_margin_analytics', {
start_date: startDate.toISOString().split('T')[0],
end_date: new Date().toISOString().split('T')[0]
});
if (error) {
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: `Failed to fetch analytics: ${error.message}`,
});
}
return data;
} catch (error) {
console.error('Error fetching analytics:', error);
throw error;
}
}),
// Approve margin report
approve: protectedProcedure
.input(z.object({
id: z.string().uuid(),
notes: z.string().optional(),
}))
.mutation(async ({ input, ctx }) => {
const { supabase, user } = ctx;
// Check if user has approval permissions
const { data: permissions } = await supabase
.rpc('check_user_permission', {
user_id: user.id,
permission: 'approve_margin_reports'
});
if (!permissions) {
throw new TRPCError({
code: 'FORBIDDEN',
message: 'User does not have approval permissions',
});
}
const { data, error } = await supabase
.from('margin_reports')
.update({
validation_status: 'approved',
approved_by: user.id,
approved_at: new Date().toISOString(),
processing_metadata: {
approval_notes: input.notes
}
})
.eq('id', input.id)
.select()
.single();
if (error) {
throw new TRPCError({
code: 'INTERNAL_SERVER_ERROR',
message: `Failed to approve report: ${error.message}`,
});
}
return data;
}),
});
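On the client, the approve mutation can be wired into the dashboard roughly as follows; the component name, file location, and styling are illustrative, not existing Meridian components.
// Illustrative client-side usage of the approve mutation
import { api } from "~/utils/api";

export function ApproveReportButton({ reportId }: { reportId: string }) {
  const utils = api.useUtils(); // api.useContext() on older tRPC v10 clients
  const approve = api.marginIq.approve.useMutation({
    // Refetch the report list so the status badge updates immediately.
    onSuccess: () => utils.marginIq.getAll.invalidate(),
  });

  return (
    <button
      className="rounded bg-green-600 px-3 py-1 text-white disabled:opacity-50"
      disabled={approve.isLoading} // isPending on React Query v5 / tRPC v11
      onClick={() => approve.mutate({ id: reportId, notes: "Reviewed via dashboard" })}
    >
      {approve.isLoading ? "Approving..." : "Approve"}
    </button>
  );
}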
2. Database Functions for Analytics (supabase/functions.sql)¶
-- Analytics function for dashboard
CREATE OR REPLACE FUNCTION get_margin_analytics(
start_date DATE,
end_date DATE
)
RETURNS JSON AS $$
DECLARE
result JSON;
BEGIN
WITH report_stats AS (
SELECT
COUNT(*) as total_reports,
COUNT(DISTINCT account_number) as unique_accounts,
AVG(extraction_confidence) as avg_confidence,
AVG(data_quality_score) as avg_quality_score,
COUNT(*) FILTER (WHERE validation_status = 'approved') as approved_reports,
COUNT(*) FILTER (WHERE validation_status = 'rejected') as rejected_reports,
COUNT(*) FILTER (WHERE validation_status = 'pending') as pending_reports
FROM core.margin_reports
WHERE report_date BETWEEN start_date AND end_date
),
processing_stats AS (
SELECT
AVG(processing_time_ms) as avg_processing_time,
MIN(processing_time_ms) as min_processing_time,
MAX(processing_time_ms) as max_processing_time
FROM core.margin_reports
WHERE report_date BETWEEN start_date AND end_date
AND processing_time_ms IS NOT NULL
),
trend_data AS (
SELECT
report_date,
COUNT(*) as daily_reports,
AVG(extraction_confidence) as daily_confidence
FROM core.margin_reports
WHERE report_date BETWEEN start_date AND end_date
GROUP BY report_date
ORDER BY report_date
)
SELECT json_build_object(
'summary', json_build_object(
'total_reports', rs.total_reports,
'unique_accounts', rs.unique_accounts,
'avg_confidence', ROUND(rs.avg_confidence::numeric, 3),
'avg_quality_score', ROUND(rs.avg_quality_score::numeric, 3),
'approval_rate', CASE
WHEN rs.total_reports > 0
THEN ROUND((rs.approved_reports::numeric / rs.total_reports) * 100, 1)
ELSE 0
END,
'pending_count', rs.pending_reports
),
'performance', json_build_object(
'avg_processing_time', ROUND(ps.avg_processing_time::numeric, 0),
'min_processing_time', ps.min_processing_time,
'max_processing_time', ps.max_processing_time
),
'trends', (
SELECT json_agg(
json_build_object(
'date', report_date,
'reports', daily_reports,
'confidence', ROUND(daily_confidence::numeric, 3)
)
)
FROM trend_data
)
) INTO result
FROM report_stats rs
CROSS JOIN processing_stats ps;
RETURN result;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
-- Permission check function
CREATE OR REPLACE FUNCTION check_user_permission(
user_id UUID,
permission TEXT
)
RETURNS BOOLEAN AS $$
BEGIN
RETURN EXISTS (
SELECT 1
FROM core.user_roles ur
JOIN core.roles r ON ur.role_id = r.id
JOIN core.role_permissions rp ON r.id = rp.role_id
JOIN core.permissions p ON rp.permission_id = p.id
-- Qualify the parameters with the function name so they don't collide with the column names
WHERE ur.user_id = check_user_permission.user_id
AND p.name = check_user_permission.permission
);
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
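A quick way to confirm the analytics function is deployed and returns the expected shape is to call it directly through the Supabase RPC interface. The sketch below is illustrative; the file name and date range are placeholders.
// scripts/smoke-test-analytics.ts -- illustrative; file name and dates are placeholders
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!,
);

async function main() {
  const { data, error } = await supabase.rpc("get_margin_analytics", {
    start_date: "2024-01-01",
    end_date: "2024-01-31",
  });
  if (error) throw error;

  // The function returns one JSON object with summary, performance, and trends keys.
  console.log("summary:", data.summary);
  console.log("performance:", data.performance);
  console.log("trend points:", data.trends?.length ?? 0);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});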
Frontend Components Development¶
1. Main MarginIQ Dashboard (src/pages/marginiq/index.tsx)¶
import { type NextPage } from "next";
import { useState } from "react";
import Head from "next/head";
import { api } from "~/utils/api";
import { useSession } from "next-auth/react";
import {
Chart as ChartJS,
CategoryScale,
LinearScale,
PointElement,
LineElement,
Title,
Tooltip,
Legend,
} from 'chart.js';
import { Line } from 'react-chartjs-2';
import { Card, CardContent, CardHeader, CardTitle } from "~/components/ui/card";
import { Button } from "~/components/ui/button";
import { Badge } from "~/components/ui/badge";
import { LoadingSpinner } from "~/components/ui/loading";
import { ErrorAlert } from "~/components/ui/error";
ChartJS.register(
CategoryScale,
LinearScale,
PointElement,
LineElement,
Title,
Tooltip,
Legend
);
const MarginIQDashboard: NextPage = () => {
const { data: session, status } = useSession();
const [selectedPeriod, setSelectedPeriod] = useState(30);
// API queries
const {
data: reportsData,
isLoading: reportsLoading,
error: reportsError
} = api.marginIq.getAll.useQuery({
limit: 10,
offset: 0,
});
const {
data: analytics,
isLoading: analyticsLoading
} = api.marginIq.getAnalytics.useQuery({
days: selectedPeriod,
});
if (status === "loading") {
return <LoadingSpinner />;
}
if (!session) {
return <ErrorAlert message="Please sign in to access MarginIQ" />;
}
if (reportsError) {
return <ErrorAlert message={`Error loading data: ${reportsError.message}`} />;
}
// Chart configuration
const chartData = {
labels: analytics?.trends?.map(t => t.date) || [],
datasets: [
{
label: 'Daily Reports',
data: analytics?.trends?.map(t => t.reports) || [],
borderColor: 'rgb(75, 192, 192)',
backgroundColor: 'rgba(75, 192, 192, 0.2)',
yAxisID: 'y',
},
{
label: 'Confidence Score',
data: analytics?.trends?.map(t => t.confidence) || [],
borderColor: 'rgb(255, 99, 132)',
backgroundColor: 'rgba(255, 99, 132, 0.2)',
yAxisID: 'y1',
},
],
};
const chartOptions = {
responsive: true,
interaction: {
mode: 'index' as const,
intersect: false,
},
scales: {
x: {
display: true,
title: {
display: true,
text: 'Date'
}
},
y: {
type: 'linear' as const,
display: true,
position: 'left' as const,
title: {
display: true,
text: 'Number of Reports'
}
},
y1: {
type: 'linear' as const,
display: true,
position: 'right' as const,
title: {
display: true,
text: 'Confidence Score'
},
grid: {
drawOnChartArea: false,
},
},
},
};
return (
<>
<Head>
<title>MarginIQ Dashboard | AIC Holdings</title>
<meta name="description" content="Margin report analysis dashboard" />
</Head>
<div className="container mx-auto px-4 py-8">
<div className="flex justify-between items-center mb-8">
<div>
<h1 className="text-3xl font-bold text-gray-900">MarginIQ Dashboard</h1>
<p className="text-gray-600 mt-2">Margin report analysis and risk management</p>
</div>
<div className="flex gap-2">
<Button
variant={selectedPeriod === 7 ? "default" : "outline"}
onClick={() => setSelectedPeriod(7)}
>
7 Days
</Button>
<Button
variant={selectedPeriod === 30 ? "default" : "outline"}
onClick={() => setSelectedPeriod(30)}
>
30 Days
</Button>
<Button
variant={selectedPeriod === 90 ? "default" : "outline"}
onClick={() => setSelectedPeriod(90)}
>
90 Days
</Button>
</div>
</div>
{/* Summary Cards */}
<div className="grid grid-cols-1 md:grid-cols-4 gap-6 mb-8">
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Total Reports</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">
{analyticsLoading ? <LoadingSpinner size="sm" /> : analytics?.summary?.total_reports || 0}
</div>
</CardContent>
</Card>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Unique Accounts</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">
{analyticsLoading ? <LoadingSpinner size="sm" /> : analytics?.summary?.unique_accounts || 0}
</div>
</CardContent>
</Card>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Avg Confidence</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">
{analyticsLoading ? <LoadingSpinner size="sm" /> : `${((analytics?.summary?.avg_confidence || 0) * 100).toFixed(1)}%`}
</div>
</CardContent>
</Card>
<Card>
<CardHeader className="flex flex-row items-center justify-between space-y-0 pb-2">
<CardTitle className="text-sm font-medium">Approval Rate</CardTitle>
</CardHeader>
<CardContent>
<div className="text-2xl font-bold">
{analyticsLoading ? <LoadingSpinner size="sm" /> : `${analytics?.summary?.approval_rate || 0}%`}
</div>
</CardContent>
</Card>
</div>
{/* Charts */}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6 mb-8">
<Card>
<CardHeader>
<CardTitle>Report Trends</CardTitle>
</CardHeader>
<CardContent>
{analyticsLoading ? (
<div className="flex justify-center p-8">
<LoadingSpinner />
</div>
) : (
<Line data={chartData} options={chartOptions} />
)}
</CardContent>
</Card>
<Card>
<CardHeader>
<CardTitle>Processing Performance</CardTitle>
</CardHeader>
<CardContent>
<div className="space-y-4">
<div>
<div className="text-sm text-gray-600">Average Processing Time</div>
<div className="text-lg font-semibold">
{analytics?.performance?.avg_processing_time || 0}ms
</div>
</div>
<div>
<div className="text-sm text-gray-600">Range</div>
<div className="text-sm">
{analytics?.performance?.min_processing_time || 0}ms - {analytics?.performance?.max_processing_time || 0}ms
</div>
</div>
</div>
</CardContent>
</Card>
</div>
{/* Recent Reports */}
<Card>
<CardHeader>
<CardTitle>Recent Reports</CardTitle>
</CardHeader>
<CardContent>
{reportsLoading ? (
<div className="flex justify-center p-8">
<LoadingSpinner />
</div>
) : (
<div className="overflow-x-auto">
<table className="w-full text-sm">
<thead>
<tr className="border-b">
<th className="text-left p-2">Account</th>
<th className="text-left p-2">Report Date</th>
<th className="text-left p-2">Confidence</th>
<th className="text-left p-2">Status</th>
<th className="text-left p-2">Created</th>
</tr>
</thead>
<tbody>
{reportsData?.reports?.map((report) => (
<tr key={report.id} className="border-b hover:bg-gray-50">
<td className="p-2 font-medium">{report.account_number}</td>
<td className="p-2">{new Date(report.report_date).toLocaleDateString()}</td>
<td className="p-2">{(report.extraction_confidence * 100).toFixed(1)}%</td>
<td className="p-2">
<Badge
variant={
report.validation_status === 'approved' ? 'default' :
report.validation_status === 'rejected' ? 'destructive' :
'secondary'
}
>
{report.validation_status}
</Badge>
</td>
<td className="p-2 text-gray-600">
{new Date(report.created_at).toLocaleDateString()}
</td>
</tr>
))}
</tbody>
</table>
</div>
)}
</CardContent>
</Card>
</div>
</>
);
};
export default MarginIQDashboard;
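One gap in the component above: the analytics payload comes back from the RPC as untyped JSON, so fields like analytics?.trends are effectively any. A hedged interface sketch matching the shape produced by get_margin_analytics is shown below; the file location and interface names are assumptions, but typing the query result against it keeps the chart and card code honest.
// src/types/marginiq.ts -- illustrative typing for the get_margin_analytics payload
export interface MarginAnalyticsSummary {
  total_reports: number;
  unique_accounts: number;
  avg_confidence: number;       // 0-1, rounded to 3 decimals by the SQL function
  avg_quality_score: number;    // 0-1
  approval_rate: number;        // percentage, 0-100
  pending_count: number;
}

export interface MarginAnalyticsPerformance {
  avg_processing_time: number;  // milliseconds
  min_processing_time: number;
  max_processing_time: number;
}

export interface MarginAnalyticsTrendPoint {
  date: string;                 // ISO date (YYYY-MM-DD)
  reports: number;
  confidence: number;
}

export interface MarginAnalytics {
  summary: MarginAnalyticsSummary;
  performance: MarginAnalyticsPerformance;
  trends: MarginAnalyticsTrendPoint[] | null; // json_agg returns NULL when no rows match
}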
Deployment & DevOps¶
1. Environment Configuration¶
# Production environment variables
NEXTAUTH_SECRET=your_production_secret
NEXTAUTH_URL=https://meridian.aic.com
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_ANON_KEY=your_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key
DATABASE_URL=postgresql://user:pass@host:5432/db
# Analytics & Monitoring
DATADOG_API_KEY=your_datadog_key
SENTRY_DSN=your_sentry_dsn
# Feature Flags
ENABLE_MARGINIQ=true
MARGINIQ_MAX_FILE_SIZE=50MB
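Since the stack already leans on Zod, the same approach can validate these variables at boot so misconfiguration fails fast rather than surfacing as runtime errors. A minimal sketch, assuming a T3-style src/env.ts module:
// src/env.ts -- illustrative runtime validation for the variables listed above
import { z } from "zod";

const envSchema = z.object({
  NEXTAUTH_SECRET: z.string().min(1),
  NEXTAUTH_URL: z.string().url(),
  SUPABASE_URL: z.string().url(),
  SUPABASE_ANON_KEY: z.string().min(1),
  SUPABASE_SERVICE_ROLE_KEY: z.string().min(1),
  DATABASE_URL: z.string().url(),
  DATADOG_API_KEY: z.string().optional(),
  SENTRY_DSN: z.string().optional(),
  ENABLE_MARGINIQ: z
    .string()
    .default("false")
    .transform((v) => v === "true"),
  MARGINIQ_MAX_FILE_SIZE: z.string().default("50MB"),
});

// Fails fast at startup if anything required is missing or malformed.
export const env = envSchema.parse(process.env);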
2. CI/CD Pipeline (.github/workflows/deploy.yml)¶
name: Deploy to Production
on:
push:
branches: [main]
paths: ['src/pages/marginiq/**', 'src/server/api/routers/marginiq.ts']
jobs:
test:
name: Test MarginIQ Integration
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 18
cache: npm
- name: Install dependencies
run: npm ci
- name: Run type checking
run: npm run type-check
- name: Run tests
run: npm run test:marginiq
- name: Run integration tests
run: npm run test:integration
env:
TEST_DATABASE_URL: ${{ secrets.TEST_DATABASE_URL }}
deploy:
name: Deploy to Vercel
needs: test
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Deploy to Vercel
uses: amondnet/vercel-action@v25
with:
vercel-token: ${{ secrets.VERCEL_TOKEN }}
vercel-org-id: ${{ secrets.VERCEL_ORG_ID }}
vercel-project-id: ${{ secrets.VERCEL_PROJECT_ID }}
vercel-args: '--prod'
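The pipeline assumes an npm run test:marginiq script exists. As a sketch of what that suite might cover, the test below exercises the Zod input schema from the router, assuming Vitest as the runner and that marginReportInputSchema is exported from the module.
// src/server/api/routers/marginiq.test.ts -- illustrative sketch
import { describe, expect, it } from "vitest";
import { marginReportInputSchema } from "./marginiq";

describe("marginReportInputSchema", () => {
  it("accepts a well-formed report payload", () => {
    const result = marginReportInputSchema.safeParse({
      accountNumber: "ACC-1001",
      reportDate: new Date("2024-01-15"),
      extractedData: { positions: [] },
      extractionConfidence: 0.92,
      sourceFileName: "margin_2024-01-15.pdf",
    });
    expect(result.success).toBe(true);
  });

  it("rejects confidence values outside 0-1", () => {
    const result = marginReportInputSchema.safeParse({
      accountNumber: "ACC-1001",
      reportDate: new Date("2024-01-15"),
      extractedData: {},
      extractionConfidence: 1.5,
    });
    expect(result.success).toBe(false);
  });
});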
3. Monitoring & Alerting Setup¶
// src/utils/monitoring.ts
import { type NextRequest } from "next/server";
export class MonitoringService {
static trackMarginIQUsage(
userId: string,
action: string,
metadata?: Record<string, unknown>
) {
// DataDog metrics
if (typeof window !== 'undefined' && window.DD_RUM) {
window.DD_RUM.addAction(action, {
userId,
feature: 'marginiq',
...metadata,
});
}
// Custom analytics
fetch('/api/internal/analytics', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
event: 'marginiq_action',
userId,
action,
timestamp: new Date().toISOString(),
metadata,
}),
}).catch(console.error);
}
static trackMarginIQError(
error: Error,
context?: Record<string, unknown>
) {
// Sentry error tracking
if (typeof window !== 'undefined' && window.Sentry) {
window.Sentry.captureException(error, {
tags: { feature: 'marginiq' },
extra: context,
});
}
// Log to console in development
if (process.env.NODE_ENV === 'development') {
console.error('MarginIQ Error:', error, context);
}
}
static async trackApiPerformance(
request: NextRequest,
startTime: number,
endTime: number,
success: boolean
) {
const duration = endTime - startTime;
// Custom metrics API (server-side fetch requires an absolute URL)
await fetch(`${process.env.NEXTAUTH_URL}/api/internal/metrics`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
metric: 'api_request_duration',
value: duration,
tags: {
endpoint: request.url,
method: request.method,
success: success.toString(),
feature: 'marginiq',
},
}),
}).catch(console.error);
}
}
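The class above references window.DD_RUM and window.Sentry, which are injected by the DataDog RUM and Sentry browser snippets and are not typed by default, so npm run type-check will reject them without ambient declarations. A minimal sketch (file location assumed, shapes deliberately limited to the calls used here):
// src/types/globals.d.ts -- illustrative ambient declarations
export {};

declare global {
  interface Window {
    DD_RUM?: {
      addAction: (name: string, context?: Record<string, unknown>) => void;
    };
    Sentry?: {
      captureException: (
        error: unknown,
        context?: { tags?: Record<string, string>; extra?: Record<string, unknown> },
      ) => void;
    };
  }
}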
Success Metrics for Phase 3¶
Technical Metrics¶
- Performance: API response time < 200ms (p95); see the instrumentation sketch after these lists
- Reliability: Uptime > 99.9%
- Security: Zero critical vulnerabilities
- Scalability: Support 1000+ concurrent users
Business Metrics¶
- User Adoption: Customer usage > internal usage
- Revenue Impact: Measurable business value
- User Satisfaction: NPS > 50
- Integration Success: Seamless workflow integration
Operational Metrics¶
- Development Velocity: Feature delivery maintained
- Support Burden: <2 hours/week maintenance
- Error Rate: <0.1% of requests fail
- Cost Efficiency: ROI positive within 6 months
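Hitting the latency and error-rate targets starts with measuring them per procedure. Below is a hedged sketch of a tRPC middleware that reports duration and success to the internal metrics endpoint used in the monitoring section; the exported t instance, the file path, and the use of NEXTAUTH_URL as the base URL are assumptions.
// src/server/api/middleware/metrics.ts -- illustrative sketch
import { t } from "~/server/api/trpc"; // assumes the tRPC instance is exported here

export const timedProcedure = t.procedure.use(async ({ path, next }) => {
  const start = Date.now();
  const result = await next();
  const duration = Date.now() - start;

  // Fire-and-forget: metric delivery must never block or fail the actual request.
  void fetch(`${process.env.NEXTAUTH_URL}/api/internal/metrics`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      metric: "api_request_duration",
      value: duration,
      tags: { procedure: path, success: String(result.ok), feature: "marginiq" },
    }),
  }).catch(() => undefined);

  return result;
});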
Migration Checklist¶
Pre-Migration (Planning Phase)¶
- Business case approved and ROI projected
- Engineering resources allocated (3-6 months)
- User research completed for UX requirements
- Technical architecture designed and reviewed
- Data migration strategy planned
- Security review completed
- Performance requirements defined
Migration Phase¶
- Database schema created in core with proper RLS
- API routes implemented with full test coverage
- Frontend components built with accessibility compliance
- Authentication and authorization integrated
- Data migration executed and validated
- Integration tests passing
- Performance benchmarks met
- Security penetration testing completed
Post-Migration¶
- User training completed
- Documentation updated
- Monitoring and alerting configured
- Support processes established
- Success metrics baseline captured
- Phase 2 system deprecated (if applicable)
Phase 3 is the culmination of the data product evolution: a fully integrated, enterprise-grade solution that delivers business value through a seamless user experience and a robust technical foundation.