NDSS CRM Manual / Technical / Chapter 20: Security & Data Protection
V3.8 · 2024/2025

Chapter 20: Security & Data Protection

Comprehensive security architecture, authentication and authorisation controls, data encryption, NDIS-specific data protection requirements, audit logging, infrastructure security, backup and disaster recovery, compliance standards, vulnerability management, and incident response procedures.

20.1 Security Overview

NDSS CRM handles highly sensitive personal information including NDIS participant health records, disability diagnoses, behaviour support plans, financial data, and staff employment records. The platform is designed with a defence-in-depth security architecture that implements multiple overlapping layers of protection, so that a breach of any single layer does not compromise the entire system.

The security model is built to satisfy the requirements of the Australian Privacy Act 1988, the Australian Privacy Principles (APPs), the NDIS Quality and Safeguards Commission standards, and industry best practices aligned with ISO 27001 and the OWASP Top 10.

20.1.1 Security Architecture Layers

NDSS CRM - Defence-in-Depth Security Architecture
+====================================================================+
| LAYER 1: NETWORK PERIMETER                                         |
|  +----------------------+ +------------------+ +-----------------+ |
|  | Vercel Edge Network  | | DDoS Protection  | | WAF Rules       | |
|  | CDN + Edge Caching   | | Automatic        | | OWASP Top 10    | |
|  +----------------------+ +------------------+ +-----------------+ |
+====================================================================+
| LAYER 2: TRANSPORT SECURITY                                        |
|  +----------------------+ +------------------+ +-----------------+ |
|  | TLS 1.3              | | HSTS Headers     | | Cert Pinning    | |
|  | All traffic encrypted| | max-age=31536000 | | API clients     | |
|  +----------------------+ +------------------+ +-----------------+ |
+====================================================================+
| LAYER 3: APPLICATION SECURITY                                      |
|  +----------------------+ +------------------+ +-----------------+ |
|  | Authentication       | | Authorisation    | | Input Valid.    | |
|  | JWT + Refresh Tokens | | RBAC (24 roles)  | | Zod schemas     | |
|  | Bcrypt hashing       | | Row-Level Sec.   | | Sanitisation    | |
|  +----------------------+ +------------------+ +-----------------+ |
+====================================================================+
| LAYER 4: DATA SECURITY                                             |
|  +----------------------+ +------------------+ +-----------------+ |
|  | Encryption at Rest   | | Column Encryption| | Audit Trail     | |
|  | AES-256              | | PII fields       | | Immutable       | |
|  +----------------------+ +------------------+ +-----------------+ |
+====================================================================+
| LAYER 5: INFRASTRUCTURE                                            |
|  +----------------------+ +------------------+ +-----------------+ |
|  | Supabase / Oracle    | | Network Isolation| | Backups         | |
|  | SOC 2 Type II        | | VPC, Firewalls   | | Daily auto      | |
|  +----------------------+ +------------------+ +-----------------+ |
+====================================================================+

20.1.2 Security Responsibility Model

NDSS CRM operates on a shared responsibility model between the platform (Newdawn Support Services engineering team), the infrastructure providers (Supabase / Oracle, Vercel), and the organisation deploying the platform.

Responsibility Area | NDSS CRM Platform | Infrastructure Provider | Organisation
Application code security | Yes | - | -
Authentication and authorisation | Yes | Partial (Supabase / Oracle Auth) | -
Data encryption at rest | - | Yes (Supabase / Oracle) | -
Data encryption in transit | Yes (TLS config) | Yes (certificates) | -
Network security and DDoS | - | Yes (Vercel/Supabase / Oracle) | -
User account management | Yes (tooling) | - | Yes (admin actions)
Password policy enforcement | Yes (code-level) | - | Yes (user compliance)
Physical security of data centres | - | Yes | -
Staff security training | Yes (documentation) | - | Yes (delivery)
Incident response coordination | Yes | Yes | Yes

20.2 Authentication Security

Authentication is the first line of defence. NDSS CRM uses Supabase / Oracle Auth as the identity provider, with additional application-level security controls layered on top. Every user interaction with the platform begins with identity verification.

20.2.1 Password Security

All passwords are processed using the following security measures:

  • Hashing Algorithm: Passwords are hashed using bcrypt with a cost factor of 12. Plaintext passwords are never stored, logged, or transmitted after initial processing.
  • Minimum Complexity Requirements:
    • Minimum 12 characters in length
    • At least one uppercase letter (A-Z)
    • At least one lowercase letter (a-z)
    • At least one numeric digit (0-9)
    • At least one special character (!@#$%^&*)
  • Password History: The system maintains a history of the last 5 password hashes. Users cannot reuse any of their previous 5 passwords.
  • Breach Detection: New passwords are checked against the Have I Been Pwned database of compromised passwords (via k-anonymity API, no full password transmitted).
  • Expiry Policy: Passwords expire after 90 days. Users receive warnings at 14 days and 7 days before expiry.
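The complexity rules above can be sketched as a single validation function. This is an illustrative sketch only (the function and type names are not from the platform's codebase); the production flow also applies bcrypt hashing and the Have I Been Pwned check, which are omitted here:

```typescript
// Illustrative sketch of the complexity rules from 20.2.1.
// Hashing and breach detection are intentionally out of scope.
interface ComplexityResult {
  valid: boolean;
  failures: string[];
}

export function checkPasswordComplexity(password: string): ComplexityResult {
  const failures: string[] = [];
  if (password.length < 12) failures.push('Minimum 12 characters');
  if (!/[A-Z]/.test(password)) failures.push('At least one uppercase letter');
  if (!/[a-z]/.test(password)) failures.push('At least one lowercase letter');
  if (!/[0-9]/.test(password)) failures.push('At least one numeric digit');
  if (!/[!@#$%^&*]/.test(password)) failures.push('At least one special character');
  return { valid: failures.length === 0, failures };
}
```

Returning all failures at once (rather than the first) lets the UI show users everything they need to fix in one pass.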

20.2.2 Session Management

  • Access Tokens: JSON Web Tokens (JWT) with a 60-minute expiry. Tokens contain the user ID, role, and issued-at timestamp. Tokens are signed using HS256 with a 256-bit secret key.
  • Refresh Tokens: Opaque tokens stored in the database with a 30-day expiry. Each refresh token can only be used once (rotation on use). If a refresh token is reused, all sessions for that user are immediately revoked.
  • Concurrent Session Limit: Users are limited to a maximum of 3 active sessions. Creating a 4th session automatically terminates the oldest session.
  • Session Timeout: Inactive sessions are terminated after 30 minutes of inactivity. A warning dialog appears at 25 minutes.
  • Secure Cookie Attributes: Session tokens stored in cookies use HttpOnly, Secure, SameSite=Strict, and Path=/ attributes.
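The rotation-on-use and reuse-revocation behaviour described above can be sketched as follows. The in-memory maps and the token format are stand-ins for the platform's database tables; real refresh tokens are opaque random values:

```typescript
// Sketch of refresh-token rotation with reuse detection (20.2.2).
// In-memory state stands in for the platform's token tables.
type TokenState = 'active' | 'used';

const tokens = new Map<string, { userId: string; state: TokenState }>();
const revokedUsers = new Set<string>();
let counter = 0;

export function issueRefreshToken(userId: string): string {
  const token = `rt-${userId}-${counter++}`; // real tokens are opaque random values
  tokens.set(token, { userId, state: 'active' });
  return token;
}

// Returns a new token on success; returns null (after revoking all of the
// user's sessions) when a previously used token is presented again.
export function rotateRefreshToken(token: string): string | null {
  const record = tokens.get(token);
  if (!record || revokedUsers.has(record.userId)) return null;
  if (record.state === 'used') {
    // Reuse detected: revoke every session for this user.
    revokedUsers.add(record.userId);
    return null;
  }
  record.state = 'used'; // single use: rotation on use
  return issueRefreshToken(record.userId);
}
```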

20.2.3 CSRF Protection

Cross-Site Request Forgery protection is implemented using the synchronizer token pattern. Every form and state-changing request includes a CSRF token that is validated server-side. The CSRF token is generated per session and rotated on every authentication event. API requests authenticated via Bearer tokens in the Authorization header are exempt from CSRF checks (as they are not susceptible to CSRF attacks).
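A minimal sketch of the synchronizer token pattern, assuming an in-memory map as a stand-in for the platform's real session storage:

```typescript
import { randomBytes, timingSafeEqual } from 'node:crypto';

// Sketch of the synchronizer token pattern described in 20.2.3.
// The session store here is illustrative only.
const sessionTokens = new Map<string, string>();

// Generate a fresh CSRF token for a session (called on every auth event).
export function issueCsrfToken(sessionId: string): string {
  const token = randomBytes(32).toString('hex');
  sessionTokens.set(sessionId, token);
  return token;
}

// Constant-time comparison avoids leaking token bytes via timing.
export function validateCsrfToken(sessionId: string, submitted: string): boolean {
  const expected = sessionTokens.get(sessionId);
  if (!expected) return false;
  const a = Buffer.from(expected);
  const b = Buffer.from(submitted);
  if (a.length !== b.length) return false;
  return timingSafeEqual(a, b);
}
```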

20.2.4 Brute-Force Protection

  • Login Rate Limiting: Maximum 5 failed login attempts per email address within a 15-minute window. After 5 failures, the account is temporarily locked for 30 minutes.
  • Progressive Delays: After each failed attempt, an increasing delay is introduced before the next attempt is processed (1s, 2s, 4s, 8s, 16s).
  • IP-Based Limiting: Maximum 20 failed login attempts from a single IP address within a 15-minute window, regardless of which accounts are targeted.
  • Account Lockout: After 10 consecutive failed attempts, the account is locked until an administrator manually unlocks it or the user resets their password via email.
  • Notification: Account holders receive an email notification after 3 consecutive failed login attempts from an unrecognised IP address or device.
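The progressive delay schedule above can be expressed as a one-line backoff function. The cap at 16 seconds is an assumption consistent with the listed values:

```typescript
// Progressive delay from 20.2.4: 1s, 2s, 4s, 8s, 16s for successive
// failures, capped at 16s thereafter (cap is an assumed behaviour).
export function progressiveDelayMs(failedAttempts: number): number {
  if (failedAttempts <= 0) return 0;
  return Math.min(2 ** (failedAttempts - 1), 16) * 1000;
}
```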

20.3 Authorisation & Role-Based Access Control (RBAC)

NDSS CRM implements a comprehensive RBAC system with 24 distinct roles. Authorisation is enforced at three levels: route-level middleware, component-level rendering, and database-level Row Level Security (RLS) policies.

20.3.1 Middleware-Based Route Protection

Every page and API route is protected by Next.js middleware that verifies the user's role against the required permissions for that route.

// src/middleware.ts - Route protection implementation
import { NextResponse } from 'next/server';
import type { NextRequest } from 'next/server';
import { verifyToken, getRolePermissions } from '@/lib/auth';

const routePermissions: Record<string, string[]> = {
  '/dashboard': ['*'], // All authenticated roles
  '/clients': ['master_admin', 'administrator', 'service_coordinator', 'support_worker', 'intake'],
  '/staff': ['master_admin', 'administrator', 'hr_manager'],
  '/finance': ['master_admin', 'administrator', 'finance'],
  '/compliance': ['master_admin', 'administrator', 'compliance_officer'],
  '/admin/settings': ['master_admin', 'administrator'],
  '/admin/users': ['master_admin'],
};

export function middleware(request: NextRequest) {
  const token = request.cookies.get('access_token')?.value;
  if (!token) {
    return NextResponse.redirect(new URL('/auth/login', request.url));
  }
  const decoded = verifyToken(token);
  if (!decoded) {
    return NextResponse.redirect(new URL('/auth/login', request.url));
  }
  const path = request.nextUrl.pathname;
  const allowedRoles = routePermissions[path];
  if (allowedRoles && !allowedRoles.includes('*') && !allowedRoles.includes(decoded.role)) {
    return NextResponse.redirect(new URL('/unauthorized', request.url));
  }
  return NextResponse.next();
}

20.3.2 Database Row Level Security (RLS)

Supabase / Oracle RLS policies provide a final layer of authorisation directly at the database level. Even if application-level checks were bypassed, RLS ensures users can only access data their role permits.

-- RLS policy: Support workers can only view their own assigned clients
CREATE POLICY "support_worker_client_access" ON clients
FOR SELECT USING (
  auth.role() = 'master_admin'
  OR auth.role() = 'administrator'
  OR (
    auth.role() = 'support_worker'
    AND id IN (
      SELECT client_id FROM shift_assignments
      WHERE staff_id = auth.uid()
        AND status IN ('scheduled', 'in_progress')
    )
  )
);

-- RLS policy: Finance role can view all invoices but only edit own drafts
CREATE POLICY "finance_invoice_update" ON invoices
FOR UPDATE USING (
  auth.role() = 'master_admin'
  OR (
    auth.role() = 'finance'
    AND (status = 'draft' OR status = 'pending')
  )
);

20.3.3 Role Hierarchy Summary

Access Level | Roles | Scope
Full Access | master_admin | All modules, all data, system configuration
Administrative | administrator | All modules, all data (except system config)
Departmental | finance, hr_manager, compliance_officer | Full access within department scope
Operational | service_coordinator, intake, allocation_rostering | Assigned clients/staff within region
Clinical | behaviour_support, occupational_therapist, speech_pathologist, psychologist, nurse | Assigned clients, clinical records only
Frontline | support_worker, team_leader | Assigned shifts and client interactions
Limited | client_portal, family_carer | Own records, linked participant only

20.4 Data Encryption

20.4.1 Encryption at Rest

All data stored in the Supabase / Oracle PostgreSQL database is encrypted at rest using AES-256 encryption. This includes all tables, indexes, backups, and write-ahead logs (WAL). The encryption is managed by the Supabase / Oracle infrastructure and uses AWS Key Management Service (KMS) for key storage and rotation.

  • Algorithm: AES-256-GCM
  • Key Storage: AWS KMS (FIPS 140-2 Level 2 validated)
  • Key Rotation: Automatic rotation every 365 days
  • Backup Encryption: All database backups are independently encrypted with separate keys
  • File Storage: Uploaded documents in Supabase / Oracle Storage are encrypted using AES-256 with per-object keys

20.4.2 Column-Level Encryption for Sensitive Fields

In addition to full-disk encryption, specific columns containing highly sensitive personally identifiable information (PII) are encrypted at the application level before storage:

Table | Encrypted Columns | Algorithm | Reason
clients | tax_file_number, medicare_number | AES-256-GCM | Australian tax and health identifiers
clients | medical_conditions, medications | AES-256-GCM | Protected health information
staff | bank_account_bsb, bank_account_number | AES-256-GCM | Financial payment details
staff | tax_file_number | AES-256-GCM | Australian tax identifier
behaviour_support_plans | restrictive_practices | AES-256-GCM | Highly sensitive clinical data
incident_reports | detailed_description | AES-256-GCM | May contain sensitive participant details
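A sketch of how AES-256-GCM column encryption of this kind is typically implemented in Node.js. The key handling here is illustrative only; in production the key comes from a managed environment variable (see Section 20.4.4) and is never hard-coded or generated inline:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from 'node:crypto';

// Sketch of application-level AES-256-GCM column encryption (20.4.2).
const key = randomBytes(32); // stand-in for the managed 256-bit key

export function encryptColumn(plaintext: string): string {
  const iv = randomBytes(12); // 96-bit nonce, standard for GCM
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  const tag = cipher.getAuthTag();
  // Store iv + auth tag + ciphertext together in the column.
  return Buffer.concat([iv, tag, ciphertext]).toString('base64');
}

export function decryptColumn(stored: string): string {
  const raw = Buffer.from(stored, 'base64');
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28);
  const ciphertext = raw.subarray(28);
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}
```

Because GCM is authenticated, decryption fails loudly if the stored value has been tampered with, and the random per-value IV means two identical plaintexts produce different ciphertexts.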

20.4.3 Encryption in Transit

  • Protocol: TLS 1.3 is the default for all connections. TLS 1.2 is accepted only as a fallback for legacy integrations, and TLS 1.0 and 1.1 are completely disabled.
  • Cipher Suites: Only strong cipher suites are permitted:
    • TLS_AES_256_GCM_SHA384
    • TLS_CHACHA20_POLY1305_SHA256
    • TLS_AES_128_GCM_SHA256
  • HSTS: HTTP Strict Transport Security is enabled with max-age=31536000; includeSubDomains; preload.
  • Certificate: TLS certificates are issued by Let's Encrypt with automatic renewal via Vercel. Certificates use 2048-bit RSA keys.
  • Internal Communication: All communication between the Next.js application, Python services, PHP services, and the Supabase / Oracle database uses encrypted connections (SSL/TLS required, certificate verification enforced).

20.4.4 Key Management

Application-level encryption keys used for column-level encryption are stored as environment variables in Vercel and are never committed to source control. Keys are rotated annually, and re-encryption of all affected columns is performed during the maintenance window.

20.5 NDIS Data Protection

NDSS CRM is designed specifically for the Australian NDIS sector and must comply with the data protection requirements imposed by the NDIS Act 2013, the Privacy Act 1988, and the Australian Privacy Principles (APPs). This section describes how the platform addresses each relevant principle.

20.5.1 Australian Privacy Principles (APP) Compliance

APP | Title | NDSS CRM Implementation
APP 1 | Open and transparent management | Privacy policy accessible via Client Portal. Data handling procedures documented in this manual.
APP 2 | Anonymity and pseudonymity | Not applicable - NDIS service delivery requires identification of participants.
APP 3 | Collection of solicited information | Platform only collects information necessary for service delivery. Required fields are clearly marked. Optional fields are documented.
APP 4 | Dealing with unsolicited information | Import validation rejects fields not in the approved schema. Admins can purge unsolicited data.
APP 5 | Notification of collection | Consent forms and collection notices built into intake workflow. Client Portal displays data collection statements.
APP 6 | Use or disclosure | RBAC restricts data access to authorised purposes. Audit logging tracks all data access events.
APP 7 | Direct marketing | Platform does not support marketing functionality. No participant data is used for marketing.
APP 8 | Cross-border disclosure | All data is stored within Australian data centres (Sydney region, ap-southeast-2). No cross-border transfer.
APP 9 | Government identifiers | NDIS numbers and Medicare numbers are stored with column-level encryption and restricted access.
APP 10 | Quality of personal information | Validation rules enforce data quality. Duplicate detection prevents redundant records.
APP 11 | Security of personal information | Multi-layer encryption, RBAC, audit logging, and vulnerability management as described throughout this chapter.
APP 12 | Access to personal information | Participants can view their records via Client Portal. Data export available in CSV format.
APP 13 | Correction of personal information | Participants can request corrections via Client Portal. Correction history is maintained in audit log.

20.5.2 NDIS-Specific Data Retention Requirements

  • Participant Records: Minimum 7 years after the participant's last service delivery date or until the participant turns 25 (whichever is later, for participants under 18 at time of service).
  • Financial Records: Minimum 7 years from the date of the financial transaction, in accordance with the Australian Taxation Office (ATO) requirements.
  • Incident Reports: Minimum 7 years from the date of incident resolution. Reports involving death or serious injury are retained permanently.
  • Staff Records: Minimum 7 years after the termination of employment.
  • Audit Logs: Retained permanently. Audit logs are append-only and cannot be modified or deleted by any user role, including master_admin.
  • Behaviour Support Plans: Retained for the duration of the participant's active service plus 7 years, or as directed by the NDIS Quality and Safeguards Commission.
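The participant-record rule above involves a "whichever is later" comparison that is easy to get wrong in reporting scripts; a sketch (function names are illustrative, not the platform's API):

```typescript
// Sketch of the participant-record retention rule in 20.5.2: keep records
// for 7 years after the last service date, or until the participant turns
// 25 if they were under 18 at the time of service, whichever is later.
function addYears(date: Date, years: number): Date {
  const d = new Date(date);
  d.setFullYear(d.getFullYear() + years);
  return d;
}

export function participantRetentionEnd(lastServiceDate: Date, dateOfBirth: Date): Date {
  const sevenYearsAfterService = addYears(lastServiceDate, 7);
  const underEighteenAtService = lastServiceDate < addYears(dateOfBirth, 18);
  if (!underEighteenAtService) return sevenYearsAfterService;
  const turnsTwentyFive = addYears(dateOfBirth, 25);
  return turnsTwentyFive > sevenYearsAfterService ? turnsTwentyFive : sevenYearsAfterService;
}
```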

20.5.3 Data Minimisation

The platform implements data minimisation principles by only collecting and displaying information that is necessary for the specific function being performed. For example, support workers viewing a shift assignment see only the client's name, address, service type, and relevant care notes - they do not see financial data, full medical history, or other participants' records.
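The projection described above can be sketched as a typed view function; the field names here are illustrative, not the platform's actual schema:

```typescript
// Sketch of the minimisation rule in 20.5.3: project a full client record
// down to only the fields a support worker needs for a shift.
interface ClientRecord {
  name: string;
  address: string;
  serviceType: string;
  careNotes: string;
  medicalHistory: string;  // never shown to support workers
  ndisPlanBudget: number;  // never shown to support workers
}

type ShiftView = Pick<ClientRecord, 'name' | 'address' | 'serviceType' | 'careNotes'>;

export function toShiftView(client: ClientRecord): ShiftView {
  // Explicit destructuring (rather than delete-ing fields) guarantees
  // sensitive fields can never leak through the returned object.
  const { name, address, serviceType, careNotes } = client;
  return { name, address, serviceType, careNotes };
}
```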

20.6 Input Validation & Injection Prevention

20.6.1 Zod Schema Validation

All API request bodies are validated against strict Zod schemas before processing. Zod is a TypeScript-first schema validation library that provides compile-time type safety and runtime validation. Every API endpoint defines a request schema that specifies required fields, data types, value ranges, string patterns, and custom validation rules.

// Example: Client creation validation schema
import { z } from 'zod';

export const createClientSchema = z.object({
  firstName: z.string()
    .min(1, 'First name is required')
    .max(100, 'First name must be 100 characters or fewer')
    .regex(/^[a-zA-Z\s'-]+$/, 'First name contains invalid characters'),
  lastName: z.string()
    .min(1, 'Last name is required')
    .max(100),
  ndisNumber: z.string()
    .regex(/^\d{9}$/, 'NDIS number must be exactly 9 digits'),
  dateOfBirth: z.string()
    .regex(/^\d{4}-\d{2}-\d{2}$/, 'Date must be in YYYY-MM-DD format')
    .refine((val) => {
      const dob = new Date(val);
      return dob < new Date();
    }, 'Date of birth must be in the past'),
  primaryDisability: z.string().min(1),
  email: z.string().email().optional(),
  phone: z.string()
    .regex(/^\+61\d{9}$/, 'Australian phone number required (+61XXXXXXXXX)')
    .optional(),
  region: z.enum(['NSW', 'VIC', 'QLD', 'WA', 'SA', 'TAS', 'ACT', 'NT']),
});

20.6.2 SQL Injection Prevention

NDSS CRM uses parameterised queries exclusively for all database interactions. The platform never constructs SQL queries through string concatenation. The Supabase / Oracle client library enforces parameterised queries by default, and the Python service layer uses psycopg2 parameterised queries. The PHP layer uses Laravel's Eloquent ORM with prepared statements.

  • Next.js/TypeScript: All database access goes through the Supabase / Oracle JavaScript client, which uses parameterised queries internally.
  • Python: All queries use psycopg2 parameterised execution (cursor.execute(query, params)).
  • PHP: All queries use Laravel Eloquent ORM or the Query Builder with bound parameters.
  • Stored Procedures: PostgreSQL stored procedures used for complex operations accept typed parameters only.

20.6.3 Cross-Site Scripting (XSS) Prevention

  • React Auto-Escaping: React's JSX rendering automatically escapes all dynamic content, preventing stored and reflected XSS attacks in the user interface.
  • Content Security Policy (CSP): A strict CSP header is configured to restrict script execution sources: Content-Security-Policy: default-src 'self'; script-src 'self' 'nonce-{random}'; style-src 'self' 'unsafe-inline'; img-src 'self' data: https://supabase.co; connect-src 'self' https://*.supabase.co wss://*.supabase.co; frame-ancestors 'none'; base-uri 'self'; form-action 'self';
  • Server-Side Sanitisation: All user input is sanitised using DOMPurify before storage. Rich text fields (progress notes, incident descriptions) are processed through a strict allowlist of permitted HTML tags.
  • HTTP Headers: X-Content-Type-Options: nosniff, X-Frame-Options: DENY, and Referrer-Policy: strict-origin-when-cross-origin headers are set on all responses.
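The static headers listed above (together with the HSTS header from Section 20.4.3) can be expressed as a Next.js headers() configuration. This is a sketch in next.config.ts style; the per-request CSP nonce cannot be shown in a static config and is omitted here:

```typescript
// Sketch of the response headers from 20.6.3 as a Next.js headers()
// configuration (illustrative; the deployed config may differ).
export const securityHeaders = [
  { key: 'X-Content-Type-Options', value: 'nosniff' },
  { key: 'X-Frame-Options', value: 'DENY' },
  { key: 'Referrer-Policy', value: 'strict-origin-when-cross-origin' },
  { key: 'Strict-Transport-Security', value: 'max-age=31536000; includeSubDomains; preload' },
];

const nextConfig = {
  async headers() {
    // Apply the security headers to every route.
    return [{ source: '/(.*)', headers: securityHeaders }];
  },
};

export default nextConfig;
```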

20.7 Audit Logging

NDSS CRM maintains a comprehensive, immutable audit trail that records every significant action performed within the platform. The audit log is a critical compliance requirement for NDIS providers and serves as the primary forensic tool for security investigations.

20.7.1 Events Captured

The following categories of events are captured in the audit log:

Category | Events Logged
Authentication | Login success, login failure, logout, password change, password reset request, session timeout, account lockout, account unlock
User Management | User created, user updated, role changed, user deactivated, user reactivated, user deleted
Client Records | Client created, client updated (with field-level diff), client archived, client restored, NDIS plan updated, goal created/updated/completed
Staff Records | Staff created, staff updated, qualification added/removed, certification expiry updated, availability changed
Shifts | Shift created, shift updated, shift cancelled, shift assigned, clock-in, clock-out, progress note submitted
Finance | Invoice created, invoice submitted, invoice approved, invoice rejected, invoice voided, payment recorded, budget adjusted
Incidents | Incident reported, status changed, investigation notes added, corrective action assigned, incident closed
Data Access | Sensitive record viewed (PII fields), report generated, data exported, bulk data accessed
System | Settings changed, role permissions modified, webhook configured, integration connected/disconnected, maintenance mode toggled

20.7.2 Audit Log Record Structure

{
  "id": "aud-001-uuid",
  "timestamp": "2025-01-20T14:30:22.456Z",
  "userId": "s001-uuid",
  "userEmail": "l.chen@newdawnsupport.com.au",
  "userRole": "service_coordinator",
  "action": "client.update",
  "resourceType": "client",
  "resourceId": "c001-uuid",
  "changes": {
    "phone": { "from": "+61400111222", "to": "+61400555666" },
    "riskLevel": { "from": "medium", "to": "low" }
  },
  "ipAddress": "203.0.113.42",
  "userAgent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
  "sessionId": "sess-abc123"
}
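The field-level `changes` object in the record above can be computed with a simple diff over the before and after versions of a row (a sketch; the platform's actual diffing logic may differ, e.g. for nested values):

```typescript
// Sketch: compute the audit-log `changes` object (20.7.2) from the
// before/after versions of a record. Shallow comparison only.
type Diff = Record<string, { from: unknown; to: unknown }>;

export function fieldDiff(
  before: Record<string, unknown>,
  after: Record<string, unknown>
): Diff {
  const changes: Diff = {};
  const keys = Array.from(new Set([...Object.keys(before), ...Object.keys(after)]));
  for (const key of keys) {
    if (before[key] !== after[key]) {
      changes[key] = { from: before[key], to: after[key] };
    }
  }
  return changes;
}
```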

20.7.3 Immutability Guarantees

  • The audit log table has no UPDATE or DELETE permissions for any role, including master_admin.
  • RLS policies on the audit log table permit INSERT only (via the application service account) and SELECT for master_admin and compliance_officer roles.
  • Database triggers prevent any attempt to modify or remove audit log entries.
  • Audit logs are replicated to a secondary, read-only database instance for redundancy.

20.8 Infrastructure Security

20.8.1 Supabase / Oracle Platform Security

  • SOC 2 Type II Certified: Supabase / Oracle maintains SOC 2 Type II certification, providing assurance on security, availability, and confidentiality controls.
  • Data Centre Location: NDSS CRM uses the Supabase / Oracle ap-southeast-2 (Sydney, Australia) region to ensure all data remains within Australian borders.
  • Network Isolation: The PostgreSQL database is deployed within a Virtual Private Cloud (VPC) and is not directly accessible from the public internet. All access goes through the Supabase / Oracle API gateway.
  • Connection Pooling: Supabase / Oracle uses PgBouncer for connection pooling, with SSL/TLS required for all database connections.
  • Automatic Updates: The Supabase / Oracle platform handles PostgreSQL security patches and version upgrades automatically.

20.8.2 Vercel Platform Security

  • Edge Network: Vercel's global edge network provides DDoS protection and automatic traffic filtering.
  • Serverless Functions: API routes run as isolated serverless functions with no shared state between invocations.
  • Environment Variables: All secrets (database connection strings, API keys, encryption keys) are stored as encrypted environment variables in Vercel, never in source code.
  • Deploy Previews: Preview deployments use separate environments and do not have access to production data or secrets.
  • Automatic HTTPS: All Vercel deployments are served over HTTPS with automatic certificate provisioning and renewal.

20.8.3 Network Security Controls

Control | Implementation | Purpose
WAF (Web Application Firewall) | Vercel Edge WAF rules | Block known attack patterns, OWASP Top 10
DDoS Protection | Vercel + Cloudflare integration | Automatic detection and mitigation
IP Allowlisting | Supabase / Oracle connection policies | Restrict direct DB access to known IPs
API Gateway | Next.js middleware | Request validation, rate limiting, auth
Internal Service Communication | mTLS between services | Authenticated, encrypted inter-service calls
DNS Security | DNSSEC enabled | Prevent DNS spoofing attacks

20.9 Backup & Disaster Recovery

20.9.1 Backup Schedule

Backup Type | Frequency | Retention | Storage Location
Automated Daily Backup | Every 24 hours at 02:00 AEST | 30 days | Supabase / Oracle automated backups (ap-southeast-2)
Point-in-Time Recovery (PITR) | Continuous WAL archiving | 7 days | Supabase / Oracle WAL archive (ap-southeast-2)
Weekly Full Backup | Every Sunday at 01:00 AEST | 90 days | Encrypted S3 bucket (ap-southeast-2)
Monthly Archival Backup | 1st of each month at 00:00 AEST | 7 years | AWS S3 Glacier (ap-southeast-2)
File Storage Backup | Every 24 hours | 30 days | S3 cross-region replication

20.9.2 Recovery Objectives

Metric | Target | Description
Recovery Time Objective (RTO) | 4 hours | Maximum acceptable time to restore full platform functionality after a disaster event
Recovery Point Objective (RPO) | 1 hour | Maximum acceptable data loss measured in time (achieved via PITR with continuous WAL archiving)
Service Level Objective (SLO) | 99.9% uptime | Target availability measured monthly (maximum 43 minutes of downtime per month)
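The SLO figure can be checked with a one-line downtime-budget calculation:

```typescript
// Arithmetic behind the SLO row in 20.9.2: a 99.9% monthly availability
// target leaves roughly 43 minutes of allowable downtime in a 30-day month
// (30 days x 24 hours x 60 minutes x 0.001 = 43.2 minutes).
export function monthlyDowntimeBudgetMinutes(availability: number, daysInMonth = 30): number {
  const totalMinutes = daysInMonth * 24 * 60;
  return totalMinutes * (1 - availability);
}
```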

20.9.3 Disaster Recovery Procedures

  1. Detection: Automated monitoring detects service disruption via health checks running every 30 seconds. Alerts are sent to the on-call engineering team via SMS, email, and Slack.
  2. Assessment: The on-call engineer assesses the scope and severity of the incident within 15 minutes of detection.
  3. Communication: A status page update is published and affected organisation administrators are notified via email.
  4. Recovery Execution: Depending on the failure mode:
    • Application failure: Redeploy from last known good commit via Vercel (typically under 5 minutes).
    • Database corruption: Restore from PITR to the last healthy point (typically 30-60 minutes).
    • Full region failure: Activate cross-region backup restoration procedure (up to 4 hours).
  5. Verification: Run automated health checks and data integrity verification scripts.
  6. Post-Incident Review: Conduct a blameless post-incident review within 48 hours. Document findings and preventive measures.

20.9.4 Backup Testing

Backup restoration is tested quarterly. Each test involves restoring the most recent daily backup to a separate, isolated environment, running the full automated test suite against the restored data, and verifying data integrity checksums. Results are documented and reviewed by the engineering team. Any failures trigger an immediate investigation and remediation.

20.10 Compliance Standards

20.10.1 NDIS Quality and Safeguards Commission

NDSS CRM supports compliance with the NDIS Practice Standards through the following platform features:

NDIS Practice Standard | NDSS CRM Feature
Rights and Responsibilities | Service agreements module, consent management, complaint tracking
Person-Centred Supports | Individual goal tracking, progress notes, care plan management, participant feedback collection
Provision of Supports | Rostering and shift management, staff qualifications tracking, continuity of care reporting
Support Provision Environment | SIL property management, safety checklists, environmental risk assessments
Information Management | Encrypted data storage, RBAC, audit logging, data retention policies, backup procedures
Feedback and Complaints | Complaint intake forms, investigation workflows, resolution tracking, trend analysis
Incident Management | Incident reporting module, severity classification, investigation workflows, NDIS Commission notifications
Human Resource Management | Worker screening verification, qualification tracking, training records, performance management
Continuity of Supports | Succession planning tools, transition documentation, handover reports
Restrictive Practices | Authorisation tracking, behaviour support plan management, reporting to Commission

20.10.2 Privacy Act 1988 (Cth)

NDSS CRM complies with the Privacy Act 1988 and all 13 Australian Privacy Principles as detailed in Section 20.5. The platform is designed to handle personal information in accordance with the Act's requirements for collection, use, disclosure, storage, access, and correction of personal information.

20.10.3 ISO 27001 Alignment

While NDSS CRM does not currently hold ISO 27001 certification, the platform's security controls are designed in alignment with ISO 27001 Annex A controls. The following control domains are addressed:

  • A.5 Information Security Policies: Documented in this manual and the organisation's security policy.
  • A.6 Organisation of Information Security: Roles and responsibilities defined in Chapter 1 and this chapter.
  • A.7 Human Resource Security: Staff onboarding security training, role-based access, offboarding procedures.
  • A.8 Asset Management: Data classification (public, internal, confidential, restricted), asset inventory maintained.
  • A.9 Access Control: RBAC with 24 roles, least-privilege principle, regular access reviews.
  • A.10 Cryptography: AES-256 encryption at rest, TLS 1.3 in transit, key management procedures.
  • A.12 Operations Security: Change management, capacity monitoring, malware protection, logging.
  • A.14 System Acquisition and Development: Secure development lifecycle, code review, automated testing.
  • A.16 Incident Management: Incident response plan (see Section 20.12).
  • A.17 Business Continuity: Backup and disaster recovery (see Section 20.9).
  • A.18 Compliance: Regular compliance audits, legal requirement tracking.

20.11 Vulnerability Management

20.11.1 Dependency Scanning

NDSS CRM uses automated dependency scanning to identify known vulnerabilities in third-party packages across all three technology stacks:

Stack | Tool | Frequency | Action
Next.js / TypeScript | npm audit, GitHub Dependabot | On every pull request + daily automated scan | Critical/High: patch within 24 hours. Medium: patch within 7 days.
Python | pip-audit, Safety DB | On every pull request + daily automated scan | Same patching SLA as above.
PHP | composer audit, Roave Security Advisories | On every pull request + daily automated scan | Same patching SLA as above.

20.11.2 Static Application Security Testing (SAST)

  • ESLint Security Rules: The eslint-plugin-security plugin is configured to detect common security anti-patterns in TypeScript code.
  • CodeQL Analysis: GitHub CodeQL runs on every pull request to detect security vulnerabilities, injection flaws, and data flow issues.
  • Bandit (Python): Static analysis for Python code to detect common security issues.
  • PHPStan + Psalm: Static analysis for PHP code with security-focused rule sets.

20.11.3 Penetration Testing

External penetration testing is conducted annually by an independent, CREST-accredited security firm. The testing scope includes:

  • Web application penetration testing (OWASP Testing Guide methodology)
  • API security testing (authentication, authorisation, injection, business logic)
  • Infrastructure security assessment (Supabase / Oracle configuration, Vercel configuration)
  • Social engineering assessment (phishing simulation for staff)

Findings are classified by severity (Critical, High, Medium, Low, Informational) and remediated according to the following SLAs:

Severity | Remediation SLA | Verification
Critical | 24 hours | Re-test by security firm within 48 hours
High | 7 days | Re-test by security firm within 14 days
Medium | 30 days | Internal verification, included in next pen test
Low | 90 days | Included in next pen test
Informational | Best effort | Tracked in backlog

20.12 Incident Response Plan

This section describes the security incident response plan for NDSS CRM. A security incident is defined as any event that compromises or threatens to compromise the confidentiality, integrity, or availability of the platform or its data.

20.12.1 Incident Severity Classification

Severity | Definition | Examples | Response Time
P1 - Critical | Active data breach, full service outage, or compromise of participant PII | Database exfiltration, ransomware, authentication bypass | Immediate (within 15 minutes)
P2 - High | Attempted breach, partial service outage, or vulnerability actively being exploited | Brute force attack, SQL injection attempt detected, single service failure | Within 1 hour
P3 - Medium | Security policy violation, suspicious activity, or vulnerability discovered | Unauthorised access attempt, new CVE in dependency, configuration drift | Within 4 hours
P4 - Low | Minor policy deviation or informational security event | Failed login spikes, non-critical vulnerability, audit finding | Within 24 hours

20.12.2 Incident Response Phases

Phase 1: Detection and Identification

  • Automated monitoring systems detect anomalies (unusual login patterns, API rate spikes, error rate increases).
  • Staff can report suspected incidents via the internal incident reporting form or by contacting the security team directly.
  • The on-call engineer performs initial triage and assigns a severity level.

Phase 2: Containment

  • Short-term containment: Isolate affected systems, block malicious IPs, revoke compromised credentials.
  • Long-term containment: Apply patches, update firewall rules, rotate encryption keys if necessary.
  • Evidence preservation: Capture logs, database snapshots, and network traffic captures before any remediation.

Phase 3: Eradication

  • Identify and remove the root cause of the incident.
  • Patch vulnerabilities, remove malicious code, and update security configurations.
  • Scan all systems for indicators of compromise (IOCs).

Phase 4: Recovery

  • Restore affected systems from verified clean backups if necessary.
  • Gradually restore services with enhanced monitoring.
  • Verify data integrity across all affected tables.

Phase 5: Post-Incident Review

  • Conduct a blameless post-incident review within 48 hours.
  • Document timeline, root cause, impact, and remediation actions.
  • Identify preventive measures and add them to the security improvement backlog.
  • Update this incident response plan if gaps were identified.

20.12.3 Notification Requirements

In the event of a data breach involving personal information, NDSS CRM follows the mandatory notification requirements set out in the Notifiable Data Breaches (NDB) scheme under Part IIIC of the Privacy Act 1988:

Notification Recipient | Timeframe | Content Required
Office of the Australian Information Commissioner (OAIC) | As soon as practicable after forming a reasonable belief that an eligible data breach has occurred (assessment of a suspected breach must be completed within 30 days) | Nature of breach, types of information involved, recommended steps for affected individuals
NDIS Quality and Safeguards Commission | Within 24 hours for incidents involving participant harm | Incident details, affected participants, immediate actions taken
Affected Individuals | As soon as practicable after notifying OAIC | Description of breach, types of information involved, recommended protective steps
Organisation Administrators | Immediately upon confirmation | Full incident details, impact assessment, remediation timeline
20.12.4 Emergency Contact

For security incidents requiring immediate attention, contact the Newdawn Support Services Security Team at security@newdawnsupport.com.au or call the 24/7 security hotline at 1800 NDSS SEC (1800 637 773).