Compare commits

...

22 Commits

Author SHA1 Message Date
8b4119e872 Forced backend to use port 3306, as the Feb 2026 updates may have become stricter and caused port-mismatch issues.
All checks were successful
Build & Deploy Backend / build (push) Successful in 1m38s
Build & Deploy Backend / deploy (push) Successful in 3s
2026-03-04 06:55:58 +08:00
be8e5b0cdf Additional PatchHistory entities to expand the information stored.
All checks were successful
Build & Deploy Backend / build (push) Successful in 49s
Build & Deploy Backend / deploy (push) Successful in 32s
2025-11-05 09:22:29 +08:00
6be9c671ff Introduction of WindowsUpdate endpoint, allows for /api/patch-compliance to accept POSTS
All checks were successful
Build & Deploy Backend / build (push) Successful in 1m6s
Build & Deploy Backend / deploy (push) Successful in 32s
2025-11-05 08:45:48 +08:00
1a08230291 Fixed issue with drives list not being initialized first as an array. Was silently killing the DeviceService and not saving new devices.
All checks were successful
Build & Deploy Backend / build (push) Successful in 1m20s
Build & Deploy Backend / deploy (push) Successful in 32s
2025-11-04 08:27:52 +08:00
89c88d6de9 Added flexibility when accepting PUT updates from the admin end.
All checks were successful
Build & Deploy Backend / build (push) Successful in 51s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-29 12:03:24 +08:00
b0e1928ca3 Adjusted PUT methods for updating backend users.
All checks were successful
Build & Deploy Backend / build (push) Successful in 51s
Build & Deploy Backend / deploy (push) Successful in 1s
2025-10-29 11:55:07 +08:00
988c5ad527 Further AI fixes for snake_case vs camelCase handling
All checks were successful
Build & Deploy Backend / build (push) Successful in 51s
Build & Deploy Backend / deploy (push) Successful in 1s
2025-10-29 11:39:43 +08:00
859fc20ae8 AI fixing the SQL Limit objects on the reporting.
All checks were successful
Build & Deploy Backend / build (push) Successful in 52s
Build & Deploy Backend / deploy (push) Successful in 2s
2025-10-29 11:35:22 +08:00
afceca70d9 AI changes to reporting
All checks were successful
Build & Deploy Backend / build (push) Successful in 1m28s
Build & Deploy Backend / deploy (push) Successful in 34s
2025-10-29 11:30:20 +08:00
7797da78d0 Forgot missing import for JsonIgnoreProperties....
All checks were successful
Build & Deploy Backend / build (push) Successful in 43s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-13 11:16:12 +08:00
faa4e579b4 Added ignoreUnknown to SystemInfoDTO for flexibility on intake.
Some checks failed
Build & Deploy Backend / build (push) Failing after 40s
Build & Deploy Backend / deploy (push) Has been skipped
2025-10-13 11:14:28 +08:00
d3c3d93057 Lowered max range to 120 days in application.properties.
All checks were successful
Build & Deploy Backend / build (push) Successful in 1m21s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-13 06:28:59 +08:00
43f2442d0b Resolved missing import.
All checks were successful
Build & Deploy Backend / build (push) Successful in 51s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-10 11:14:18 +08:00
0844780949 Missing tokenResolver.
Some checks failed
Build & Deploy Backend / build (push) Failing after 41s
Build & Deploy Backend / deploy (push) Has been skipped
2025-10-10 10:56:31 +08:00
69721ba411 Resolved refresh endpoint issues.
Some checks failed
Build & Deploy Backend / build (push) Failing after 36s
Build & Deploy Backend / deploy (push) Has been skipped
2025-10-10 10:54:37 +08:00
9f68959a29 Added a keep-alive function to the AuthController
Some checks failed
Build & Deploy Backend / build (push) Failing after 35s
Build & Deploy Backend / deploy (push) Has been skipped
2025-10-10 10:50:42 +08:00
d639090419 The key changes:
All checks were successful
Build & Deploy Backend / build (push) Successful in 55s
Build & Deploy Backend / deploy (push) Successful in 31s
1. Script updated: Now runs fetchCVE_v2.js instead of fetchCVE.js (line 49)
2. Proper configuration injection: Uses @Value to inject database credentials, API keys, and directory paths from your application properties
3. Working directory: Sets the script directory properly with pb.directory(scriptDir)
4. Environment setup: Uses the same DB connection extraction methods (extractHost, extractPort, extractDbName) as your ScriptController
5. Locale settings: Added UTF-8 and Node options to match your ScriptController setup
6. Log file location: Uses logsDirectory configuration instead of hardcoded path

  The scheduler will now run fetchCVE_v2.js (which executes importCVEEnrichmentFast) every 8 hours at midnight, 8am, and 4pm.
2025-10-10 09:56:34 +08:00
2b5aaa1401 Updated fetchCVE scripts with enrichments.
All checks were successful
Build & Deploy Backend / build (push) Successful in 47s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-08 11:07:04 +08:00
c43b3a65c5 New GitHub support for CVE verification.
All checks were successful
Build & Deploy Backend / build (push) Successful in 47s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-08 10:41:19 +08:00
80c6aae9c2 Updated CVE scripts to include backfill support.
All checks were successful
Build & Deploy Backend / build (push) Successful in 55s
Build & Deploy Backend / deploy (push) Successful in 31s
2025-10-08 10:02:35 +08:00
2517db791c Updating scripts to hopefully work with migrated DB.
All checks were successful
Build & Deploy Backend / build (push) Successful in 58s
Build & Deploy Backend / deploy (push) Successful in 32s
2025-10-08 09:10:17 +08:00
1da5d77e5c Added explicit Timezone references.
All checks were successful
Build & Deploy Backend / build (push) Successful in 1m5s
Build & Deploy Backend / deploy (push) Successful in 32s
2025-10-08 07:52:17 +08:00
47 changed files with 2346670 additions and 65 deletions

1
.gitignore vendored

@@ -35,3 +35,4 @@ out/
### VS Code ###
.vscode/
scripts/cve-sync.log


@@ -0,0 +1,231 @@
# Reporting API Implementation
## Overview
Backend API endpoints for the compliance reporting system, providing security metrics, vulnerability data, and software inventory reporting.
## Implemented Components
### 1. DTOs (Data Transfer Objects)
Location: `src/main/java/com/psg/dlsysinfo/dl_sysinfo_server/dto/`
#### ComplianceSummaryDTO
High-level security compliance metrics:
- `totalDevices` - Total device count
- `vulnerableDevices` - Devices with at least one vulnerability
- `totalVulnerabilities` - Total vulnerability count
- `criticalVulns`, `highVulns`, `mediumVulns`, `lowVulns` - Counts by severity
- `totalSoftware` - Total unique software packages
- `vulnerableSoftware` - Software with CVEs
- `lastUpdated` - Timestamp of last vulnerability scan
#### TopVulnerabilityDTO
Critical vulnerability information:
- `cveId` - CVE identifier
- `title` - Vulnerability description
- `severity` - CRITICAL/HIGH/MEDIUM/LOW
- `score` - CVSS score
- `affectedDevices` - Number of affected devices
#### VulnerableSoftwareDTO
Vulnerable software metrics:
- `softwareName` - Software package name
- `totalInstances` - Total installations
- `vulnerableInstances` - Installations with CVEs
- `totalCves` - CVE count for this software
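The three DTOs above serialize naturally to JSON. As a sketch, a response might look like the following — only the field names come from the DTO lists; every value here is made up:

```javascript
// Illustrative payloads matching the DTO field lists above.
// All values are invented; only the field names are from the document.
const complianceSummary = {
  totalDevices: 120,
  vulnerableDevices: 34,
  totalVulnerabilities: 510,
  criticalVulns: 12,
  highVulns: 88,
  mediumVulns: 260,
  lowVulns: 150,
  totalSoftware: 930,
  vulnerableSoftware: 41,
  lastUpdated: '2025-10-29T11:30:20+08:00',
};

const topVulnerability = {
  cveId: 'CVE-2024-0001',
  title: 'Example vulnerability description',
  severity: 'CRITICAL',
  score: 9.8,
  affectedDevices: 7,
};

// Sanity check: per-severity counts should add up to the overall total.
const severitySum =
  complianceSummary.criticalVulns + complianceSummary.highVulns +
  complianceSummary.mediumVulns + complianceSummary.lowVulns;
```

A consumer can use `severitySum` as a cheap consistency check before rendering dashboards.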
### 2. Repository Layer
Location: `src/main/java/com/psg/dlsysinfo/dl_sysinfo_server/repository/ReportingRepository.java`
#### Compliance Summary Queries (JPQL)
- `countTotalDevices()` - Count all devices for client
- `countVulnerableDevices()` - Count devices with vulnerabilities
- `countTotalVulnerabilities()` - Total vulnerability count
- `countCriticalVulnerabilities()` - CRITICAL severity count
- `countHighVulnerabilities()` - HIGH severity count
- `countMediumVulnerabilities()` - MEDIUM severity count
- `countLowVulnerabilities()` - LOW severity count
- `countTotalSoftware()` - Unique software package count
- `countVulnerableSoftware()` - Software with CVEs > 0
- `findLastVulnerabilityScanDate()` - Last scan timestamp
#### Advanced Queries (Native SQL)
- `findTopVulnerabilitiesNative()` - Top 20 vulnerabilities by severity and device count
  - Sorted: CRITICAL > HIGH > MEDIUM > LOW, then by affected device count
  - Returns: cveId, title, severity, score, affectedDevices
- `findVulnerableSoftwareNative()` - Top 20 vulnerable software by risk score
  - Risk formula: (vulnerableInstances / totalInstances) × totalCves
  - Returns: softwareName, totalInstances, vulnerableInstances, totalCves
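The two orderings can be expressed outside SQL as well. This sketch assumes only the semantics stated above (the severity rank and the risk formula), not the actual repository queries; the helper names are invented here:

```javascript
// Severity rank used for ordering: CRITICAL > HIGH > MEDIUM > LOW.
const SEVERITY_RANK = { CRITICAL: 1, HIGH: 2, MEDIUM: 3, LOW: 4 };

// Top vulnerabilities: sort by severity first, then by affected-device count (desc).
function compareVulns(a, b) {
  const bySeverity = (SEVERITY_RANK[a.severity] ?? 5) - (SEVERITY_RANK[b.severity] ?? 5);
  return bySeverity !== 0 ? bySeverity : b.affectedDevices - a.affectedDevices;
}

// Vulnerable software: risk = (vulnerableInstances / totalInstances) * totalCves.
function riskScore(totalInstances, vulnerableInstances, totalCves) {
  if (totalInstances === 0) return 0; // guard against division by zero
  return (vulnerableInstances / totalInstances) * totalCves;
}
```

For example, a package where 5 of 10 installs carry 4 CVEs scores `riskScore(10, 5, 4) === 2`.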
### 3. Service Layer
Location: `src/main/java/com/psg/dlsysinfo/dl_sysinfo_server/service/ReportingService.java`
#### Methods with Caching
1. **getComplianceSummary(clientId)**
   - Cache: 15 minutes
   - Aggregates multiple queries into a single DTO
   - Handles null values with defaults
2. **getTopVulnerabilities(clientId)**
   - Cache: 30 minutes (configurable in CacheConfig)
   - Maps native query results to DTOs
   - Returns top 20 vulnerabilities
3. **getVulnerableSoftware(clientId)**
   - Cache: 60 minutes (configurable in CacheConfig)
   - Maps native query results to DTOs
   - Returns top 20 vulnerable software packages
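On the Java side the caching is handled by Caffeine; as a rough model of the time-based expiry behaviour described above (the 15-minute TTL and 1000-entry cap come from this document, the class itself is purely illustrative):

```javascript
// Minimal time-based cache model. Caffeine's real eviction policy is
// smarter; this only illustrates TTL expiry and a size cap.
class TtlCache {
  constructor(ttlMs, maxSize = 1000) {
    this.ttlMs = ttlMs;
    this.maxSize = maxSize;
    this.entries = new Map(); // key -> { value, expiresAt }
  }
  get(key, now = Date.now()) {
    const e = this.entries.get(key);
    if (!e || e.expiresAt <= now) { this.entries.delete(key); return undefined; }
    return e.value;
  }
  set(key, value, now = Date.now()) {
    if (this.entries.size >= this.maxSize) {
      // Evict the oldest insertion when full.
      this.entries.delete(this.entries.keys().next().value);
    }
    this.entries.set(key, { value, expiresAt: now + this.ttlMs });
  }
}

const complianceSummaryCache = new TtlCache(15 * 60 * 1000); // 15-minute TTL
```

A stale entry is simply not returned after its TTL elapses, which is why the compliance summary can lag a scan by up to 15 minutes.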
### 4. Controller Layer
Location: `src/main/java/com/psg/dlsysinfo/dl_sysinfo_server/controller/ReportingController.java`
#### Endpoints
**GET /api/reporting/compliance-summary**
- Authentication: Required (`@PreAuthorize("isAuthenticated()")`)
- Returns: `ComplianceSummaryDTO`
- Filters by authenticated user's client
- Error handling: Returns 500 with error details on failure
**GET /api/reporting/top-vulnerabilities**
- Authentication: Required
- Returns: `List<TopVulnerabilityDTO>`
- Filters by authenticated user's client
- Error handling: Returns 500 with error details on failure
**GET /api/reporting/vulnerable-software**
- Authentication: Required
- Returns: `List<VulnerableSoftwareDTO>`
- Filters by authenticated user's client
- Error handling: Returns 500 with error details on failure
### 5. Caching Configuration
Location: `src/main/java/com/psg/dlsysinfo/dl_sysinfo_server/config/CacheConfig.java`
- **Provider**: Caffeine Cache
- **Configuration**:
  - Maximum size: 1000 entries per cache
  - Default TTL: 15 minutes
- Cache names: `complianceSummary`, `topVulnerabilities`, `vulnerableSoftware`
### 6. Build Dependencies
Location: `build.gradle`
Added dependencies:
```gradle
implementation 'org.springframework.boot:spring-boot-starter-cache'
implementation 'com.github.ben-manes.caffeine:caffeine'
```
## Data Flow
1. **Request Flow**:
```
Client Request
→ Controller (authentication/authorization)
→ Service (cache check)
→ Repository (database query)
→ Service (DTO mapping)
→ Controller (response)
```
2. **Security**:
   - All endpoints require authentication via JWT
   - Client isolation: queries automatically filter by the authenticated user's clientId
   - No cross-client data access is possible
3. **Performance Optimization**:
   - Multi-tiered caching (15min/30min/60min)
   - Native SQL for complex queries with LIMIT
   - Database queries use indexed columns (client_id, device_id, severity)
## Database Schema Requirements
### Tables Used
1. `devices` - Device inventory (indexed on client_id)
2. `cached_device_vulns` - Device-vulnerability junction (indexed on device_id, severity)
3. `cached_installed_software` - Software inventory (indexed on device_id, total_cves)
4. `clients` - Client/organization data
### Key Columns
- `client_id` - Organization identifier (FK in devices)
- `device_id` - Device identifier (FK in cached tables)
- `severity` - Vulnerability severity level
- `total_cves` - CVE count per software installation
- `last_updated` - Scan timestamp
## Error Handling
All endpoints include try-catch blocks that:
- Log errors with slf4j
- Return HTTP 500 with JSON error response
- Format: `{"error": "message", "code": 500}`
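That error body reduces to a one-line helper. The field names match the format above; the helper itself is hypothetical (the real controllers build the equivalent in Java):

```javascript
// Build the JSON error body described above: {"error": "message", "code": 500}.
function errorBody(message, code = 500) {
  return { error: message, code };
}

const payload = JSON.stringify(errorBody('query failed'));
```

Keeping one shape for every failure makes client-side error handling a single code path.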
## Testing Recommendations
1. **Unit Tests**:
   - Test repository queries with an H2 in-memory database
   - Mock the service layer for controller tests
   - Verify DTO mapping from Object[] arrays
2. **Integration Tests**:
   - Test with a realistic dataset (100+ devices)
   - Verify cache behavior
   - Test client isolation
3. **Load Tests**:
   - Verify performance under concurrent requests
   - Test cache effectiveness
   - Monitor query execution times
## Cache Invalidation Strategy
Currently, cache TTL is time-based. For production, consider:
1. Event-based invalidation on vulnerability scan completion
2. Manual cache eviction endpoint for admins
3. Cache warming on application startup
Example cache eviction (for future implementation):
```java
@CacheEvict(value = {"complianceSummary", "topVulnerabilities", "vulnerableSoftware"}, allEntries = true)
public void invalidateReportingCache() {
log.info("Reporting cache invalidated");
}
```
## Future Enhancements
1. **Pagination**: Add pagination support for top vulnerabilities and software lists
2. **Date Ranges**: Add time-based filtering for trend analysis
3. **Export**: Add CSV/PDF export functionality
4. **Webhooks**: Notify on critical vulnerability detection
5. **Custom Thresholds**: Allow clients to set custom severity thresholds
6. **Audit Logging**: Track all reporting API access for compliance
## Deployment Notes
1. Build the project: `./gradlew build`
2. Verify application starts without errors
3. Test endpoints with valid JWT token
4. Monitor cache hit rates in production
5. Adjust cache TTL based on scan frequency
## API Usage Examples
### Compliance Summary
```bash
curl -X GET https://your-domain:8443/api/reporting/compliance-summary \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```
### Top Vulnerabilities
```bash
curl -X GET https://your-domain:8443/api/reporting/top-vulnerabilities \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```
### Vulnerable Software
```bash
curl -X GET https://your-domain:8443/api/reporting/vulnerable-software \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```


@@ -60,6 +60,10 @@ dependencies {
// Email dependency for SMTP
implementation 'org.springframework.boot:spring-boot-starter-mail'
// Caching with Caffeine
implementation 'org.springframework.boot:spring-boot-starter-cache'
implementation 'com.github.ben-manes.caffeine:caffeine'
// Lombok (optional for reducing boilerplate)
compileOnly 'org.projectlombok:lombok'
annotationProcessor 'org.projectlombok:lombok'


@@ -1,6 +1,6 @@
DB_HOST=localhost
DB_USER=root
DB_PASSWORD=6DRR4xWvHBhSqLGtIOEKa7gHjKnX33Hf
DB_HOST=db.psg.net.au:3307
DB_USER=svc_sysinfo
DB_PASSWORD=2pT08pEuxqFiN6eD348vBlgoMfyfOjGB
DB_NAME=db_ld-spring-backend
NVD_API_KEY=42b4f093-e8c4-4110-a7d1-6ab2ba6234aa

5
scripts/.env.test Normal file

@@ -0,0 +1,5 @@
DB_HOST=db.psg.net.au
DB_PORT=3307
DB_USER=svc_sysinfo
DB_PASSWORD=2pT08pEuxqFiN6eD348vBlgoMfyfOjGB
DB_NAME=db_ld-spring-backend


@@ -1 +0,0 @@
2001-08-20T00:00:00.000Z


@@ -0,0 +1,169 @@
#!/usr/bin/env node
import mysql from 'mysql2/promise';
function log(msg) {
const now = new Date().toLocaleString('en-AU', {
day: '2-digit', month: 'short', year: 'numeric',
hour: '2-digit', minute: '2-digit', second: '2-digit', hour12: true
}).replace(/\b(AM|PM)\b/, m => m.toLowerCase());
console.log(`[${now}] ${msg}`);
}
async function analyzeDatabaseQuality() {
const db = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
});
log('🔍 Analyzing CVE Database Quality...\n');
log('━'.repeat(70));
// Total CVEs
const [total] = await db.query('SELECT COUNT(*) as count FROM cves');
log(`📊 Total CVEs: ${total[0].count.toLocaleString()}`);
// CVEs with CVSS v3 scores
const [v3] = await db.query('SELECT COUNT(*) as count FROM cves WHERE cvss_score_v3 IS NOT NULL');
const v3Pct = ((v3[0].count / total[0].count) * 100).toFixed(1);
log(`📈 CVEs with CVSS v3 scores: ${v3[0].count.toLocaleString()} (${v3Pct}%)`);
// CVEs with CVSS v2 scores
const [v2] = await db.query('SELECT COUNT(*) as count FROM cves WHERE cvss_score_v2 IS NOT NULL');
const v2Pct = ((v2[0].count / total[0].count) * 100).toFixed(1);
log(`📈 CVEs with CVSS v2 scores: ${v2[0].count.toLocaleString()} (${v2Pct}%)`);
// CVEs with CVSS v4 scores
const [v4] = await db.query('SELECT COUNT(*) as count FROM cves WHERE cvss_score_v4 IS NOT NULL');
const v4Pct = ((v4[0].count / total[0].count) * 100).toFixed(1);
log(`📈 CVEs with CVSS v4 scores: ${v4[0].count.toLocaleString()} (${v4Pct}%)`);
// CVEs with CWE IDs
const [cwe] = await db.query('SELECT COUNT(*) as count FROM cves WHERE cwe_ids IS NOT NULL AND cwe_ids != ""');
const cwePct = ((cwe[0].count / total[0].count) * 100).toFixed(1);
log(`🏷️ CVEs with CWE IDs: ${cwe[0].count.toLocaleString()} (${cwePct}%)`);
// CVEs with references
const [refs] = await db.query('SELECT COUNT(*) as count FROM cves WHERE `references` IS NOT NULL AND `references` != ""');
const refsPct = ((refs[0].count / total[0].count) * 100).toFixed(1);
log(`🔗 CVEs with references: ${refs[0].count.toLocaleString()} (${refsPct}%)`);
// KEV data
const [kev] = await db.query('SELECT COUNT(*) as count FROM kev_catalog');
log(`🛡️ Known Exploited Vulnerabilities (KEV): ${kev[0].count.toLocaleString()}`);
// Microsoft CVEs
const [msrc] = await db.query('SELECT COUNT(*) as count FROM microsoft_cves');
log(`🖥️ Microsoft CVEs: ${msrc[0].count.toLocaleString()}`);
// CPE matches
const [cpe] = await db.query('SELECT COUNT(*) as count FROM cpe_matches');
log(`💿 CPE matches (affected software): ${cpe[0].count.toLocaleString()}`);
log('\n━'.repeat(70));
log('📅 CVEs by Severity (CVSS v3):');
log('━'.repeat(70));
const [severity] = await db.query(`
SELECT
severity_v3,
COUNT(*) as count,
ROUND(AVG(cvss_score_v3), 1) as avg_score
FROM cves
WHERE severity_v3 IS NOT NULL
GROUP BY severity_v3
ORDER BY
CASE severity_v3
WHEN 'CRITICAL' THEN 1
WHEN 'HIGH' THEN 2
WHEN 'MEDIUM' THEN 3
WHEN 'LOW' THEN 4
ELSE 5
END
`);
severity.forEach(row => {
const pct = ((row.count / total[0].count) * 100).toFixed(1);
const icon = {
'CRITICAL': '🔴',
'HIGH': '🟠',
'MEDIUM': '🟡',
'LOW': '🟢'
}[row.severity_v3] || '⚪';
log(`${icon} ${(row.severity_v3 || 'UNKNOWN').padEnd(10)} ${row.count.toString().padStart(8)} (${pct.padStart(5)}%) - Avg: ${row.avg_score}`);
});
log('\n━'.repeat(70));
log('📅 Recent Activity (Last 30 Days):');
log('━'.repeat(70));
const [recent] = await db.query(`
SELECT COUNT(*) as count
FROM cves
WHERE last_modified_date >= DATE_SUB(NOW(), INTERVAL 30 DAY)
`);
log(`🆕 CVEs modified in last 30 days: ${recent[0].count.toLocaleString()}`);
const [recentPub] = await db.query(`
SELECT COUNT(*) as count
FROM cves
WHERE published_date >= DATE_SUB(NOW(), INTERVAL 30 DAY)
`);
log(`📝 CVEs published in last 30 days: ${recentPub[0].count.toLocaleString()}`);
log('\n━'.repeat(70));
log('🎯 Data Quality Score:');
log('━'.repeat(70));
const qualityScore = (
(parseFloat(v3Pct) * 0.3) + // 30% weight on CVSS v3
(parseFloat(cwePct) * 0.2) + // 20% weight on CWE
(parseFloat(refsPct) * 0.2) + // 20% weight on references
((cpe[0].count > 0 ? 100 : 0) * 0.15) + // 15% weight on CPE existence
((kev[0].count > 0 ? 100 : 0) * 0.15) // 15% weight on KEV data
);
log(`Overall Quality: ${qualityScore.toFixed(1)}%`);
if (qualityScore >= 90) log('✅ Excellent - Highly enriched database');
else if (qualityScore >= 75) log('✅ Good - Well enriched database');
else if (qualityScore >= 60) log('⚠️ Fair - Some enrichment needed');
else log('❌ Poor - Significant enrichment needed');
log('\n━'.repeat(70));
log('💡 Recommendations:');
log('━'.repeat(70));
if (parseFloat(v3Pct) < 80) {
log('⚠️ Run CVE enrichment to get more CVSS v3 scores');
log(' Use: POST /api/admin/scripts/fetch-cve (runs fetchCVE_v2.js in enrichment mode)');
}
if (parseFloat(cwePct) < 70) {
log('⚠️ Low CWE coverage - consider running enrichment');
}
if (kev[0].count < 1000) {
log('⚠️ KEV data seems low - run: POST /api/admin/scripts/fetch-kev');
}
if (msrc[0].count < 10000) {
log('⚠️ Microsoft CVE data seems low - run: POST /api/admin/scripts/fetch-msrc');
}
if (recent[0].count < 100) {
log('⚠️ No recent updates detected - run daily sync to stay current');
log(' Use: POST /api/admin/scripts/fetch-cve');
}
await db.end();
log('\n✅ Analysis complete!\n');
}
analyzeDatabaseQuality().catch(err => {
console.error('❌ Error:', err.message);
process.exit(1);
});
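The weighted quality score computed inline above reduces to a small pure function. The weights and thresholds are taken from the script; the names `qualityScore` and `qualityLabel` are invented here for clarity:

```javascript
// Weighted data-quality score; percentage inputs are 0-100 numbers.
function qualityScore({ v3Pct, cwePct, refsPct, hasCpe, hasKev }) {
  return (
    v3Pct * 0.3 +                // 30% weight on CVSS v3 coverage
    cwePct * 0.2 +               // 20% weight on CWE coverage
    refsPct * 0.2 +              // 20% weight on reference coverage
    (hasCpe ? 100 : 0) * 0.15 +  // 15% weight on CPE data existing
    (hasKev ? 100 : 0) * 0.15    // 15% weight on KEV data existing
  );
}

// Thresholds matching the script's verdict messages.
function qualityLabel(score) {
  if (score >= 90) return 'Excellent';
  if (score >= 75) return 'Good';
  if (score >= 60) return 'Fair';
  return 'Poor';
}
```

A fully enriched database (all percentages at 100, CPE and KEV present) scores 100 and is labelled Excellent.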

File diff suppressed because it is too large

1
scripts/cvelistV5 Submodule

Submodule scripts/cvelistV5 added at d18b6e1ab0


@@ -8,6 +8,7 @@ dotenv.config({ path: '.env.local' });
const DB = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,


@@ -66,6 +66,7 @@ function addDaysToISO(dateISO, days) {
const DB = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
@@ -109,18 +110,82 @@ async function fetchCVEPage(startIndex, startDate, endDate) {
async function processCVE(cveWrapper) {
const cve = cveWrapper.cve;
const cveId = cve.id;
const title = cve.titles?.find(t => t.lang === 'en')?.title || '';
const desc = cve.descriptions.find(d => d.lang === 'en')?.value ?? '';
const published = formatDate(cve.published);
const modified = formatDate(cve.lastModified);
const severity = cve.metrics?.cvssMetricV31?.[0]?.cvssData?.baseSeverity ?? null;
const score = cve.metrics?.cvssMetricV31?.[0]?.cvssData?.baseScore ?? null;
// CVSSv2
const metricV2 = cve.metrics?.cvssMetricV2?.[0];
const severityV2 = metricV2?.cvssData?.baseSeverity || null;
const scoreV2 = metricV2?.cvssData?.baseScore || null;
const vectorV2 = metricV2?.cvssData?.vectorString || '';
// CVSSv3
const metricV3 = cve.metrics?.cvssMetricV31?.[0];
const severityV3 = metricV3?.cvssData?.baseSeverity || null;
const scoreV3 = metricV3?.cvssData?.baseScore || null;
const vectorV3 = metricV3?.cvssData?.vectorString || '';
// CVSSv4
const metricV4 = cve.metrics?.cvssMetricV40?.[0] || cve.metrics?.cvssMetricV4?.[0];
const severityV4 = metricV4?.cvssData?.baseSeverity || null;
const scoreV4 = metricV4?.cvssData?.baseScore || null;
const vectorV4 = metricV4?.cvssData?.vectorString || '';
// CWE IDs
const cweIds = (cve.weaknesses || [])
.flatMap(w => w.description || [])
.filter(desc => desc.lang === 'en')
.map(desc => desc.value)
.join(',');
// References
const references = (cve.references || [])
.map(ref => ref.url)
.join(',');
// Tags
const cveTags = cve.cveMetadata?.cveTags || [];
const hasKev = cveTags.includes('Known_Exploited_Vulnerability');
const hasCertNotes = cveTags.includes('CERT-VN');
const hasCertAlerts = cveTags.includes('US-CERT-TA');
try {
await DB.execute(
`INSERT INTO cves (id, description, published_date, last_modified_date, severity, cvss_score)
VALUES (?, ?, ?, ?, ?, ?)
ON DUPLICATE KEY UPDATE last_modified_date = VALUES(last_modified_date)`,
[cveId, desc, published, modified, severity, score]
`INSERT INTO cves (
id, title, description, published_date, last_modified_date,
severity_v2, cvss_score_v2, cvss_vector_v2,
severity_v3, cvss_score_v3, cvss_vector_v3,
severity_v4, cvss_score_v4, cvss_vector_v4,
cwe_ids, \`references\`, hasKev, hasCertNotes, hasCertAlerts, source
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
ON DUPLICATE KEY UPDATE
last_modified_date = VALUES(last_modified_date),
title = IFNULL(title, VALUES(title)),
severity_v2 = IFNULL(severity_v2, VALUES(severity_v2)),
cvss_score_v2 = IFNULL(cvss_score_v2, VALUES(cvss_score_v2)),
cvss_vector_v2 = IFNULL(cvss_vector_v2, VALUES(cvss_vector_v2)),
severity_v3 = IFNULL(severity_v3, VALUES(severity_v3)),
cvss_score_v3 = IFNULL(cvss_score_v3, VALUES(cvss_score_v3)),
cvss_vector_v3 = IFNULL(cvss_vector_v3, VALUES(cvss_vector_v3)),
severity_v4 = IFNULL(severity_v4, VALUES(severity_v4)),
cvss_score_v4 = IFNULL(cvss_score_v4, VALUES(cvss_score_v4)),
cvss_vector_v4 = IFNULL(cvss_vector_v4, VALUES(cvss_vector_v4)),
cwe_ids = IFNULL(cwe_ids, VALUES(cwe_ids)),
\`references\` = IFNULL(\`references\`, VALUES(\`references\`)),
hasKev = VALUES(hasKev),
hasCertNotes = VALUES(hasCertNotes),
hasCertAlerts = VALUES(hasCertAlerts),
source = VALUES(source)
`,
[
cveId, title, desc, published, modified,
severityV2, scoreV2, vectorV2,
severityV3, scoreV3, vectorV3,
severityV4, scoreV4, vectorV4,
cweIds, references, hasKev ? 1 : 0, hasCertNotes ? 1 : 0, hasCertAlerts ? 1 : 0, 'NVD'
]
);
} catch (err) {
log(`❌ Error inserting CVE ${cveId}: ${err.message}`);


@@ -14,6 +14,7 @@ const logFile = fs.createWriteStream('cve-sync.log', {
const RESUME_FILE = '.enrichment_resume';
const DB = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
@@ -450,6 +451,7 @@ async function importCVEEnrichmentFast() {
// This script runs enrichment mode - use fetchCVE_withMORE.js for backfill
importCVEEnrichmentFast().catch((err) => {
log(`❌ Fatal error during enrichment: ${err.message}`);
logFile.end();


@@ -77,6 +77,7 @@ function addDaysToISO(dateISO, days) {
const DB = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
@@ -222,21 +223,28 @@ async function importCVEFeed() {
}
async function importCVEFeedBackfill() {
const now = new Date();
const resumeFrom = loadLastSyncedDate();
let startFrom = resumeFrom ? new Date(resumeFrom) : now;
const EARLIEST_CVE_DATE = new Date('2002-01-01T00:00:00.000Z');
const MAX_RANGE_DAYS = 120;
const resumeFrom = loadLastSyncedDate();
let currentStart = resumeFrom ? new Date(resumeFrom) : EARLIEST_CVE_DATE;
log(resumeFrom
? `🔁 Resuming CVE backfill from ${formatShortDate(startFrom.toISOString())}`
: `⏮️ Starting CVE backfill from today (${formatShortDate(startFrom.toISOString())})`
? `🔁 Resuming CVE backfill from ${formatShortDate(currentStart.toISOString())}`
: `⏮️ Starting CVE backfill from ${formatShortDate(EARLIEST_CVE_DATE.toISOString())}`
);
while (true) {
const end = new Date(startFrom);
const start = new Date(startFrom);
start.setDate(start.getDate() - MAX_RANGE_DAYS + 1); // 120-day window
const now = new Date();
while (currentStart < now) {
const start = new Date(currentStart);
const end = new Date(currentStart);
end.setDate(end.getDate() + MAX_RANGE_DAYS - 1); // 120-day window
// Don't go past today
if (end > now) {
end.setTime(now.getTime());
}
const startISO = start.toISOString();
const endISO = end.toISOString();
@@ -259,7 +267,7 @@ async function importCVEFeedBackfill() {
break;
}
log(`📄 Page ${++pageCount}: ${vulnerabilities.length} CVEs from index ${startIndex}`);
log(`📄 Page ${++pageCount}: ${vulnerabilities.length} CVEs from index ${startIndex} of ~${totalResults}`);
for (const vuln of vulnerabilities) {
await processCVE(vuln);
@@ -269,16 +277,20 @@ async function importCVEFeedBackfill() {
await new Promise((r) => setTimeout(r, 6000));
} while (startIndex < totalResults);
// Move the window backward
saveLastSyncedDate(start.toISOString());
startFrom = start;
// Move the window forward
currentStart.setDate(currentStart.getDate() + MAX_RANGE_DAYS);
saveLastSyncedDate(currentStart.toISOString());
log(`✅ Completed ${humanRange}. Next start: ${formatShortDate(currentStart.toISOString())}`);
} catch (err) {
log(`❌ Error during ${humanRange}: ${err.message}`);
log(`💾 Progress saved. You can restart to resume from ${formatShortDate(currentStart.toISOString())}`);
break;
}
if (start < new Date('2002-01-01')) {
log(`🛑 Reached earliest supported CVE publication date — halting backfill.`);
// Check if we've reached today
if (currentStart >= now) {
log(`🎉 Reached current date — backfill complete!`);
break;
}
}
@@ -290,9 +302,8 @@ async function importCVEFeedBackfill() {
//importCVEFeed().catch((err) => {
importCVEFeedBackfill(9000) // ~25 years (goes back to 2000)
.catch((err) => {
// Use importCVEFeed() for daily sync or importCVEFeedBackfill() for full backfill
importCVEFeedBackfill().catch((err) => {
log(`❌ Fatal error during import: ${err.message}`);
logFile.end();
});
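The forward-window logic this diff introduces can be factored into a pure generator for testing. This sketch assumes the same constants (2002-01-01 earliest date, 120-day windows, clamping the final window to "now"); unlike the script's local-time `setDate`, it uses UTC date arithmetic:

```javascript
// Yield [start, end] Date pairs covering resumeFrom (or the earliest
// supported CVE date) through "now" in forward 120-day windows.
const EARLIEST_CVE_DATE = new Date('2002-01-01T00:00:00.000Z');
const MAX_RANGE_DAYS = 120;

function* backfillWindows(resumeFrom, now) {
  let currentStart = new Date(resumeFrom ?? EARLIEST_CVE_DATE);
  while (currentStart < now) {
    const start = new Date(currentStart);
    const end = new Date(currentStart);
    end.setUTCDate(end.getUTCDate() + MAX_RANGE_DAYS - 1); // 120-day window
    if (end > now) end.setTime(now.getTime()); // don't go past today
    yield [start, end];
    currentStart = new Date(currentStart);
    currentStart.setUTCDate(currentStart.getUTCDate() + MAX_RANGE_DAYS);
  }
}
```

Separating the window arithmetic from the NVD fetch loop makes the off-by-one behaviour (119-day inclusive span, next window starting 120 days later) easy to verify in isolation.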


@@ -7,6 +7,7 @@ const KEV_URL = 'https://www.cisa.gov/sites/default/files/feeds/known_exploited_
const DB = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,


@@ -9,6 +9,7 @@ dotenv.config({ path: '.env.local' });
const DB = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,

198
scripts/verifyCVECount.js Executable file

@@ -0,0 +1,198 @@
#!/usr/bin/env node
import fs from 'fs';
import { execSync } from 'child_process';
import mysql from 'mysql2/promise';
import path from 'path';
const REPO_URL = 'https://github.com/CVEProject/cvelistV5.git';
const REPO_DIR = './cvelistV5';
function log(msg) {
const now = new Date().toLocaleString('en-AU', {
day: '2-digit', month: 'short', year: 'numeric',
hour: '2-digit', minute: '2-digit', second: '2-digit', hour12: true
}).replace(/\b(AM|PM)\b/, m => m.toLowerCase());
const line = `[${now}] ${msg}`;
console.log(line);
}
async function cloneOrPullRepo() {
log('📦 Checking CVE repository...');
if (fs.existsSync(REPO_DIR)) {
log('🔄 Repository exists, pulling latest changes...');
try {
execSync('git pull', { cwd: REPO_DIR, stdio: 'inherit' });
log('✅ Repository updated');
} catch (err) {
log(`⚠️ Git pull failed, trying fresh clone: ${err.message}`);
execSync(`rm -rf ${REPO_DIR}`);
execSync(`git clone --depth 1 ${REPO_URL} ${REPO_DIR}`, { stdio: 'inherit' });
}
} else {
log('📥 Cloning CVE repository (this may take a while)...');
execSync(`git clone --depth 1 ${REPO_URL} ${REPO_DIR}`, { stdio: 'inherit' });
log('✅ Repository cloned');
}
}
function countCVEsInRepo() {
log('🔍 Counting CVEs in repository...');
const cvesDir = path.join(REPO_DIR, 'cves');
let totalCount = 0;
const yearCounts = {};
if (!fs.existsSync(cvesDir)) {
log('❌ CVEs directory not found');
return { total: 0, byYear: {} };
}
// Iterate through year directories (e.g., cves/2023/)
const years = fs.readdirSync(cvesDir).filter(f => /^\d{4}$/.test(f));
for (const year of years) {
const yearPath = path.join(cvesDir, year);
if (!fs.statSync(yearPath).isDirectory()) continue;
let yearCount = 0;
// Each year has subdirectories like 0xxx, 1xxx, etc.
const subdirs = fs.readdirSync(yearPath);
for (const subdir of subdirs) {
const subdirPath = path.join(yearPath, subdir);
if (!fs.statSync(subdirPath).isDirectory()) continue;
// Count .json files in this subdirectory
const files = fs.readdirSync(subdirPath).filter(f => f.endsWith('.json'));
yearCount += files.length;
}
totalCount += yearCount;
yearCounts[year] = yearCount;
}
log(`📊 Found ${totalCount} CVE JSON files in repository`);
return { total: totalCount, byYear: yearCounts };
}
async function countCVEsInDatabase() {
log('🗄️ Counting CVEs in database...');
const db = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
});
// Total count
const [totalRows] = await db.query('SELECT COUNT(*) as total FROM cves');
const total = totalRows[0].total;
// Count by year
const [yearRows] = await db.query(`
SELECT
YEAR(published_date) as year,
COUNT(*) as count
FROM cves
WHERE published_date IS NOT NULL
GROUP BY YEAR(published_date)
ORDER BY year
`);
const byYear = {};
yearRows.forEach(row => {
byYear[row.year] = row.count;
});
await db.end();
log(`📊 Found ${total} CVEs in database`);
return { total, byYear };
}
function compareResults(repo, db) {
log('\n📋 Comparison Report:');
log('━'.repeat(60));
log(`Repository Total: ${repo.total.toLocaleString()}`);
log(`Database Total: ${db.total.toLocaleString()}`);
log(`Difference: ${(db.total - repo.total).toLocaleString()}`);
log('━'.repeat(60));
// Get all years from both sources
const allYears = new Set([
...Object.keys(repo.byYear),
...Object.keys(db.byYear)
]);
const sortedYears = Array.from(allYears).sort();
log('\n📅 Year-by-Year Breakdown:');
log('━'.repeat(60));
log('Year | Repository | Database | Difference');
log('━'.repeat(60));
const missingYears = [];
for (const year of sortedYears) {
const repoCount = repo.byYear[year] || 0;
const dbCount = db.byYear[year] || 0;
const diff = dbCount - repoCount;
const diffStr = diff >= 0 ? `+${diff}` : diff.toString();
log(`${year} | ${repoCount.toString().padStart(10)} | ${dbCount.toString().padStart(9)} | ${diffStr}`);
if (Math.abs(diff) > 100) {
missingYears.push({ year, repoCount, dbCount, diff });
}
}
log('━'.repeat(60));
if (missingYears.length > 0) {
log('\n⚠ Years with significant differences (>100):');
missingYears.forEach(({ year, repoCount, dbCount, diff }) => {
log(` ${year}: ${diff > 0 ? 'Extra' : 'Missing'} ${Math.abs(diff)} CVEs`);
});
}
const percentComplete = repo.total > 0
? ((db.total / repo.total) * 100).toFixed(2)
: '0.00';
log(`\n✅ Database is ${percentComplete}% complete`);
if (db.total >= repo.total) {
log('🎉 Your database has all CVEs from the official repository!');
} else {
log(`⚠️ Missing ${(repo.total - db.total).toLocaleString()} CVEs`);
}
}
async function main() {
try {
log('🚀 Starting CVE verification...\n');
// Step 1: Clone or pull repo
await cloneOrPullRepo();
// Step 2: Count CVEs in repo
const repoStats = countCVEsInRepo();
// Step 3: Count CVEs in database
const dbStats = await countCVEsInDatabase();
// Step 4: Compare
compareResults(repoStats, dbStats);
log('\n✅ Verification complete!');
} catch (err) {
log(`❌ Error: ${err.message}`);
console.error(err);
process.exit(1);
}
}
main();
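For reference, the directory walk above depends on how cvelistV5 buckets records into per-thousand `NNxxx` subdirectories. That layout can be sketched as follows (`cvePathFor` is a hypothetical helper for illustration, not part of the script):

```javascript
// Sketch: cvelistV5 shards records into cves/<year>/<NNxxx>/ buckets,
// where NN is the CVE number with its last three digits dropped.
function cvePathFor(cveId) {
  const match = cveId.match(/^CVE-(\d{4})-(\d{4,})$/);
  if (!match) throw new Error(`Not a CVE ID: ${cveId}`);
  const [, year, num] = match;
  const bucket = `${num.slice(0, -3)}xxx`; // "0001" -> "0xxx", "12345" -> "12xxx"
  return `cves/${year}/${bucket}/${cveId}.json`;
}
console.log(cvePathFor('CVE-2023-12345')); // cves/2023/12xxx/CVE-2023-12345.json
```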

View File

@@ -0,0 +1,209 @@
#!/usr/bin/env node
import axios from 'axios';
import mysql from 'mysql2/promise';
function log(msg) {
const now = new Date().toLocaleString('en-AU', {
day: '2-digit', month: 'short', year: 'numeric',
hour: '2-digit', minute: '2-digit', second: '2-digit', hour12: true
}).replace(/\b(AM|PM)\b/, m => m.toLowerCase());
const line = `[${now}] ${msg}`;
console.log(line);
}
async function getGitHubCVEStats() {
log('📡 Fetching CVE statistics from GitHub API...');
try {
// Get repository tree (cves directory structure)
const response = await axios.get(
'https://api.github.com/repos/CVEProject/cvelistV5/git/trees/main?recursive=1',
{
headers: {
'Accept': 'application/vnd.github.v3+json',
'User-Agent': 'CVE-Verification-Script'
}
}
);
const tree = response.data.tree;
// The trees API silently caps very large listings; flag truncation so the
// counts below are not mistaken for a complete inventory.
if (response.data.truncated) {
log('⚠️ GitHub tree listing was truncated — counts may be incomplete.');
}
// Count .json files in cves/ directory
const cveFiles = tree.filter(item =>
item.path.startsWith('cves/') &&
item.path.endsWith('.json') &&
item.type === 'blob'
);
// Group by year
const byYear = {};
cveFiles.forEach(file => {
const match = file.path.match(/cves\/(\d{4})\//);
if (match) {
const year = match[1];
byYear[year] = (byYear[year] || 0) + 1;
}
});
log(`✅ Found ${cveFiles.length} CVE files in GitHub repository`);
return {
total: cveFiles.length,
byYear,
lastCommit: response.data.sha
};
} catch (err) {
log(`❌ GitHub API error: ${err.message}`);
if (err.response?.status === 403) {
log('⚠️ GitHub API rate limit exceeded. Try again later or clone the repository.');
}
throw err;
}
}
async function countCVEsInDatabase() {
log('🗄️ Counting CVEs in database...');
const db = await mysql.createConnection({
host: process.env.DB_HOST,
port: process.env.DB_PORT || 3306,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME,
});
// Total count
const [totalRows] = await db.query('SELECT COUNT(*) as total FROM cves');
const total = totalRows[0].total;
// Count by year
const [yearRows] = await db.query(`
SELECT
YEAR(published_date) as year,
COUNT(*) as count
FROM cves
WHERE published_date IS NOT NULL
GROUP BY YEAR(published_date)
ORDER BY year
`);
const byYear = {};
yearRows.forEach(row => {
byYear[row.year] = row.count;
});
// Get date range
const [rangeRows] = await db.query(`
SELECT
MIN(published_date) as earliest,
MAX(published_date) as latest
FROM cves
`);
await db.end();
log(`✅ Found ${total} CVEs in database`);
return {
total,
byYear,
earliest: rangeRows[0].earliest,
latest: rangeRows[0].latest
};
}
function compareResults(github, db) {
log('\n📋 Comparison Report:');
log('━'.repeat(70));
log(`GitHub Repository: ${github.total.toLocaleString()} CVEs`);
log(`Your Database: ${db.total.toLocaleString()} CVEs`);
log(`Difference: ${(db.total - github.total).toLocaleString()}`);
log('━'.repeat(70));
log(`\nDatabase Date Range:`);
log(` Earliest: ${db.earliest ? new Date(db.earliest).toLocaleDateString() : 'N/A'}`);
log(` Latest: ${db.latest ? new Date(db.latest).toLocaleDateString() : 'N/A'}`);
// Get all years
const allYears = new Set([
...Object.keys(github.byYear),
...Object.keys(db.byYear)
]);
const sortedYears = Array.from(allYears).sort();
log('\n📅 Year-by-Year Breakdown:');
log('━'.repeat(70));
log('Year | GitHub | Database | Difference | % Complete');
log('━'.repeat(70));
const significantDiffs = [];
for (const year of sortedYears) {
const githubCount = github.byYear[year] || 0;
const dbCount = db.byYear[year] || 0;
const diff = dbCount - githubCount;
const pctComplete = githubCount > 0 ? ((dbCount / githubCount) * 100).toFixed(1) : '0.0';
const diffStr = diff >= 0 ? `+${diff}` : diff.toString();
log(`${year} | ${githubCount.toString().padStart(10)} | ${dbCount.toString().padStart(10)} | ${diffStr.padStart(10)} | ${pctComplete.padStart(6)}%`);
if (Math.abs(diff) > 100) {
significantDiffs.push({ year, githubCount, dbCount, diff });
}
}
log('━'.repeat(70));
if (significantDiffs.length > 0) {
log('\n⚠ Years with significant differences (>100):');
significantDiffs.forEach(({ year, githubCount, dbCount, diff }) => {
if (diff < 0) {
log(` ${year}: Missing ${Math.abs(diff)} CVEs (${dbCount}/${githubCount})`);
} else {
log(` ${year}: Extra ${diff} CVEs (database has more than GitHub)`);
}
});
}
const percentComplete = github.total > 0
? ((db.total / github.total) * 100).toFixed(2)
: '0.00';
log(`\n📊 Overall Completion: ${percentComplete}%`);
if (db.total >= github.total) {
log('🎉 Your database has all CVEs from the official GitHub repository!');
if (db.total > github.total) {
log(' (You may have older CVEs or modified entries not in the current repository)');
}
} else {
const missing = github.total - db.total;
log(`⚠️ Missing ${missing.toLocaleString()} CVEs from GitHub repository`);
log(` Run the backfill script to sync missing CVEs.`);
}
}
async function main() {
try {
log('🚀 Starting CVE verification using GitHub API...\n');
// Step 1: Get stats from GitHub API
const githubStats = await getGitHubCVEStats();
// Step 2: Count CVEs in database
const dbStats = await countCVEsInDatabase();
// Step 3: Compare
compareResults(githubStats, dbStats);
log('\n✅ Verification complete!');
} catch (err) {
log(`❌ Error: ${err.message}`);
console.error(err);
process.exit(1);
}
}
main();
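One hedged addition worth considering: unauthenticated GitHub API calls are limited to 60 requests/hour (5,000 when authenticated), which is what trips the 403 branch above. A minimal sketch, assuming an optional `GITHUB_TOKEN` environment variable (`githubHeaders` is a hypothetical helper, not part of the script):

```javascript
// Build the request headers used by the verification call, attaching a
// bearer token when one is available to lift the API rate limit.
function githubHeaders() {
  const headers = {
    'Accept': 'application/vnd.github.v3+json',
    'User-Agent': 'CVE-Verification-Script',
  };
  if (process.env.GITHUB_TOKEN) {
    headers['Authorization'] = `Bearer ${process.env.GITHUB_TOKEN}`;
  }
  return headers;
}
```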

View File

@@ -0,0 +1,37 @@
package com.psg.dlsysinfo.dl_sysinfo_server.config;
import com.github.benmanes.caffeine.cache.Caffeine;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.caffeine.CaffeineCacheManager;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import java.util.concurrent.TimeUnit;
@Configuration
@EnableCaching
public class CacheConfig {
@Bean
public CacheManager cacheManager() {
CaffeineCacheManager cacheManager = new CaffeineCacheManager(
"complianceSummary",
"topVulnerabilities",
"vulnerableSoftware"
);
cacheManager.setCaffeine(Caffeine.newBuilder()
.maximumSize(1000)
.expireAfterWrite(15, TimeUnit.MINUTES));
return cacheManager;
}
@Bean
public Caffeine<Object, Object> caffeineConfig() {
return Caffeine.newBuilder()
.maximumSize(1000)
.expireAfterWrite(15, TimeUnit.MINUTES);
}
}

View File

@@ -68,6 +68,16 @@ public class AdminController {
return ResponseEntity.ok(userService.getAllDecryptedUsers());
}
@PutMapping("/users/{userId}")
public ResponseEntity<UserDTO> updateUser(
@PathVariable Long userId,
@RequestBody UserDTO userDto,
@AuthenticationPrincipal CurrentUser user
) {
UserDTO updatedUser = userService.updateUser(userId, userDto);
return ResponseEntity.ok(updatedUser);
}
@PutMapping("/users/{userId}/enabled")
public ResponseEntity<Void> setUserEnabled(
@PathVariable Long userId,

View File

@@ -7,6 +7,7 @@ import com.psg.dlsysinfo.dl_sysinfo_server.entity.UserAuth;
import com.psg.dlsysinfo.dl_sysinfo_server.security.CurrentUser;
import com.psg.dlsysinfo.dl_sysinfo_server.security.EncryptionService;
import com.psg.dlsysinfo.dl_sysinfo_server.security.JwtUtil;
import com.psg.dlsysinfo.dl_sysinfo_server.security.TokenResolver;
import com.psg.dlsysinfo.dl_sysinfo_server.service.UserService;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;
@@ -33,6 +34,7 @@ public class AuthController {
private final JwtUtil jwtUtil;
private final UserService userAuthService;
private final EncryptionService encryptionService;
private final TokenResolver tokenResolver;
@Autowired
private PasswordEncoder passwordEncoder;
@@ -41,11 +43,13 @@ public class AuthController {
JwtUtil jwtUtil,
UserService userAuthService,
EncryptionService encryptionService,
TokenResolver tokenResolver,
Environment environment) {
this.authenticationManager = authenticationManager;
this.jwtUtil = jwtUtil;
this.userAuthService = userAuthService;
this.encryptionService = encryptionService;
this.tokenResolver = tokenResolver;
}
private static String stripBOM(String s) {
@@ -162,6 +166,36 @@ public class AuthController {
return ResponseEntity.status(HttpStatus.BAD_REQUEST).body("Incorrect current password");
}
}
@PostMapping("/refresh")
public ResponseEntity<?> refreshToken(HttpServletRequest request, HttpServletResponse response) {
String token = tokenResolver.resolveToken(request);
if (token != null && jwtUtil.validateToken(token)) {
String username = jwtUtil.extractUsername(token);
String displayName = jwtUtil.extractDisplayName(token);
String clientIdentifier = jwtUtil.extractClientIdentifier(token);
Long userId = jwtUtil.extractUserId(token);
List<String> roles = jwtUtil.extractRoles(token);
// Generate new token with extended expiry
String newToken = jwtUtil.generateToken(username, displayName, clientIdentifier, userId, roles);
// Set new cookie
ResponseCookie cookie = ResponseCookie.from("authToken", newToken)
.httpOnly(true)
.secure(true)
.path("/")
.sameSite("None")
.maxAge(60 * 60)
.domain(resolveCookieDomain(request))
.build();
response.addHeader(HttpHeaders.SET_COOKIE, cookie.toString());
return ResponseEntity.ok(Map.of("message", "Token refreshed"));
}
return ResponseEntity.status(401).body(Map.of("error", "Invalid token"));
}
}
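On the client side, the new endpoint has to be called with credentials so the HttpOnly `authToken` cookie is sent and then replaced by the `Set-Cookie` response. A minimal sketch, assuming the controller is mapped under `/api/auth` (the request-mapping prefix is not shown in this hunk); `fetchImpl` is injectable purely so the helper can be exercised without a network:

```javascript
// Hypothetical client helper: POST to the refresh endpoint with
// credentials included so the browser sends (and then replaces) the
// HttpOnly authToken cookie.
async function refreshSession(baseUrl, fetchImpl = fetch) {
  const res = await fetchImpl(`${baseUrl}/api/auth/refresh`, {
    method: 'POST',
    credentials: 'include',
  });
  if (!res.ok) throw new Error('Token refresh failed — re-authenticate');
  return res.json(); // e.g. { message: 'Token refreshed' }
}
```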

View File

@@ -0,0 +1,92 @@
package com.psg.dlsysinfo.dl_sysinfo_server.controller;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.ComplianceSummaryDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.TopVulnerabilityDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.VulnerableSoftwareDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.ClientRepository;
import com.psg.dlsysinfo.dl_sysinfo_server.security.CurrentUser;
import com.psg.dlsysinfo.dl_sysinfo_server.service.ReportingService;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.http.ResponseEntity;
import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.security.core.annotation.AuthenticationPrincipal;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;
import java.util.Map;
@RestController
@RequestMapping("/api/reporting")
@RequiredArgsConstructor
@Slf4j
public class ReportingController {
private final ReportingService reportingService;
private final ClientRepository clientRepository;
/**
* GET /api/reporting/compliance-summary
* Provides high-level security compliance metrics for the executive dashboard
*/
@PreAuthorize("isAuthenticated()")
@GetMapping("/compliance-summary")
public ResponseEntity<?> getComplianceSummary(@AuthenticationPrincipal CurrentUser user) {
try {
var client = clientRepository.findByClientIdentifier(user.getClientIdentifier())
.orElseThrow(() -> new RuntimeException("Client not found"));
ComplianceSummaryDTO summary = reportingService.getComplianceSummary(client.getClientId());
return ResponseEntity.ok(summary);
} catch (Exception e) {
log.error("Failed to generate compliance summary", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "Failed to generate compliance report", "code", 500));
}
}
/**
* GET /api/reporting/top-vulnerabilities
* Provides a list of the most critical vulnerabilities for detailed reporting
*/
@PreAuthorize("isAuthenticated()")
@GetMapping("/top-vulnerabilities")
public ResponseEntity<?> getTopVulnerabilities(@AuthenticationPrincipal CurrentUser user) {
try {
var client = clientRepository.findByClientIdentifier(user.getClientIdentifier())
.orElseThrow(() -> new RuntimeException("Client not found"));
List<TopVulnerabilityDTO> vulnerabilities = reportingService.getTopVulnerabilities(client.getClientId());
return ResponseEntity.ok(vulnerabilities);
} catch (Exception e) {
log.error("Failed to fetch top vulnerabilities", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "Failed to generate compliance report", "code", 500));
}
}
/**
* GET /api/reporting/vulnerable-software
* Identifies the software packages posing the highest security risk
*/
@PreAuthorize("isAuthenticated()")
@GetMapping("/vulnerable-software")
public ResponseEntity<?> getVulnerableSoftware(@AuthenticationPrincipal CurrentUser user) {
try {
var client = clientRepository.findByClientIdentifier(user.getClientIdentifier())
.orElseThrow(() -> new RuntimeException("Client not found"));
List<VulnerableSoftwareDTO> software = reportingService.getVulnerableSoftware(client.getClientId());
return ResponseEntity.ok(software);
} catch (Exception e) {
log.error("Failed to fetch vulnerable software", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "Failed to generate compliance report", "code", 500));
}
}
}

View File

@@ -42,55 +42,81 @@ public class ScriptController {
@Value("${nvd.max-range-days:7}")
private String nvdMaxRangeDays;
private final File cveLogFile = new File("scripts/cve-sync.log");
private final File kevLogFile = new File("scripts/kev-sync.log");
private final File msrcLogFile = new File("scripts/msrc-sync.log");
@Value("${scripts.directory:/home/sonder/ld-sysinfo-server/scripts}")
private String scriptsDirectory;
@Value("${scripts.logs.directory:/home/sonder/ld-sysinfo-server/scripts}")
private String logsDirectory;
private File getCveLogFile() {
return new File(logsDirectory, "cve-sync.log");
}
private File getKevLogFile() {
return new File(logsDirectory, "kev-sync.log");
}
private File getMsrcLogFile() {
return new File(logsDirectory, "msrc-sync.log");
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-cve")
public ResponseEntity<String> runCveScript(@AuthenticationPrincipal Object user) {
return triggerScript("fetchCVE.js", "📡 CVE sync launched in background.", cveLogFile);
return triggerScript("fetchCVE_v2.js", "📡 CVE enrichment sync launched (runs importCVEEnrichmentFast).", getCveLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-cve-backfill")
public ResponseEntity<String> runCveBackfillScript(@AuthenticationPrincipal Object user) {
return triggerScript("fetchCVE_withMORE.js", "📡 CVE backfill launched - will sync back to 2002.", getCveLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-kev")
public ResponseEntity<String> runKevScript(@AuthenticationPrincipal Object user) {
return triggerScript("fetchKEV.js", "📡 KEV sync launched in background.", kevLogFile);
return triggerScript("fetchKEV.js", "📡 KEV sync launched in background.", getKevLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-msrc")
public ResponseEntity<String> runMsrcScript(@AuthenticationPrincipal Object user) {
return triggerScript("enrichCVE_MSRC.js", "📡 MSRC sync launched in background.", msrcLogFile);
return triggerScript("enrichCVE_MSRC.js", "📡 MSRC sync launched in background.", getMsrcLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/verify-cve-count")
public ResponseEntity<String> verifyCveCount(@AuthenticationPrincipal Object user) {
return triggerScript("verifyCVECountAPI.js", "🔍 CVE verification started - comparing with GitHub API.", getCveLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@GetMapping("/fetch-cve/logs")
public ResponseEntity<String> fetchLogs(@AuthenticationPrincipal Object user) {
return readLogs(cveLogFile);
return readLogs(getCveLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-cve/clear-logs")
public ResponseEntity<String> clearLogs(@AuthenticationPrincipal Object user) {
return clearLogs(cveLogFile);
return clearLogs(getCveLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-kev/clear-logs")
public ResponseEntity<String> clearKevLogs(@AuthenticationPrincipal Object user) {
return clearLogs(kevLogFile);
return clearLogs(getKevLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@PostMapping("/fetch-msrc/clear-logs")
public ResponseEntity<String> clearMsrcLogs(@AuthenticationPrincipal Object user) {
return clearLogs(msrcLogFile);
return clearLogs(getMsrcLogFile());
}
@PreAuthorize("hasRole('ADMIN')")
@GetMapping(value = "/fetch-cve/logs/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> streamLogs(@AuthenticationPrincipal Object user) {
Path logFile = Paths.get("scripts/cve-sync.log");
Path logFile = getCveLogFile().toPath();
return Flux.interval(Duration.ofSeconds(1))
.map(tick -> {
@@ -110,7 +136,7 @@ public class ScriptController {
@PreAuthorize("hasRole('ADMIN')")
@GetMapping(value = "/fetch-kev/logs/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> streamKevLogs(@AuthenticationPrincipal Object user) {
Path logFile = Paths.get("scripts/kev-sync.log");
Path logFile = getKevLogFile().toPath();
return Flux.<String>create(emitter -> {
final long[] lastKnownPosition = {0};
@@ -155,7 +181,7 @@ public class ScriptController {
@PreAuthorize("hasRole('ADMIN')")
@GetMapping(value = "/fetch-msrc/logs/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> streamMsrcLogs(@AuthenticationPrincipal Object user) {
Path logFile = Paths.get("scripts/msrc-sync.log");
Path logFile = getMsrcLogFile().toPath();
return Flux.interval(Duration.ofSeconds(1))
.map(tick -> {
@@ -174,7 +200,8 @@ public class ScriptController {
private ResponseEntity<String> triggerScript(String scriptName, String message, File targetLogFile) {
File scriptFile = new File("scripts", scriptName);
File scriptDir = new File(scriptsDirectory);
File scriptFile = new File(scriptDir, scriptName);
if (!scriptFile.exists()) {
return ResponseEntity.status(404).body("" + scriptName + " not found at: " + scriptFile.getAbsolutePath());
}
@@ -184,7 +211,7 @@ public class ScriptController {
}
private void runNodeScript(String scriptName, String startMessage, File logTarget) {
File scriptDir = new File("scripts");
File scriptDir = new File(scriptsDirectory);
File scriptFile = new File(scriptDir, scriptName);
if (!scriptFile.exists()) {
@@ -204,6 +231,7 @@ public class ScriptController {
Map<String, String> env = builder.environment();
env.put("DB_HOST", extractHost(dbUrl));
env.put("DB_PORT", extractPort(dbUrl));
env.put("DB_NAME", extractDbName(dbUrl));
env.put("DB_USER", dbUser);
env.put("DB_PASSWORD", dbPass);
@@ -235,7 +263,37 @@ public class ScriptController {
}
private String extractHost(String url) {
return url.replace("jdbc:mysql://", "").split(":")[0].split("/")[0];
String clean = url.replace("jdbc:mysql://", "").split("/")[0];
// Handle IPv6 addresses like [::1]:3307
if (clean.startsWith("[")) {
int closeBracket = clean.indexOf("]");
if (closeBracket > 0) {
return clean.substring(1, closeBracket);
}
}
// Handle standard host:port format
return clean.split(":")[0];
}
private String extractPort(String url) {
String clean = url.replace("jdbc:mysql://", "").split("/")[0];
// Handle IPv6: [::1]:3307
if (clean.contains("]:")) {
return clean.substring(clean.indexOf("]:") + 2);
}
// Handle standard: hostname:3307
if (clean.contains(":") && !clean.startsWith("[")) {
String[] parts = clean.split(":");
if (parts.length > 1) {
return parts[1];
}
}
return "3306"; // default MySQL port
}
private String extractDbName(String url) {

View File

@@ -2,12 +2,14 @@ package com.psg.dlsysinfo.dl_sysinfo_server.controller;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.InstalledAppDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.PatchComplianceDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.SystemInfoDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.Client;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.Devices;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.InstalledSoftware;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.*;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.ClientRepository;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.InstalledSoftwareRepository;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.InstalledPatchRepository;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.WindowsUpdateHistoryRepository;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.PatchComplianceSummaryRepository;
import com.psg.dlsysinfo.dl_sysinfo_server.security.EncryptionService;
import com.psg.dlsysinfo.dl_sysinfo_server.security.JwtUtil;
import com.psg.dlsysinfo.dl_sysinfo_server.security.TokenResolver;
@@ -42,6 +44,9 @@ public class SystemInfoController {
@Autowired private ClientRepository clientRepository;
@Autowired private InstalledSoftwareRepository installedSoftwareRepository;
@Autowired private InstalledPatchRepository installedPatchRepository;
@Autowired private WindowsUpdateHistoryRepository windowsUpdateHistoryRepository;
@Autowired private PatchComplianceSummaryRepository patchComplianceSummaryRepository;
@Autowired private DeviceService deviceService;
@Autowired private ObjectMapper objectMapper;
@@ -67,18 +72,25 @@ public class SystemInfoController {
String decryptedData = encryptionService.decryptData(encryptedData);
System.out.println("🔓 Decrypted payload size: " + decryptedData.length());
System.out.println("🔓 Decrypted payload content: " + decryptedData);
SystemInfoDTO dto;
try {
dto = objectMapper.readValue(decryptedData, SystemInfoDTO.class);
System.out.println("✅ Successfully deserialized SystemInfoDTO");
} catch (Exception ex) {
System.out.println("❌ Failed to deserialize SystemInfoDTO: " + ex.getMessage());
ex.printStackTrace();
return errorResponse(response, "Failed to parse payload.", 400);
}
System.out.println("🔍 DTO clientIdentifier: " + dto.getClientIdentifier());
System.out.println("🔍 DTO hostname: " + dto.getHostname());
System.out.println("🔍 DTO drives: " + (dto.getDrives() != null ? dto.getDrives().size() : "null"));
if (dto.getClientIdentifier() == null || dto.getHostname() == null) {
System.out.println("❌ Missing required fields");
return errorResponse(response, "Missing clientIdentifier or hostname in payload.", 400);
}
@@ -132,6 +144,120 @@ public class SystemInfoController {
}
}
@PreAuthorize("isAuthenticated()")
@PostMapping("/patch-compliance")
public ResponseEntity<Map<String, String>> receivePatchCompliance(
@AuthenticationPrincipal Object currentUser,
HttpServletRequest request,
@RequestBody PatchComplianceDTO patchData) {
Map<String, String> response = new HashMap<>();
try {
String token = tokenResolver.resolveToken(request);
if (!jwtUtil.validateToken(token)) {
return errorResponse(response, "Invalid or expired token.", 403);
}
if (patchData == null) {
return errorResponse(response, "No patch data found in request body.", 400);
}
String clientIdentifier = jwtUtil.extractClientIdentifier(token);
System.out.println("🔍 Receiving patch compliance data for client: " + clientIdentifier);
Client client = clientRepository.findByClientIdentifier(clientIdentifier)
.orElseThrow(() -> new IllegalArgumentException("Client not registered"));
// Get device using hostname from the payload
String hostname = patchData.getHostname();
if (hostname == null || hostname.isEmpty()) {
return errorResponse(response, "Hostname is required in patch compliance data.", 400);
}
String hashedHostname = encryptionService.hashString(hostname);
Devices device = deviceService.findOrCreateDevice(hostname, hashedHostname, client);
System.out.println("🔍 Processing patch compliance for device: " + device.getDeviceId() + " (" + hostname + ")");
// Step 1: Delete and insert installed patches
installedPatchRepository.deleteByDevice(device);
int patchesProcessed = 0;
if (patchData.getInstalledPatches() != null) {
for (var patchDTO : patchData.getInstalledPatches()) {
if (patchDTO.getHotFixId() == null) continue;
InstalledPatch patch = new InstalledPatch();
patch.setDevice(device);
patch.setHotfixId(patchDTO.getHotFixId());
patch.setCaption(patchDTO.getCaption());
patch.setDescription(patchDTO.getDescription());
patch.setInstalledBy(patchDTO.getInstalledBy());
patch.setInstalledOn(patchDTO.getInstalledOn());
patch.setRecordedAt(LocalDateTime.now());
installedPatchRepository.save(patch);
patchesProcessed++;
}
}
// Step 2: Delete and insert update history
windowsUpdateHistoryRepository.deleteByDevice(device);
int historyProcessed = 0;
if (patchData.getRecentUpdateHistory() != null) {
for (var historyDTO : patchData.getRecentUpdateHistory()) {
WindowsUpdateHistory history = new WindowsUpdateHistory();
history.setDevice(device);
history.setUpdateDate(historyDTO.getDate());
history.setTitle(historyDTO.getTitle());
history.setUpdateId(historyDTO.getUpdateIdentity());
history.setOperation(historyDTO.getOperation());
history.setResultCode(historyDTO.getResultCode());
history.setRecordedAt(LocalDateTime.now());
windowsUpdateHistoryRepository.save(history);
historyProcessed++;
}
}
// Step 3: Update or insert summary
long successfulUpdates = patchData.getRecentUpdateHistory() != null
? patchData.getRecentUpdateHistory().stream()
.filter(h -> "Succeeded".equalsIgnoreCase(h.getResultCode()))
.count()
: 0;
PatchComplianceSummary summary = patchComplianceSummaryRepository
.findByDevice(device)
.orElseGet(() -> {
PatchComplianceSummary newSummary = new PatchComplianceSummary();
newSummary.setDevice(device);
return newSummary;
});
summary.setTotalInstalledPatches(patchesProcessed);
summary.setRecentSuccessfulUpdates((int) successfulUpdates);
summary.setLastCollectedAt(LocalDateTime.now());
summary.setRecordedAt(LocalDateTime.now());
patchComplianceSummaryRepository.save(summary);
response.put("status", "success");
response.put("message", "Patch compliance data received and stored successfully.");
response.put("patchesProcessed", String.valueOf(patchesProcessed));
response.put("historyProcessed", String.valueOf(historyProcessed));
response.put("successfulUpdates", String.valueOf(successfulUpdates));
return ResponseEntity.ok(response);
} catch (Exception e) {
System.out.println("❌ Unexpected exception in patch-compliance: " + e.getMessage());
e.printStackTrace();
return errorResponse(response, "Server error occurred: " + e.getMessage(), 500);
}
}
private ResponseEntity<Map<String, String>> errorResponse(Map<String, String> response, String message, int statusCode) {
response.put("status", "error");
response.put("message", message);

View File

@@ -0,0 +1,25 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.time.LocalDateTime;
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class ComplianceSummaryDTO {
private Long totalDevices;
private Long vulnerableDevices;
private Long totalVulnerabilities;
private Long criticalVulns;
private Long highVulns;
private Long mediumVulns;
private Long lowVulns;
private Long totalSoftware;
private Long vulnerableSoftware;
private LocalDateTime lastUpdated;
}

View File

@@ -0,0 +1,24 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
public class InstalledPatchDTO {
@JsonProperty("HotFixID")
private String hotFixId;
@JsonProperty("Caption")
private String caption;
@JsonProperty("Description")
private String description;
@JsonProperty("InstalledBy")
private String installedBy;
@JsonProperty("InstalledOn")
private String installedOn;
}

View File

@@ -0,0 +1,14 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.List;
@Data
@NoArgsConstructor
public class PatchComplianceDTO {
private String hostname;
private List<InstalledPatchDTO> installedPatches;
private List<UpdateHistoryDTO> recentUpdateHistory;
}

View File

@@ -1,10 +1,12 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import java.util.ArrayList;
import java.util.List;
@JsonIgnoreProperties(ignoreUnknown = true)
public class SystemInfoDTO {
@JsonProperty("clientIdentifier")

View File

@@ -0,0 +1,18 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class TopVulnerabilityDTO {
private String cveId;
private String title;
private String severity;
private Double score;
private Long affectedDevices;
}

View File

@@ -0,0 +1,14 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@NoArgsConstructor
public class UpdateHistoryDTO {
private String date;
private String title;
private String updateIdentity;
private String operation;
private String resultCode;
}
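Taken together, the three DTOs above define the JSON shape that `POST /api/patch-compliance` accepts — note the PascalCase `@JsonProperty` names on patch entries versus default camelCase everywhere else. A sketch with hypothetical values:

```javascript
// Example payload matching PatchComplianceDTO: patch entries use the
// PascalCase names from @JsonProperty; history entries use camelCase.
const payload = {
  hostname: 'WS-EXAMPLE-01',
  installedPatches: [
    {
      HotFixID: 'KB5034441',
      Caption: 'http://support.microsoft.com/?kbid=5034441',
      Description: 'Security Update',
      InstalledBy: 'NT AUTHORITY\\SYSTEM',
      InstalledOn: '1/10/2024',
    },
  ],
  recentUpdateHistory: [
    {
      date: '2024-01-10T03:00:00Z',
      title: '2024-01 Cumulative Update',
      updateIdentity: '00000000-0000-0000-0000-000000000000',
      operation: 'Installation',
      resultCode: 'Succeeded', // counted toward recentSuccessfulUpdates
    },
  ],
};
```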

View File

@@ -1,5 +1,8 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
@JsonIgnoreProperties(ignoreUnknown = true)
public class UserDTO {
private Long id;
private String username;
@@ -9,6 +12,7 @@ public class UserDTO {
private String email;
private String role;
private String clientIdentifier;
private Long clientId; // Accept clientId from frontend
private String clientName;
private boolean enabled;
@@ -37,6 +41,7 @@ public class UserDTO {
public String getEmail() { return email; }
public String getRole() { return role; }
public String getClientIdentifier() { return clientIdentifier; }
public Long getClientId() { return clientId; }
public boolean isEnabled() { return enabled; }
public String getClientName() { return clientName; }
@@ -49,6 +54,7 @@ public class UserDTO {
public void setEmail(String email) { this.email = email; }
public void setRole(String role) { this.role = role; }
public void setClientIdentifier(String clientIdentifier) { this.clientIdentifier = clientIdentifier; }
public void setClientId(Long clientId) { this.clientId = clientId; }
public void setEnabled(boolean enabled) { this.enabled = enabled; }
public void setClientName(String clientName) { this.clientName = clientName; }
}

View File

@@ -0,0 +1,17 @@
package com.psg.dlsysinfo.dl_sysinfo_server.dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class VulnerableSoftwareDTO {
private String softwareName;
private Long totalInstances;
private Long vulnerableInstances;
private Long totalCves;
}

View File

@@ -77,7 +77,7 @@ public class Devices {
@OneToMany(mappedBy = "device", cascade = CascadeType.ALL, orphanRemoval = true)
@JsonManagedReference
- private List<Drive> drives;
+ private List<Drive> drives = new ArrayList<>();
@OneToMany(mappedBy = "device", cascade = CascadeType.ALL, orphanRemoval = true)
@JsonManagedReference

View File

@@ -0,0 +1,47 @@
package com.psg.dlsysinfo.dl_sysinfo_server.entity;
import jakarta.persistence.*;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.time.LocalDateTime;
@Entity
@Table(name = "installed_patches",
uniqueConstraints = @UniqueConstraint(name = "unique_device_patch", columnNames = {"device_id", "hotfix_id"}),
indexes = {
@Index(name = "idx_device_hotfix", columnList = "device_id, hotfix_id"),
@Index(name = "idx_installed_on", columnList = "installed_on")
})
@Getter
@Setter
@NoArgsConstructor
public class InstalledPatch {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@ManyToOne
@JoinColumn(name = "device_id", nullable = false)
private Devices device;
@Column(name = "hotfix_id", nullable = false, length = 50)
private String hotfixId;
@Column(name = "caption", length = 500)
private String caption;
@Column(name = "description", length = 255)
private String description;
@Column(name = "installed_by", length = 255)
private String installedBy;
@Column(name = "installed_on", length = 50)
private String installedOn;
@Column(name = "recorded_at", nullable = false)
private LocalDateTime recordedAt = LocalDateTime.now();
}

View File

@@ -0,0 +1,40 @@
package com.psg.dlsysinfo.dl_sysinfo_server.entity;
import jakarta.persistence.*;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.time.LocalDateTime;
@Entity
@Table(name = "patch_compliance_summary",
uniqueConstraints = @UniqueConstraint(name = "unique_device_summary", columnNames = {"device_id"}),
indexes = {
@Index(name = "idx_last_collected", columnList = "last_collected_at")
})
@Getter
@Setter
@NoArgsConstructor
public class PatchComplianceSummary {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@OneToOne
@JoinColumn(name = "device_id", nullable = false)
private Devices device;
@Column(name = "total_installed_patches", nullable = false)
private Integer totalInstalledPatches = 0;
@Column(name = "recent_successful_updates", nullable = false)
private Integer recentSuccessfulUpdates = 0;
@Column(name = "last_collected_at", nullable = false)
private LocalDateTime lastCollectedAt;
@Column(name = "recorded_at", nullable = false)
private LocalDateTime recordedAt = LocalDateTime.now();
}

View File

@@ -0,0 +1,36 @@
package com.psg.dlsysinfo.dl_sysinfo_server.entity;
import jakarta.persistence.*;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.time.LocalDateTime;
@Entity
@Table(name = "windows_updates")
@Getter
@Setter
@NoArgsConstructor
public class WindowsUpdate {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@ManyToOne
@JoinColumn(name = "device_id", nullable = false)
private Devices device;
@Column(name = "update_date")
private String date;
@Column(name = "title", length = 1000)
private String title;
@Column(name = "update_id")
private String updateId;
@Column(name = "recorded_at", nullable = false)
private LocalDateTime recordedAt = LocalDateTime.now();
}

View File

@@ -0,0 +1,46 @@
package com.psg.dlsysinfo.dl_sysinfo_server.entity;
import jakarta.persistence.*;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.time.LocalDateTime;
@Entity
@Table(name = "windows_update_history",
indexes = {
@Index(name = "idx_device_date", columnList = "device_id, update_date"),
@Index(name = "idx_result_code", columnList = "result_code")
})
@Getter
@Setter
@NoArgsConstructor
public class WindowsUpdateHistory {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
@ManyToOne
@JoinColumn(name = "device_id", nullable = false)
private Devices device;
@Column(name = "update_date", length = 255)
private String updateDate;
@Column(name = "title", length = 1000)
private String title;
@Column(name = "update_id", length = 255)
private String updateId;
@Column(name = "operation", length = 50)
private String operation;
@Column(name = "result_code", length = 50)
private String resultCode;
@Column(name = "recorded_at", nullable = false)
private LocalDateTime recordedAt = LocalDateTime.now();
}

View File

@@ -0,0 +1,11 @@
package com.psg.dlsysinfo.dl_sysinfo_server.repository;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.Devices;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.InstalledPatch;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface InstalledPatchRepository extends JpaRepository<InstalledPatch, Long> {
void deleteByDevice(Devices device);
}

View File

@@ -0,0 +1,13 @@
package com.psg.dlsysinfo.dl_sysinfo_server.repository;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.Devices;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.PatchComplianceSummary;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
import java.util.Optional;
@Repository
public interface PatchComplianceSummaryRepository extends JpaRepository<PatchComplianceSummary, Long> {
Optional<PatchComplianceSummary> findByDevice(Devices device);
}

View File

@@ -0,0 +1,96 @@
package com.psg.dlsysinfo.dl_sysinfo_server.repository;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.TopVulnerabilityDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.VulnerableSoftwareDTO;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;
import org.springframework.stereotype.Repository;
import java.time.LocalDateTime;
import java.util.List;
@Repository
public interface ReportingRepository extends JpaRepository<com.psg.dlsysinfo.dl_sysinfo_server.entity.Devices, Long> {
// Compliance Summary Queries
@Query("SELECT COUNT(DISTINCT d.deviceId) FROM Devices d WHERE d.client.clientId = :clientId")
Long countTotalDevices(@Param("clientId") Long clientId);
@Query("SELECT COUNT(DISTINCT cdv.deviceId) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId)")
Long countVulnerableDevices(@Param("clientId") Long clientId);
@Query("SELECT COUNT(cdv) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId)")
Long countTotalVulnerabilities(@Param("clientId") Long clientId);
@Query("SELECT COUNT(cdv) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId) " +
"AND UPPER(cdv.severity) = 'CRITICAL'")
Long countCriticalVulnerabilities(@Param("clientId") Long clientId);
@Query("SELECT COUNT(cdv) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId) " +
"AND UPPER(cdv.severity) = 'HIGH'")
Long countHighVulnerabilities(@Param("clientId") Long clientId);
@Query("SELECT COUNT(cdv) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId) " +
"AND UPPER(cdv.severity) = 'MEDIUM'")
Long countMediumVulnerabilities(@Param("clientId") Long clientId);
@Query("SELECT COUNT(cdv) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId) " +
"AND UPPER(cdv.severity) = 'LOW'")
Long countLowVulnerabilities(@Param("clientId") Long clientId);
@Query("SELECT COUNT(DISTINCT cis.softwareName) FROM CachedInstalledSoftware cis " +
"WHERE cis.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId)")
Long countTotalSoftware(@Param("clientId") Long clientId);
@Query("SELECT COUNT(DISTINCT cis.softwareName) FROM CachedInstalledSoftware cis " +
"WHERE cis.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId) " +
"AND cis.totalCves > 0")
Long countVulnerableSoftware(@Param("clientId") Long clientId);
@Query("SELECT MAX(cdv.lastUpdated) FROM CachedDeviceVuln cdv " +
"WHERE cdv.deviceId IN (SELECT d.deviceId FROM Devices d WHERE d.client.clientId = :clientId)")
LocalDateTime findLastVulnerabilityScanDate(@Param("clientId") Long clientId);
// Top Vulnerabilities Query - Using native SQL for LIMIT support
@Query(value = "SELECT cdv.cve_id as cveId, " +
"cdv.description as title, " +
"cdv.severity as severity, " +
"cdv.score as score, " +
"COUNT(DISTINCT cdv.device_id) as affectedDevices " +
"FROM cached_device_vulns cdv " +
"WHERE cdv.device_id IN (SELECT d.deviceId FROM devices d WHERE d.clientId = :clientId) " +
"GROUP BY cdv.cve_id, cdv.description, cdv.severity, cdv.score " +
"ORDER BY " +
"CASE WHEN UPPER(cdv.severity) = 'CRITICAL' THEN 0 " +
" WHEN UPPER(cdv.severity) = 'HIGH' THEN 1 " +
" WHEN UPPER(cdv.severity) = 'MEDIUM' THEN 2 " +
" WHEN UPPER(cdv.severity) = 'LOW' THEN 3 " +
" ELSE 4 END, " +
"COUNT(DISTINCT cdv.device_id) DESC " +
"LIMIT 20",
nativeQuery = true)
List<Object[]> findTopVulnerabilitiesNative(@Param("clientId") Long clientId);
// Vulnerable Software Query - Using native SQL for LIMIT support
@Query(value = "SELECT cis.software_name as softwareName, " +
"COUNT(cis.id) as totalInstances, " +
"SUM(CASE WHEN cis.total_cves > 0 THEN 1 ELSE 0 END) as vulnerableInstances, " +
"MAX(COALESCE(cis.total_cves, 0)) as totalCves " +
"FROM cached_installed_software cis " +
"WHERE cis.device_id IN (SELECT d.deviceId FROM devices d WHERE d.clientId = :clientId) " +
"GROUP BY cis.software_name " +
"ORDER BY (SUM(CASE WHEN cis.total_cves > 0 THEN 1 ELSE 0 END) * 1.0 / COUNT(cis.id) * MAX(COALESCE(cis.total_cves, 0))) DESC " +
"LIMIT 20",
nativeQuery = true)
List<Object[]> findVulnerableSoftwareNative(@Param("clientId") Long clientId);
}

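The `ORDER BY` in `findVulnerableSoftwareNative` ranks packages by (share of vulnerable installs) * (worst-case CVE count for any install). A minimal standalone sketch of that score, for sanity-checking the ordering outside SQL (the class name `RiskScoreSketch` is illustrative, not part of the codebase):

```java
// Sketch of the risk score used in findVulnerableSoftwareNative's ORDER BY:
// (vulnerableInstances / totalInstances) * maxCves, higher = riskier.
public class RiskScoreSketch {
    static double riskScore(long vulnerableInstances, long totalInstances, long maxCves) {
        if (totalInstances == 0) return 0.0; // SQL side never hits this (GROUP BY implies >= 1 row)
        return (vulnerableInstances * 1.0 / totalInstances) * maxCves;
    }

    public static void main(String[] args) {
        // e.g. 3 of 10 installs vulnerable, worst install carries 8 CVEs -> 2.4
        System.out.println(riskScore(3, 10, 8));
        // a fully-vulnerable package outranks a partially-vulnerable one with the same CVE count
        System.out.println(riskScore(10, 10, 8)); // 8.0
    }
}
```

This mirrors the `SUM(CASE ...) * 1.0 / COUNT(cis.id) * MAX(COALESCE(cis.total_cves, 0))` expression; the `* 1.0` in both places forces floating-point division so the ratio is not truncated to zero.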
View File

@@ -0,0 +1,11 @@
package com.psg.dlsysinfo.dl_sysinfo_server.repository;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.Devices;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.WindowsUpdateHistory;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface WindowsUpdateHistoryRepository extends JpaRepository<WindowsUpdateHistory, Long> {
void deleteByDevice(Devices device);
}

View File

@@ -0,0 +1,19 @@
package com.psg.dlsysinfo.dl_sysinfo_server.repository;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.Devices;
import com.psg.dlsysinfo.dl_sysinfo_server.entity.WindowsUpdate;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
import java.util.List;
import java.util.Optional;
@Repository
public interface WindowsUpdateRepository extends JpaRepository<WindowsUpdate, Long> {
List<WindowsUpdate> findByDevice(Devices device);
Optional<WindowsUpdate> findByDeviceAndUpdateId(Devices device, String updateId);
void deleteByDevice(Devices device);
}

View File

@@ -3,6 +3,7 @@ package com.psg.dlsysinfo.dl_sysinfo_server.scheduling;
import com.psg.dlsysinfo.dl_sysinfo_server.service.CveStatisticsService;
import com.psg.dlsysinfo.dl_sysinfo_server.service.EmailService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
@@ -14,7 +15,26 @@ import java.util.Map;
@Component
public class CveSyncScheduler {
private final File logFile = new File("cve-sync.log");
@Value("${spring.datasource.url}")
private String dbUrl;
@Value("${spring.datasource.username}")
private String dbUser;
@Value("${spring.datasource.password}")
private String dbPass;
@Value("${nvd.api.key}")
private String apiKey;
@Value("${nvd.max-range-days:7}")
private String nvdMaxRangeDays;
@Value("${scripts.directory:/home/sonder/ld-sysinfo-server/scripts}")
private String scriptsDirectory;
@Value("${scripts.logs.directory:/home/sonder/ld-sysinfo-server/scripts}")
private String logsDirectory;
@Autowired
private EmailService emailService;
@@ -25,24 +45,34 @@ public class CveSyncScheduler {
@Scheduled(cron = "0 0 */8 * * *") // ⏰ Every 8 hours
public void runCveSyncScript() {
- File scriptFile = new File("scripts/fetchCVE.js");
+ File scriptDir = new File(scriptsDirectory);
+ File scriptFile = new File(scriptDir, "fetchCVE_v2.js");
+ File logFile = new File(logsDirectory, "cve-sync.log");
if (!scriptFile.exists()) {
String msg = "❌ Script not found: " + scriptFile.getAbsolutePath();
- log(msg);
+ log(msg, logFile);
emailService.sendHtmlEmail("⚠️ CVE Sync Failed", wrapHtml(msg));
return;
}
try {
- ProcessBuilder pb = new ProcessBuilder("node", scriptFile.getAbsolutePath());
+ ProcessBuilder pb = new ProcessBuilder("node", "fetchCVE_v2.js");
+ pb.directory(scriptDir);
Map<String, String> env = pb.environment();
- env.put("DB_HOST", System.getenv("DB_HOST"));
- env.put("DB_USER", System.getenv("DB_USER"));
- env.put("DB_PASSWORD", System.getenv("DB_PASSWORD"));
- env.put("DB_NAME", System.getenv("DB_NAME"));
- env.put("NVD_API_KEY", System.getenv("NVD_API_KEY"));
- env.put("NVD_MAX_RANGE_DAYS", System.getenv("NVD_MAX_RANGE_DAYS"));
+ env.put("DB_HOST", extractHost(dbUrl));
+ env.put("DB_PORT", extractPort(dbUrl));
+ env.put("DB_NAME", extractDbName(dbUrl));
+ env.put("DB_USER", dbUser);
+ env.put("DB_PASSWORD", dbPass);
+ env.put("NVD_API_KEY", apiKey);
+ env.put("NVD_MAX_RANGE_DAYS", nvdMaxRangeDays);
env.put("NODE_OPTIONS", "--no-warnings --enable-source-maps");
env.put("LC_ALL", "en_US.UTF-8");
env.put("LANG", "en_US.UTF-8");
env.put("LANGUAGE", "en_US:en");
pb.redirectErrorStream(true);
Process process = pb.start();
@@ -74,12 +104,50 @@ public class CveSyncScheduler {
} catch (Exception e) {
String err = "❌ Exception during CVE sync: " + e.getMessage();
- log(err);
+ log(err, logFile);
emailService.sendHtmlEmail("❌ CVE Sync Error", wrapHtml(err));
}
}
- private void log(String message) {
+ private String extractHost(String url) {
String clean = url.replace("jdbc:mysql://", "").split("/")[0];
// Handle IPv6 addresses like [::1]:3307
if (clean.startsWith("[")) {
int closeBracket = clean.indexOf("]");
if (closeBracket > 0) {
return clean.substring(1, closeBracket);
}
}
// Handle standard host:port format
return clean.split(":")[0];
}
private String extractPort(String url) {
String clean = url.replace("jdbc:mysql://", "").split("/")[0];
// Handle IPv6: [::1]:3307
if (clean.contains("]:")) {
return clean.substring(clean.indexOf("]:") + 2);
}
// Handle standard: hostname:3307
if (clean.contains(":") && !clean.startsWith("[")) {
String[] parts = clean.split(":");
if (parts.length > 1) {
return parts[1];
}
}
return "3306"; // default MySQL port
}
private String extractDbName(String url) {
return url.substring(url.lastIndexOf("/") + 1).split("\\?")[0];
}
private void log(String message, File logFile) {
try (FileWriter fw = new FileWriter(logFile, true)) {
fw.write(LocalDateTime.now() + " " + message + "\n");
} catch (IOException ignored) {}

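The URL-parsing helpers above can be exercised outside Spring; this sketch reproduces their logic verbatim so the host/port/database extraction can be checked against the configured datasource URL and an IPv6 variant (the wrapper class `JdbcUrlParseSketch` is illustrative only):

```java
// Standalone sketch of the JDBC-URL parsing helpers added to CveSyncScheduler.
public class JdbcUrlParseSketch {
    static String extractHost(String url) {
        String clean = url.replace("jdbc:mysql://", "").split("/")[0];
        if (clean.startsWith("[")) {                 // IPv6 form, e.g. [::1]:3307
            int closeBracket = clean.indexOf("]");
            if (closeBracket > 0) return clean.substring(1, closeBracket);
        }
        return clean.split(":")[0];                  // standard host[:port]
    }

    static String extractPort(String url) {
        String clean = url.replace("jdbc:mysql://", "").split("/")[0];
        if (clean.contains("]:")) return clean.substring(clean.indexOf("]:") + 2);
        if (clean.contains(":") && !clean.startsWith("[")) {
            String[] parts = clean.split(":");
            if (parts.length > 1) return parts[1];
        }
        return "3306";                               // MySQL default port
    }

    static String extractDbName(String url) {
        // everything after the last '/', with any ?param=... query stripped
        return url.substring(url.lastIndexOf("/") + 1).split("\\?")[0];
    }

    public static void main(String[] args) {
        String url = "jdbc:mysql://db.psg.net.au:3306/db_ld-spring-backend";
        System.out.println(extractHost(url));    // db.psg.net.au
        System.out.println(extractPort(url));    // 3306
        System.out.println(extractDbName(url));  // db_ld-spring-backend
        System.out.println(extractHost("jdbc:mysql://[::1]:3307/testdb")); // ::1
        System.out.println(extractPort("jdbc:mysql://[::1]:3307/testdb")); // 3307
    }
}
```

Note the fallback to `"3306"` when the URL omits a port, which matches the commit that pins the backend to MySQL's default port.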
View File

@@ -21,7 +21,7 @@ public class CveStatisticsService {
SELECT
(SELECT COUNT(*) FROM cves),
(SELECT COUNT(*) FROM cves WHERE title IS NULL OR title = ''),
- (SELECT COUNT(*) FROM cves WHERE severity IS NULL OR severity = ''),
+ (SELECT COUNT(*) FROM cves WHERE (severity_v3 IS NULL OR severity_v3 = '') AND (severity_v2 IS NULL OR severity_v2 = '')),
(SELECT COUNT(*) FROM cves WHERE cvss_score IS NULL AND cvss_score_v2 IS NULL AND cvss_score_v3 IS NULL AND cvss_score_v4 IS NULL),
(SELECT COUNT(*) FROM cves WHERE cvss_vector IS NULL AND cvss_vector_v2 IS NULL AND cvss_vector_v3 IS NULL AND cvss_vector_v4 IS NULL),
(SELECT COUNT(*) FROM cves WHERE `references` IS NULL OR `references` = ''),

View File

@@ -293,6 +293,16 @@ public class DeviceService {
System.out.println("🔄 Reassigned device " + deviceId + " to client " + newClientId);
}
public Devices getMostRecentDeviceForClient(Client client) {
List<Devices> devices = devicesRepository.findByClient_ClientId(client.getClientId());
if (devices.isEmpty()) {
return null;
}
return devices.stream()
.max(Comparator.comparing(Devices::getLastCheckedIn))
.orElse(null);
}

View File

@@ -0,0 +1,100 @@
package com.psg.dlsysinfo.dl_sysinfo_server.service;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.ComplianceSummaryDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.TopVulnerabilityDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.dto.VulnerableSoftwareDTO;
import com.psg.dlsysinfo.dl_sysinfo_server.repository.ReportingRepository;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.time.LocalDateTime;
import java.util.List;
import java.util.stream.Collectors;
@Service
@RequiredArgsConstructor
@Slf4j
public class ReportingService {
private final ReportingRepository reportingRepository;
/**
* Get compliance summary with high-level security metrics
* Cached for 15 minutes due to high computation cost
*/
@Cacheable(value = "complianceSummary", key = "#clientId")
@Transactional(readOnly = true)
public ComplianceSummaryDTO getComplianceSummary(Long clientId) {
log.info("Generating compliance summary for client: {}", clientId);
Long totalDevices = reportingRepository.countTotalDevices(clientId);
Long vulnerableDevices = reportingRepository.countVulnerableDevices(clientId);
Long totalVulnerabilities = reportingRepository.countTotalVulnerabilities(clientId);
Long criticalVulns = reportingRepository.countCriticalVulnerabilities(clientId);
Long highVulns = reportingRepository.countHighVulnerabilities(clientId);
Long mediumVulns = reportingRepository.countMediumVulnerabilities(clientId);
Long lowVulns = reportingRepository.countLowVulnerabilities(clientId);
Long totalSoftware = reportingRepository.countTotalSoftware(clientId);
Long vulnerableSoftware = reportingRepository.countVulnerableSoftware(clientId);
LocalDateTime lastUpdated = reportingRepository.findLastVulnerabilityScanDate(clientId);
return ComplianceSummaryDTO.builder()
.totalDevices(totalDevices != null ? totalDevices : 0L)
.vulnerableDevices(vulnerableDevices != null ? vulnerableDevices : 0L)
.totalVulnerabilities(totalVulnerabilities != null ? totalVulnerabilities : 0L)
.criticalVulns(criticalVulns != null ? criticalVulns : 0L)
.highVulns(highVulns != null ? highVulns : 0L)
.mediumVulns(mediumVulns != null ? mediumVulns : 0L)
.lowVulns(lowVulns != null ? lowVulns : 0L)
.totalSoftware(totalSoftware != null ? totalSoftware : 0L)
.vulnerableSoftware(vulnerableSoftware != null ? vulnerableSoftware : 0L)
.lastUpdated(lastUpdated != null ? lastUpdated : LocalDateTime.now())
.build();
}
/**
* Get top vulnerabilities by severity and affected device count
* Cached for 30 minutes due to moderate update frequency
*/
@Cacheable(value = "topVulnerabilities", key = "#clientId")
@Transactional(readOnly = true)
public List<TopVulnerabilityDTO> getTopVulnerabilities(Long clientId) {
log.info("Fetching top vulnerabilities for client: {}", clientId);
List<Object[]> results = reportingRepository.findTopVulnerabilitiesNative(clientId);
return results.stream()
.map(row -> TopVulnerabilityDTO.builder()
.cveId((String) row[0])
.title((String) row[1])
.severity((String) row[2])
.score(row[3] != null ? ((Number) row[3]).doubleValue() : null)
.affectedDevices(row[4] != null ? ((Number) row[4]).longValue() : 0L)
.build())
.collect(Collectors.toList());
}
/**
* Get vulnerable software packages by risk score
* Cached for 1 hour due to less frequent changes
*/
@Cacheable(value = "vulnerableSoftware", key = "#clientId")
@Transactional(readOnly = true)
public List<VulnerableSoftwareDTO> getVulnerableSoftware(Long clientId) {
log.info("Fetching vulnerable software for client: {}", clientId);
List<Object[]> results = reportingRepository.findVulnerableSoftwareNative(clientId);
return results.stream()
.map(row -> VulnerableSoftwareDTO.builder()
.softwareName((String) row[0])
.totalInstances(row[1] != null ? ((Number) row[1]).longValue() : 0L)
.vulnerableInstances(row[2] != null ? ((Number) row[2]).longValue() : 0L)
.totalCves(row[3] != null ? ((Number) row[3]).longValue() : 0L)
.build())
.collect(Collectors.toList());
}
}

View File

@@ -169,4 +169,63 @@ public class UserService implements CustomUserDetailsService {
userAuthRepository.save(user);
}
public UserDTO updateUser(Long userId, UserDTO userDto) {
UserAuth user = userAuthRepository.findById(userId)
.orElseThrow(() -> new UsernameNotFoundException("User not found with id: " + userId));
try {
// Update encrypted fields
if (userDto.getDisplayName() != null) {
user.setDisplayNameHash(encryptionService.encryptData(userDto.getDisplayName()));
}
if (userDto.getFirstName() != null) {
user.setFirstNameHash(encryptionService.encryptData(userDto.getFirstName()));
}
if (userDto.getLastName() != null) {
user.setLastNameHash(encryptionService.encryptData(userDto.getLastName()));
}
if (userDto.getEmail() != null) {
user.setEmailHash(encryptionService.encryptData(userDto.getEmail()));
}
// Update role
if (userDto.getRole() != null) {
user.setRole(userDto.getRole());
}
// Update enabled state
user.setEnabled(userDto.isEnabled());
// Update client if provided (support both clientId and clientIdentifier)
if (userDto.getClientId() != null) {
Client client = clientRepository.findById(userDto.getClientId())
.orElseThrow(() -> new IllegalArgumentException("Client not found with id: " + userDto.getClientId()));
user.setClient(client);
} else if (userDto.getClientIdentifier() != null) {
Client client = clientRepository.findByClientIdentifier(userDto.getClientIdentifier())
.orElseThrow(() -> new IllegalArgumentException("Client not found with identifier: " + userDto.getClientIdentifier()));
user.setClient(client);
}
UserAuth savedUser = userAuthRepository.save(user);
// Return updated DTO
return new UserDTO(
savedUser.getId(),
savedUser.getUsername(),
encryptionService.decryptData(savedUser.getDisplayNameHash()),
encryptionService.decryptData(savedUser.getFirstNameHash()),
encryptionService.decryptData(savedUser.getLastNameHash()),
encryptionService.decryptData(savedUser.getEmailHash()),
savedUser.getRole(),
savedUser.getClient().getClientIdentifier(),
encryptionService.decryptData(savedUser.getClient().getClientNameEncrypted()),
savedUser.isEnabled()
);
} catch (Exception e) {
throw new RuntimeException("Failed to update user: " + e.getMessage(), e);
}
}
}

View File

@@ -1,7 +1,7 @@
spring.application.name=ld-sysinfo-server
# Database Configuration
- spring.datasource.url=jdbc:mysql://db.psg.net.au:3307/db_ld-spring-backend
+ spring.datasource.url=jdbc:mysql://db.psg.net.au:3306/db_ld-spring-backend
spring.datasource.username=svc_sysinfo
spring.datasource.password=2pT08pEuxqFiN6eD348vBlgoMfyfOjGB
@@ -12,6 +12,7 @@ spring.jpa.properties.hibernate.format_sql=true
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQLDialect
spring.jpa.database-platform=org.hibernate.dialect.MySQLDialect
spring.jpa.properties.hibernate.jdbc.time_zone=Australia/Perth
# Spring Security stuff
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration
@@ -40,7 +41,9 @@ server.ssl.key-store-type=PKCS12
# Script Controller (NVD) related
nvd.api.key=42b4f093-e8c4-4110-a7d1-6ab2ba6234aa
- nvd.max-range-days=30
+ nvd.max-range-days=120
scripts.directory=/home/sonder/ld-sysinfo-server/scripts
scripts.logs.directory=/home/sonder/ld-sysinfo-server/scripts
# SMTP/Mail related
spring.mail.host=psg-net-au.mail.protection.outlook.com