Methodology

Every report published on Actyra Open follows the same rigorous, reproducible methodology. This page documents our complete process so that our findings can be independently verified by other researchers.

1. Software Acquisition

All software is lawfully obtained from official distribution channels:

  • Official vendor websites (direct download links)
  • Official app stores (Microsoft Store, Apple App Store, Google Play)
  • Official CDN endpoints referenced by vendor installers

We document the exact download URL, date of acquisition, and cryptographic hash (SHA-256) of every binary analyzed. This allows any researcher to verify they are examining the same binary.

Example acquisition record:

  Source:  https://www.ccleaner.com/ccleaner/download
  Date:    2026-01-15
  SHA-256: a1b2c3d4e5f6...
  Size:    1,753,088 bytes
  File:    ccleaner_setup_online.exe
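
Where helpful, we script these steps. A minimal Python sketch of the hash computation (hashlib is standard library; the filename matches the record above):

  import hashlib

  # Hash the installer in 1 MiB chunks so large binaries
  # never need to fit in memory at once.
  def sha256_of(path):
      h = hashlib.sha256()
      with open(path, 'rb') as f:
          for chunk in iter(lambda: f.read(1024 * 1024), b''):
              h.update(chunk)
      return h.hexdigest()

  print(sha256_of('ccleaner_setup_online.exe'))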

2. Static Analysis

Static analysis examines the binary without executing it. This is the foundation of every report.

Binary Identification

PE header analysis, architecture identification (x86, x64, ARM), compiler detection, import/export table examination, section analysis, and overlay detection.

Tools: lief, pefile, Detect It Easy, CFF Explorer
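
As an illustration, a minimal pefile sketch (run against the installer from the acquisition record above) that surfaces the machine type, sections, and import table:

  import pefile

  pe = pefile.PE('ccleaner_setup_online.exe')

  # Machine type from the COFF header (0x14c = x86, 0x8664 = x64).
  print('Machine: 0x%04x' % pe.FILE_HEADER.Machine)

  # Sections: a large gap between virtual and raw size can indicate packing.
  for section in pe.sections:
      print(section.Name.rstrip(b'\x00').decode(errors='replace'),
            hex(section.Misc_VirtualSize), hex(section.SizeOfRawData))

  # Imported DLLs and resolved API names.
  for entry in getattr(pe, 'DIRECTORY_ENTRY_IMPORT', []):
      print(entry.dll.decode())
      for imp in entry.imports:
          if imp.name:
              print('   ', imp.name.decode())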

String Extraction

Comprehensive extraction of all embedded strings including URLs, API endpoints, registry paths, file paths, error messages, and format strings. Both ASCII and Unicode strings are extracted.

Tools: Python byte-level extraction, strings, FLOSS
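
A minimal sketch of the byte-level pass, assuming a five-character minimum and the two encodings Windows binaries most commonly embed (printable ASCII and UTF-16LE):

  import re
  import sys

  ASCII_RE = re.compile(rb'[\x20-\x7e]{5,}')          # printable ASCII runs
  WIDE_RE  = re.compile(rb'(?:[\x20-\x7e]\x00){5,}')  # UTF-16LE ("wide") runs

  def extract_strings(path):
      data = open(path, 'rb').read()
      narrow = [m.group().decode('ascii') for m in ASCII_RE.finditer(data)]
      wide = [m.group().decode('utf-16le') for m in WIDE_RE.finditer(data)]
      return narrow + wide

  for s in extract_strings(sys.argv[1]):
      if 'http' in s or 'SOFTWARE\\' in s:   # surface URLs and registry paths
          print(s)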

Decompilation

Full binary decompilation to recover pseudocode. For native binaries (C/C++), we use Ghidra's decompiler. For .NET assemblies, we use ILSpy or dnSpy, which produce near-original C# source code.

Tools: Ghidra (NSA), ILSpy, dnSpy
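
As a sketch, a Jython script run from Ghidra's Script Manager (currentProgram is supplied by the scripting environment) that dumps decompiled C for every recovered function:

  from ghidra.app.decompiler import DecompInterface
  from ghidra.util.task import ConsoleTaskMonitor

  decomp = DecompInterface()
  decomp.openProgram(currentProgram)

  # Walk every function the analyzer recovered and print its pseudocode.
  for func in currentProgram.getFunctionManager().getFunctions(True):
      results = decomp.decompileFunction(func, 60, ConsoleTaskMonitor())
      if results.decompileCompleted():
          print(results.getDecompiledFunction().getC())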

Control Flow Analysis

Tracing execution paths from entry points through to network calls, file operations, and registry access. This reveals the actual sequence of operations the software performs, including pre-consent activities.

Tools: Ghidra function graphs, call trees, cross-references
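
A minimal sketch of the traversal, assuming the call graph has been exported from Ghidra as an adjacency map; the function names below are hypothetical:

  from collections import deque

  # Hypothetical caller -> callees map exported from Ghidra.
  call_graph = {
      'entry':             ['init_config', 'check_for_updates'],
      'init_config':       ['RegOpenKeyExW'],
      'check_for_updates': ['build_telemetry_blob', 'WinHttpSendRequest'],
  }

  def paths_to_sink(graph, start, sink):
      """Breadth-first search returning every call chain from start to sink."""
      paths, queue = [], deque([[start]])
      while queue:
          path = queue.popleft()
          for callee in graph.get(path[-1], []):
              if callee == sink:
                  paths.append(path + [callee])
              elif callee not in path:        # avoid cycles
                  queue.append(path + [callee])
      return paths

  for chain in paths_to_sink(call_graph, 'entry', 'WinHttpSendRequest'):
      print(' -> '.join(chain))
  # entry -> check_for_updates -> WinHttpSendRequest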

3. Dynamic Analysis

Where static analysis reveals what code exists, dynamic analysis confirms what code actually executes. Dynamic analysis is performed in controlled, isolated environments.

API Hooking

Runtime interception of Windows API calls (WinHTTP, WinINet, CryptoAPI, Registry, File I/O) to observe actual network requests, cryptographic operations, and system interactions.

Tools: Frida, API Monitor
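
As a sketch (the process name target.exe is a placeholder, and the export-lookup JS API varies slightly across Frida versions), a hook that reports every WinHttpConnect call:

  import sys
  import frida

  # JavaScript injected into the target: log host/port before each connection.
  JS = """
  Interceptor.attach(Module.findExportByName('winhttp.dll', 'WinHttpConnect'), {
      onEnter(args) {
          // args[1] = server name (wide string), args[2] = port
          send({ host: args[1].readUtf16String(), port: args[2].toInt32() });
      }
  });
  """

  def on_message(message, data):
      if message['type'] == 'send':
          print('WinHttpConnect ->', message['payload'])

  session = frida.attach('target.exe')   # placeholder process name
  script = session.create_script(JS)
  script.on('message', on_message)
  script.load()
  sys.stdin.read()                       # keep the hook alive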

Network Capture

TLS-intercepting proxy to capture and decode all network traffic, including HTTPS requests. This reveals exactly what data is transmitted, to which endpoints, and when.

Tools: mitmproxy, Wireshark, Fiddler
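
For example, a small mitmproxy addon (the filename log_requests.py is arbitrary) that logs each decrypted request as it passes through the proxy:

  # Run with: mitmdump -s log_requests.py
  from mitmproxy import http

  class LogRequests:
      def request(self, flow: http.HTTPFlow) -> None:
          body = flow.request.content or b''
          print(flow.request.method, flow.request.pretty_url,
                len(body), 'bytes')

  addons = [LogRequests()]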

System Monitoring

Monitoring of file system access, registry modifications, process creation, and other system-level activities during software execution.

Tools: Process Monitor (Sysinternals), Process Explorer
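
One way to script a headless capture around a target run, as a sketch using Process Monitor's documented command-line switches:

  import subprocess
  import time

  # Start a minimized, prompt-free capture to a backing file.
  subprocess.Popen(['Procmon.exe', '/AcceptEula', '/Quiet', '/Minimized',
                    '/BackingFile', 'trace.pml'])
  time.sleep(5)                                   # give the capture time to start

  subprocess.run(['ccleaner_setup_online.exe'])   # run the target under capture

  subprocess.run(['Procmon.exe', '/Terminate'])   # flush and close trace.pml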

4. Finding Classification

Each finding is classified into one of two categories, which determines how it is handled and published:

Security Vulnerabilities

Exploitable flaws that could compromise user security. These are subject to our Responsible Disclosure Policy (90-day vendor notification before publication). Examples: hardcoded credentials, deserialization RCE, certificate validation bypass.

Classified using: CVSS 3.1 scoring, CWE identifiers, CVE submission to MITRE.

Privacy & Transparency Findings

Undisclosed data collection, policy discrepancies, and regulatory compliance concerns. These document existing, publicly observable behavior and are published as standard research without a remediation window.

Referenced against: GDPR articles, CCPA sections, ePrivacy Directive, vendor ToS/Privacy Policy.

5. Evidence Standards

Every finding must meet the following evidence requirements before publication:

  1. Code-level proof — Decompiled pseudocode or source showing the vulnerable or privacy-impacting code path. Limited to the minimum excerpt necessary.
  2. Reproducibility — The finding must be independently reproducible by another researcher with the same binary version and tools.
  3. Context — The function address, DLL/binary name, and call chain are documented so the finding can be located in the binary.
  4. Impact assessment — Clear description of what the finding enables and who is affected, avoiding hyperbole.
  5. Vendor policy comparison — For privacy findings, the observed behavior is compared against the vendor's published privacy policy and terms of service.

6. Grading Rubric

Each piece of software we analyze receives an overall grade based on five weighted categories:

Category           Weight   What We Measure
Consent            25%      Does it collect data before user consent?
Data Minimization  20%      How much data vs. what's actually needed?
Transparency       20%      Does the UI disclose what's collected?
Security           15%      Is collected data encrypted in transit?
Policy Adherence   20%      Does the binary match the vendor's policies?

Grade bands: A (90–100) | B (80–89) | C (70–79) | D (60–69) | F (0–59)
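
A minimal sketch of the computation, assuming each category is scored 0–100 (the scores below are hypothetical):

  WEIGHTS = {'Consent': 0.25, 'Data Minimization': 0.20, 'Transparency': 0.20,
             'Security': 0.15, 'Policy Adherence': 0.20}

  def overall_grade(scores):
      total = sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)
      for letter, floor in (('A', 90), ('B', 80), ('C', 70), ('D', 60)):
          if total >= floor:
              return total, letter
      return total, 'F'

  scores = {'Consent': 55, 'Data Minimization': 70, 'Transparency': 80,
            'Security': 90, 'Policy Adherence': 65}
  print(overall_grade(scores))   # ~ (70.25, 'C')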

7. Peer Review

Before publication, every report undergoes internal review:

  • Technical accuracy — Are the decompiled code excerpts correct?
  • Reproducibility — Can a second analyst reproduce the finding?
  • Proportionality — Are evidence excerpts limited to what's necessary?
  • Severity calibration — Is the CVSS score / impact assessment appropriate?
  • Legal review — Does the publication comply with our legal framework?

8. Tool Inventory

All primary analysis tools are open-source or freely available:

Tool             Purpose                                License
Ghidra           Decompilation, disassembly             Apache 2.0 (NSA)
ILSpy / dnSpy    .NET decompilation                     MIT / GPL
Frida            Dynamic instrumentation, API hooking   wxWindows
lief / pefile    PE binary analysis                     Apache 2.0 / MIT
Wireshark        Network traffic capture                GPL 2.0
Process Monitor  System activity monitoring             Sysinternals EULA
Python           Custom analysis scripts                PSF License

9. Corrections and Updates

If a software vendor or independent researcher identifies an error in our analysis, we will:

  1. Investigate the claimed error promptly
  2. Issue a correction with a clear explanation of what changed and why
  3. Update the affected report with a “Corrections” section noting the change
  4. Maintain the original finding text (struck through) alongside the correction for transparency

Report errors or corrections to hello@actyra.com.