33 Commits
v0.1.2 ... main

Author SHA1 Message Date
Omni
9000b1e080 Sync from development - prepare for v0.2.1.1 2026-01-15 18:07:49 +00:00
Omni
02f3d71a82 Sync from development - prepare for v0.2.1.1 2026-01-15 18:06:02 +00:00
Omni
29e1800074 Sync from development - prepare for v0.2.1 2026-01-12 22:15:19 +00:00
Omni
9b5310c2f9 Sync from development - prepare for v0.2.0.10 2026-01-04 22:43:32 +00:00
Omni
0d84d2f2fe Sync from development - prepare for v0.2.0.9 2025-12-31 20:56:47 +00:00
Omni
2511c9334c Sync from development - prepare for v0.2.0.8 2025-12-29 19:55:38 +00:00
Omni
5869a896a8 Sync from development - prepare for v0.2.0.7 2025-12-28 22:17:44 +00:00
Omni
99fb369d5e Sync from development - prepare for v0.2.0.6 2025-12-28 18:52:07 +00:00
Omni
a813236e51 Sync from development - prepare for v0.2.0.5 2025-12-24 21:53:12 +00:00
Omni
a7ed4b2a1e Sync from development - prepare for v0.2.0.4 2025-12-23 21:49:18 +00:00
Omni
523681a254 Sync from development - prepare for v0.2.0.3 2025-12-21 21:11:04 +00:00
Omni
abfca5268f Sync from development - prepare for v0.2.0.3 2025-12-21 21:09:38 +00:00
Omni
4de5c7f55d Sync from development - prepare for v0.2.0.2 2025-12-19 22:57:22 +00:00
Omni
9c52c0434b Sync from development - prepare for v0.2.0.1 2025-12-19 19:42:31 +00:00
Omni
e3dc62fdac Sync from development - prepare for v0.2.0 2025-12-06 20:54:48 +00:00
Omni
ce969eba1b Sync from development - prepare for v0.2.0 2025-12-06 20:09:55 +00:00
Omni
fe14e4ecfb Sync from development - prepare for v0.1.7.1 2025-11-11 20:04:32 +00:00
Omni
9680814bbb Sync from development - prepare for v0.1.7 2025-11-04 12:54:15 +00:00
Omni
91ac08afb2 Sync from development - prepare for v0.1.6.6 2025-10-29 10:28:33 +00:00
Omni
06bd94d119 Sync from development - prepare for v0.1.6.5 2025-10-28 21:18:54 +00:00
Omni
52806f4116 Sync from development - prepare for v0.1.6.4 2025-10-24 20:12:21 +01:00
Omni
956ea24465 Sync from development - prepare for v0.1.6.3 2025-10-23 23:53:18 +01:00
Omni
f039cf9c24 Sync from development - prepare for v0.1.6.2 2025-10-23 21:50:28 +01:00
Omni
d9ea1be347 Sync from development - prepare for v0.1.6.1 2025-10-21 21:11:48 +01:00
Omni
a8862475d4 Sync from development - prepare for v0.1.6.1 2025-10-21 21:07:42 +01:00
Omni
430d085287 Sync from development - prepare for v0.1.6 2025-10-16 14:44:49 +01:00
Omni
7212a58480 Sync from development - prepare for v0.1.5.3 2025-10-02 21:59:01 +01:00
Omni
80914bc76f Sync from development - prepare for v0.1.5.2 2025-10-01 22:11:14 +01:00
Omni
8661f8963e Sync from development - prepare for v0.1.5.1 2025-09-28 12:15:44 +01:00
Omni
f46ed2c0fe Sync from development - prepare for v0.1.5 2025-09-26 12:45:21 +01:00
Omni
c9bd6f60e6 Sync from development - prepare for v0.1.4 2025-09-22 20:39:58 +01:00
Omni
28cde64887 Sync from development - prepare for v0.1.2 2025-09-18 10:41:16 +01:00
Omni
64c76046ce Sync from development - prepare for v0.1.2 2025-09-18 08:44:19 +01:00
318 changed files with 46093 additions and 6040 deletions

2
.gitignore vendored
View File

@@ -35,7 +35,7 @@ Thumbs.db
docs/
testing/
# PyInstaller build files (development only)
# Build files (development only)
*.spec
hook-*.py
requirements-packaging.txt

View File

@@ -1,5 +1,534 @@
# Jackify Changelog
## v0.2.1.1 - Bug Fixes and Improvements
**Release Date:** 2026-01-15
### Critical Bug Fixes
- **AppImage Crash on Steam Deck**: Fixed `NameError: name 'Tuple' is not defined` that prevented AppImage from launching on Steam Deck. Added missing `Tuple` import to `progress_models.py`
### Bug Fixes
- **Menu Routing**: Fixed "Configure Existing Modlist (In Steam)" opening wrong section (was routing to Wabbajack Installer instead of Configure Existing screen)
- **TTW Install Dialogue**: Fixed incorrect account reference (changed "mod.db" to "ModPub" to match actual download source)
- **Duplicate Method**: Removed duplicate `_handle_missing_downloader_error` method in winetricks handler
- **Issue #142**: Removed sudo execution from modlist configuration - now auto-fixes permissions when possible, provides manual instructions only when sudo required
- **Issue #133**: Updated VDF library to 4.0 for improved Steam file format compatibility (protontricks 1.13.1+ support)
### Features
- **Wine Component Error Handling**: Enhanced error messages for missing downloaders with platform-specific installation instructions (SteamOS/Steam Deck vs other distros)
### Dependencies
- **VDF Library**: Updated from PyPI vdf 3.4 to actively maintained solsticegamestudios/vdf 4.0 (used by Gentoo)
- **Winetricks**: Removed bundled downloaders that caused segfaults on some systems - now uses system-provided downloaders (aria2c/wget/curl)
---
## v0.2.1 - Wabbajack Installer and ENB Support
**Release Date:** 2026-01-12
### Major Features
- **Automated Wabbajack Installation**: While I work on Non-Premium support, there is still a call for Wabbajack via Proton. The existing legacy bash script has been proving troublesome for some users, so I've added this as a new feature within Jackify. My aim is still to not need this in future, once Jackify can cover Non-Premium accounts.
- **ENB Detection and Configuration**: Automatic detection and configuration of `enblocal.ini` with `LinuxVersion=true` for all supported games (see the sketch after this list)
- **ENB Proton Warning**: Dedicated dialog with Proton version recommendations when ENB is detected
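A minimal sketch of the `enblocal.ini` adjustment described above, assuming the file uses standard INI `key=value` lines; the helper name is illustrative, not Jackify's actual implementation:

```python
from pathlib import Path

def enable_enb_linux_mode(enblocal_path: Path) -> bool:
    """Flip LinuxVersion to true in enblocal.ini (illustrative helper; section layout not assumed)."""
    if not enblocal_path.exists():
        return False
    lines = enblocal_path.read_text(encoding="utf-8", errors="replace").splitlines()
    changed = False
    for i, line in enumerate(lines):
        # Match a 'LinuxVersion=...' key in whichever INI section it sits
        if line.strip().lower().startswith("linuxversion"):
            lines[i] = "LinuxVersion=true"
            changed = True
    if changed:
        enblocal_path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return changed
```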
### Critical Bug Fixes
- **OAuth Token Stale State**: Re-check authentication before engine launch to prevent stale token errors after revocation
- **FNV SD Card Registry**: Fixed launcher not recognizing game on SD cards (uses `D:` drive for SD, `Z:` for internal; see the sketch after this list)
- **CLI FILE_PROGRESS Spam**: Filter verbose output to preserve single-line progress updates
- **Steam Double Restart**: Removed legacy code causing double restart during configuration
- **TTW Installer lz4**: Fixed bundled lz4 detection by setting correct working directory
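A minimal sketch of the drive-letter choice behind the FNV SD card fix above, assuming the Wine prefix maps the SD card mount to `D:` and everything else to `Z:`; the helper name and the default mount path are illustrative assumptions:

```python
def to_wine_path(linux_path: str, sd_mount_root: str = "/run/media/deck/SDCARD") -> str:
    """Map a Linux path to a Wine-style path: D: for the SD card mount, Z: for internal storage (illustrative)."""
    if linux_path.startswith(sd_mount_root):
        # D: is assumed to be mapped to the SD card mount root, so keep only the remainder
        relative = linux_path[len(sd_mount_root):]
        return "D:" + (relative.replace("/", "\\") or "\\")
    # Z: conventionally maps to the Linux filesystem root in Wine prefixes
    return "Z:" + linux_path.replace("/", "\\")
```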
### Improvements
- **Winetricks Bundling**: Bundled critical dependencies (wget, sha256sum, unzip, 7z) for improved reliability
- **UI/UX**: Removed per-file download speeds to match Wabbajack upstream
- **Code Cleanup**: Removed PyInstaller references, use AppImage detection only
- **Wabbajack Installer UI**: Removed unused Process Monitor tab, improved Activity window with detailed step information
- **Steam AppID Overflow Fix**: Changed AppID handling to string type to prevent overflow errors with large Steam AppIDs
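To illustrate the AppID overflow item above: non-Steam shortcut AppIDs typically sit above the signed 32-bit range, so keeping them as strings avoids overflow when they pass through VDF writers. The CRC32-based generation shown here is the commonly used community scheme and is an assumption, not Jackify's exact code:

```python
import zlib

def shortcut_appid_str(exe: str, app_name: str) -> str:
    """Return a non-Steam shortcut AppID as a string to sidestep signed 32-bit overflow (CRC32 scheme assumed)."""
    appid = zlib.crc32((exe + app_name).encode("utf-8")) | 0x80000000  # always >= 2**31
    return str(appid)  # as a signed 32-bit int this value would overflow; as a string it round-trips safely
```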
---
## v0.2.0.10 - Registry & Hashing Fixes
**Release Date:** 2026-01-04
### Engine Updates
- **jackify-engine 0.4.5**: Fixed archive extraction with backslashes (including pattern matching), data directory path configuration, and removed post-download .wabbajack hash validation. Engine now auto-refreshes OAuth tokens during long installations via `NEXUS_OAUTH_INFO` environment variable.
### Critical Bug Fixes
- **InstallationThread Crash**: Fixed crash during installation with error "'InstallationThread' object has no attribute 'auth_service'". Premium detection diagnostics code assumed auth_service existed but it was never passed to the thread. Affects all users when Premium detection (including false positives) is triggered.
- **Install Start Hang**: Fixed missing `oauth_info` parameter that prevented modlist installs from starting (hung at "Starting modlist installation...")
- **OAuth Token Auto-Refresh**: Fixed OAuth tokens expiring during long modlist installations. Jackify now refreshes tokens with a 15-minute buffer before passing them to the engine. The engine receives full OAuth state via the `NEXUS_OAUTH_INFO` environment variable, enabling automatic token refresh during multi-hour downloads. Fixes "Token has expired" errors that occurred 60 minutes into installations (see the sketch after this list)
- **ShowDotFiles Registry Format**: Fixed Wine registry format bug causing hidden files to remain hidden in prefixes. Python string escaping issue wrote single backslash instead of double backslash in `[Software\\Wine]` section header. Added auto-detection and fix for broken format from curated registry files.
- **Dotnet4 Registry Fixes**: Confirmed universal dotnet4.x registry fixes (`*mscoree=native` and `OnlyUseLatestCLR=1`) are applied in all three workflows (Install, Configure New, Configure Existing) across both CLI and GUI interfaces
- **Proton Path Configuration**: Fixed `proton_path` writing invalid "auto" string to config.json - now uses `null` instead, preventing jackify-engine from receiving invalid paths
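A minimal sketch of the 15-minute refresh buffer described in the OAuth item above, assuming the stored token exposes an expiry timestamp; the names are illustrative:

```python
import time
from typing import Optional

REFRESH_BUFFER_SECONDS = 15 * 60  # refresh when less than 15 minutes of validity remain

def needs_refresh(expires_at: float, now: Optional[float] = None) -> bool:
    """True when the access token should be refreshed before it is handed to the engine (illustrative)."""
    now = time.time() if now is None else now
    return now >= (expires_at - REFRESH_BUFFER_SECONDS)
```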
### Improvements
- **Wine Binary Detection**: Enhanced detection with recursive fallback search within Proton directory when expected paths don't exist (handles different Proton version structures)
- Added Jackify version logging at workflow start
- Fixed GUI log file rotation to only run in debug mode
---
## v0.2.0.9 - Critical Configuration Fixes
**Release Date:** 2025-12-31
### Bug Fixes
- Fixed AppID conversion bug causing Configure Existing failures
- Fixed missing MessageService import crash in Configure Existing
- Fixed RecursionError in config_handler.py logger
- Fixed winetricks automatic fallback to protontricks (was silently failing)
### Improvements
- Added detailed progress indicators for configuration workflows
- Fixed progress bar completion showing 100% instead of 95%
- Removed debug logging noise from file progress widget
- Enhanced Premium detection diagnostics for Issue #111
- Flatpak protontricks now auto-granted cache access for faster subsequent installs
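The last item relies on Flatpak's permission overrides; a sketch of how such access could be granted is below. The cache directory and the protontricks Flatpak app ID are assumptions:

```python
import subprocess
from pathlib import Path

def grant_protontricks_cache_access(cache_dir: Path) -> bool:
    """Grant the Flatpak protontricks app filesystem access to a cache directory (app ID assumed)."""
    cmd = [
        "flatpak", "override", "--user",
        f"--filesystem={cache_dir}",
        "com.github.Matoking.protontricks",  # assumed Flatpak app ID for protontricks
    ]
    return subprocess.run(cmd, capture_output=True, text=True).returncode == 0
```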
---
## v0.2.0.8 - Bug Fixes and Improvements
**Release Date:** 2025-12-29
### Bug Fixes
- Fixed Configure New/Existing/TTW screens missing Activity tab and progress updates
- Fixed cancel/back buttons crashing in Configure workflows
### Improvements
- Install directory now auto-appends modlist name when selected from gallery
### Known Issues
- Mod filter temporarily disabled in gallery due to technical issue (tag and game filters still work)
---
## v0.2.0.7 - Critical Auth Fix
**Release Date:** 2025-12-28
### Critical Bug Fixes
- **OAuth Token Loss**: Fixed version comparison bug that was deleting OAuth tokens every time settings were saved (affects users on v0.2.0.4+)
- Fixed internal import paths for improved stability
---
## v0.2.0.6 - Premium Detection and Engine Update
**Release Date:** 2025-12-28
**IMPORTANT:** If you are on v0.2.0.5, automatic updates will not work. You must manually download and install v0.2.0.6.
### Engine Updates
- **jackify-engine 0.4.4**: Latest engine version with improvements
### Critical Bug Fixes
- **Auto-Update System**: Fixed broken update dialog import that prevented automatic updates
- **Premium Detection**: Fixed false Premium errors caused by overly-broad detection pattern triggering on jackify-engine 0.4.3's userinfo JSON output
- **Custom Data Directory**: Fixed AppImage always creating ~/Jackify on startup, even when user configured a custom jackify_data_dir
- **Proton Auto-Selection**: Fixed auto-selection writing invalid "auto" string to config on detection failure
### Quality Improvements
- Added pre-build import validator to prevent broken imports from reaching production
---
## v0.2.0.5 - Emergency OAuth Fix
**Release Date:** 2025-12-24
### Critical Bug Fixes
- **OAuth Authentication**: Fixed regression in v0.2.0.4 that prevented OAuth token encryption/decryption, breaking Nexus authentication for users
---
## v0.2.0.4 - Bugfixes & Improvements
**Release Date:** 2025-12-23
### Engine Updates
- **jackify-engine 0.4.3**: Fixed case sensitivity issues, archive extraction crashes, and improved error messages
### Bug Fixes
- Fixed modlist gallery metadata showing outdated versions (now always fetches fresh data)
- Fixed hardcoded ~/Jackify paths preventing custom data directory settings
- Fixed update check blocking GUI startup
- Improved Steam restart reliability (3-minute timeout, better error handling)
- Fixed Protontricks Flatpak installation on Steam Deck
### Backend Changes
- GPU texture conversion now always enabled (config setting deprecated)
### UI Improvements
- Redesigned modlist detail view to show more of hero image
- Improved gallery loading with animated feedback and faster initial load
---
## v0.2.0.3 - Engine Bugfix & Settings Cleanup
**Release Date:** 2025-12-21
### Engine Updates
- **jackify-engine 0.4.3**: Bugfix release
### UI Improvements
- **Settings Dialog**: Removed GPU disable toggle - GPU usage is now always enabled (the disable option was non-functional)
---
## v0.2.0.2 - Emergency Engine Bugfix
**Release Date:** 2025-12-18
### Engine Updates
- **jackify-engine 0.4.2**: Fixed OOM issue with jackify-engine 0.4.1 due to array size
---
## v0.2.0.1 - Critical Bugfix Release
**Release Date:** 2025-12-15
### Critical Bug Fixes
- **Directory Safety Validation**: Fixed data loss bug where directories with only a `downloads/` folder were incorrectly identified as valid modlist directories (see the sketch after this list)
- **Flatpak Steam Restart**: Fixed Steam restart failures on Ubuntu/PopOS by removing incompatible `-foreground` flag and increasing startup wait
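A minimal sketch of the stricter validation implied by the directory-safety fix above: a directory containing only `downloads/` should not be treated as an existing modlist install. The marker files used here are assumptions:

```python
from pathlib import Path

def looks_like_modlist_install(path: Path) -> bool:
    """Heuristic: require real modlist markers, not just a downloads/ folder (markers are assumptions)."""
    if not path.is_dir():
        return False
    entries = {p.name for p in path.iterdir()}
    if entries and entries <= {"downloads"}:
        return False  # only downloads/ present - previously misidentified as a valid install
    markers = ("ModOrganizer.exe", "ModOrganizer.ini", "profiles")
    return any((path / marker).exists() for marker in markers)
```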
### Bug Fixes
- **External Links**: Fixed Ko-fi, GitHub, and Nexus links not opening on some distros using xdg-open with clean environment
- **TTW Console Output**: Filtered standalone "OK"/"DONE" noise messages from TTW installation console
- **Activity Window**: Fixed progress display updates in TTW Installer and other workflows
- **Wine Component Installation**: Added status feedback during component installation showing component list
- **Progress Parser**: Added defensive checks to prevent segfaults from malformed engine output
- **Progress Parser Speed Info**: Fixed 'OperationType' object has no attribute 'lower' error by converting enum to string value when extracting speed info from timestamp status patterns
### Improvements
- **Default Wine Components**: Added dxvk to default component list for better graphics compatibility
- **TTW Installer UI**: Show version numbers in status displays
### Engine Updates
- **jackify-engine 0.4.1**: Download reliability fixes, BSA case sensitivity handling, external drive I/O limiting, GPU detection caching, and texture processing performance improvements
---
## v0.2.0 - Modlist Gallery, OAuth Authentication & Performance Improvements
**Release Date:** 2025-12-06
### Major Features
#### Modlist Selection Gallery
Complete overhaul of modlist selection (First pass):
**Core Features:**
- Card-based Modlist Selection browser with modlist images, titles, authors and metadata
- Game-specific filtering automatically applied based on selected game type
- Details per card: download/install/total sizes, tags, version, badges
- Async image loading from GitHub with local 7-day caching
- Detail view with full descriptions, banner images, and external links
- Selected modlist automatically populates Install Modlist workflow
**Search and Filtering:**
- Text search across modlist names and descriptions
- Multi-select tag filtering with normalized tags
- Show Official Only, Show NSFW, Hide Unavailable toggles
- Mod search capability - find modlists containing specific Nexus mods
- Randomised card ordering
**Performance:**
- Gallery images loading from cache
- Background metadata and image preloading when Install Modlist screen opens
- Efficient rendering - cards created once, filters toggle visibility
- Non-blocking UI with concurrent image downloads
**Steam Deck Optimized:**
- Dynamic card sizing (e.g. 250x270 on Steam Deck, larger on desktop)
- Responsive grid layout (up to 4 columns on large screens, 3 on Steam Deck)
- Optimized spacing and padding for 1280x800 displays
#### OAuth 2.0 Authentication
Modern authentication for Nexus Mods with secure token management:
- One-click browser-based authorization with PKCE security
- Automatic token refresh with encrypted storage
- Authorisation status indicator on Install Modlist screen
- Works in both GUI and CLI workflows
#### Compact Mode UI Redesign
Streamlined interface with dynamic window management:
- Default compact mode with optional Details view
- Activity window tab (default) across all workflow screens
- Process Monitor tab still available
- Show Details toggle for console output when needed
### Critical Fixes
#### Replaced TTW Installer
- Replaced the previous TTW Installer due to complexities with its config file
#### GPU Texture Conversion (jackify-engine 0.4.0)
- Fixed GPU not being used for BC7/BC6H texture conversions
- Previous versions fell back to CPU-only despite GPU availability
- Added GPU toggle in Settings (enabled by default)
#### Winetricks Compatibility & Protontricks
- Fixed bundled winetricks path incompatibility
- Hopefully fixed winetricks in cases where it failed to download components
- For now, Jackify still defaults to bundled winetricks (Protontricks toggle in settings)
#### Steam Restart Reliability
- Enhanced Steam restart so that it now hopefully works more reliably on all distros
- Fixed Flatpak detection blocking normal Steam start methods
### Technical Improvements
- Proton version usage clarified: Install Proton for installation/texture processing, Game Proton for shortcuts
- Centralised Steam detection in SystemInfo
- ConfigHandler refactored to always read fresh from disk
- Removed obsolete dotnet4.x code
- Enhanced Flatpak Steam compatdata detection with proper VDF parsing
### Bug Fixes
- TTW installation UI performance (batched output processing, non-blocking operations)
- Activity window animations (removed custom timers, Qt native rendering)
- Timer reset when returning from TTW screen
- Fixed bandwidth limit KB/s to bytes conversion
- Fixed AttributeError in AutomatedPrefixService.restart_steam()
### Engine Updates
- jackify-engine 0.4.0 with GPU texture conversion fixes and refactored file progress reporting
---
## v0.1.7.1 - Wine Component Verification & Flatpak Steam Fixes
**Release Date:** November 11, 2025
### Critical Bug Fixes
- **FIXED: Wine Component Installation Verification** - Jackify now verifies components are actually installed before reporting success
### Bug Fixes
- **Steam Deck SD Card Paths**: Fixed ModOrganizer.ini path corruption on SD card installs using regex-based stripping
- **Flatpak Steam Detection**: Fixed libraryfolders.vdf path detection for Flatpak Steam installations
- **Flatpak Steam Restart**: Steam restart service now properly detects and controls Flatpak Steam
- **Path Manipulation**: Fixed path corruption in Configure Existing/New Modlist (paths with spaces)
### Improvements
- Added network diagnostics before winetricks fallback to protontricks
- Enhanced component installation logging with verification status
- Added GE-Proton 10-14 recommendation to success message (ENB compatibility note for Valve's Proton 10)
### Engine Updates
- **jackify-engine 0.3.18**: Archive extraction fixes for Windows symlinks, bandwidth limiting fix, improved error messages
---
## v0.1.7 - TTW Automation & Bug Fixes
**Release Date:** November 1, 2025
### Major Features
- **TTW (Tale of Two Wastelands) Installation and Automation**
- TTW installation function using the Hoolamike application - https://github.com/Niedzwiedzw/hoolamike
- Automated workflow for TTW installation and integration into FNV modlists, where possible
- Automatic detection of TTW-compatible modlists
- User prompt after modlist installation with option to install TTW
- Automated integration: file copying, load order updates, modlist.txt updates
- Available in both CLI and GUI workflows
### Bug Fixes
- **Registry UTF-8 Decode Error**: Fixed crash during dotnet4.x installation when Wine outputs binary data
- **Python 3.10 Compatibility**: Fixed startup crash on Python 3.10 systems
- **TTW Steam Deck Layout**: Fixed window sizing issues on Steam Deck when entering/exiting TTW screen
- **TTW Integration Status**: Added visible status banner updates during modlist integration for collapsed mode
- **TTW Accidental Input Protection**: Added 3-second countdown to TTW installation prompt to prevent accidental dismissal
- **Settings Persistence**: Settings changes now persist correctly across workflows
- **Steam Deck Keyboard Input**: Fixed keyboard input failure on Steam Deck
- **Application Close Crash**: Fixed crash when closing application on Steam Deck
- **Winetricks Diagnostics**: Enhanced error detection with automatic fallback
---
## v0.1.6.6 - AppImage Bundling Fix
**Release Date:** October 29, 2025
### Bug Fixes
- **Fixed AppImage bundling issue** causing legacy code to be retained in rare circumstances
---
## v0.1.6.5 - Steam Deck SD Card Path Fix
**Release Date:** October 27, 2025
### Bug Fixes
- **Fixed Steam Deck SD card path manipulation** performed after jackify-engine installation
- **Fixed Ubuntu Qt platform plugin errors** by bundling XCB libraries
- **Added Flatpak GE-Proton detection** and protontricks installation choices
- **Extended Steam Deck SD card timeouts** for slower I/O operations
---
## v0.1.6.4 - Flatpak Steam Detection Hotfix
**Release Date:** October 24, 2025
### Critical Bug Fixes
- **FIXED: Flatpak Steam Detection**: Added support for `/data/Steam/` directory structure used by some Flatpak Steam installations
- **IMPROVED: Steam Path Detection**: Now checks all known Flatpak Steam directory structures for maximum compatibility
---
## v0.1.6.3 - Emergency Hotfix
**Release Date:** October 23, 2025
### Critical Bug Fixes
- **FIXED: Proton Detection for Custom Steam Libraries**: Now properly reads all Steam libraries from libraryfolders.vdf
- **IMPROVED: Registry Wine Binary Detection**: Uses user's configured Proton for better compatibility
- **IMPROVED: Error Handling**: Registry fixes now provide clear warnings if they fail instead of breaking entire workflow
---
## v0.1.6.2 - Minor Bug Fixes
**Release Date:** October 23, 2025
### Bug Fixes
- **Improved dotnet4.x Compatibility**: Universal registry fixes for better modlist compatibility
- **Fixed Proton 9 Override**: A bug meant that modlists with spaces in the name weren't being overridden correctly
- **Removed PageFileManager Plugin**: Eliminates Linux PageFile warnings
---
## v0.1.6.1 - Fix dotnet40 install and expand Game Proton override
**Release Date:** October 21, 2025
### Bug Fixes
- **Fixed dotnet40 Installation Failures**: Resolved widespread .NET Framework installation issues affecting multiple modlists
- **Added Lost Legacy Proton 9 Override**: Automatic ENB compatibility for Lost Legacy modlist
- **Fixed Symlinked Downloads**: Automatically handles symlinked download directories to avoid Wine compatibility issues
---
## v0.1.6 - Lorerim Proton Support
**Release Date:** October 16, 2025
### New Features
- **Lorerim Proton Override**: Automatically selects Proton 9 for Lorerim installations (GE-Proton9-27 preferred)
- **Engine Update**: jackify-engine v0.3.17
---
## v0.1.5.3 - Critical Bug Fixes
**Release Date:** October 2, 2025
### Critical Bug Fixes
- **Fixed Multi-User Steam Detection**: Properly reads loginusers.vdf and converts SteamID64 to SteamID3 for accurate user identification (see the sketch after this list)
- **Fixed dotnet40 Installation Failures**: Hybrid approach uses protontricks for dotnet40 (reliable), winetricks for other components (fast)
- **Fixed dotnet8 Installation**: Now properly handled by winetricks instead of unimplemented pass statement
- **Fixed D: Drive Detection**: SD card detection now only applies to Steam Deck systems, not regular Linux systems
- **Fixed SD Card Mount Patterns**: Replaced hardcoded mmcblk0p1 references with dynamic path detection
- **Fixed Debug Restart UX**: Replaced PyInstaller detection with AppImage detection for proper restart behavior
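The SteamID conversion mentioned in the multi-user fix above follows the standard fixed offset between the two formats; a minimal sketch:

```python
STEAMID64_BASE = 76561197960265728  # fixed offset between SteamID64 and the 32-bit account ID

def steamid64_to_steamid3(steamid64: int) -> str:
    """Convert a SteamID64 from loginusers.vdf into SteamID3 form; the account ID also names Steam/userdata/<id>/."""
    account_id = steamid64 - STEAMID64_BASE
    return f"[U:1:{account_id}]"
```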
---
## v0.1.5.2 - Proton Configuration & Engine Updates
**Release Date:** September 30, 2025
### Critical Bug Fixes
- **Fixed Proton Version Selection**: Wine component installation now properly honors user-selected Proton version from Settings dialog
- Previously, changing from GE-Proton to Proton Experimental in settings would still use the old version for component installation
- Fixed ConfigHandler to reload fresh configuration from disk instead of using stale cache
- Updated all Proton path retrieval across codebase to use fresh-reading methods
### Engine Updates
- **jackify-engine v0.3.16**: Updated to latest engine version with important reliability improvements
- **Sanity Check Fallback**: Added Proton 7z.exe fallback for case sensitivity extraction failures
- **Enhanced Error Messages**: Improved texconv/texdiag error messages to include original texture file names and conversion parameters
### Technical Improvements
- Enhanced configuration system reliability for multi-instance scenarios
- Improved error diagnostics for texture processing operations
- Fixed Qt platform plugin discovery in the AppImage distribution for improved compatibility
---
## v0.1.5.1 - Bug Fixes
**Release Date:** September 28, 2025
### Bug Fixes
- Fixed Steam user detection in multi-user environments
- Fixed controls not re-enabling after workflow errors
- Fixed screen state persistence between workflows
---
## v0.1.5 - Winetricks Integration & Enhanced Compatibility
**Release Date:** September 26, 2025
### Major Improvements
- **Winetricks Integration**: Replaced protontricks with bundled winetricks for faster, more reliable wine component installation
- **Enhanced SD Card Detection**: Dynamic detection of SD card mount points supports both `/run/media/mmcblk0p1` and `/run/media/deck/UUID` patterns (see the sketch after this list)
- **Smart Proton Detection**: Comprehensive GE-Proton support with detection in both steamapps/common and compatibilitytools.d directories
- **Steam Deck SD Card Support**: Fixed path handling for SD card installations on Steam Deck
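A minimal sketch of the dynamic SD card detection described above, assuming mounts appear under `/run/media`; the helper is illustrative:

```python
from pathlib import Path
from typing import Optional

def find_sd_card_mount() -> Optional[Path]:
    """Return the first plausible SD card mount point, covering legacy and UUID-style layouts (illustrative)."""
    candidates = [Path("/run/media/mmcblk0p1")]          # older SteamOS mount layout
    deck_root = Path("/run/media/deck")
    if deck_root.is_dir():
        candidates.extend(sorted(deck_root.iterdir()))   # /run/media/deck/<UUID> layout
    for candidate in candidates:
        if candidate.is_dir():
            return candidate
    return None
```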
### User Experience
- **No Focus Stealing**: Wine component installation runs in background without disrupting user workflow
- **Popup Suppression**: Eliminated wine GUI popups while maintaining functionality
- **GUI Navigation**: Fixed navigation issues after Tuxborn workflow removal
### Bug Fixes
- **CLI Configure Existing**: Fixed AppID detection with signed-to-unsigned conversion, removing protontricks dependency
- **GE-Proton Validation**: Fixed validation to support both Valve Proton and GE-Proton directory structures
- **Resolution Override**: Eliminated hardcoded 2560x1600 fallbacks that overrode user Steam Deck settings
- **VDF Case-Sensitivity**: Added case-insensitive parsing for Steam shortcuts fields
- **Cabextract Bundling**: Bundled cabextract binary to resolve winetricks dependency issues
- **ModOrganizer.ini Path Format**: Fixed missing backslash in gamePath format for proper Windows path structure
- **SD Card Binary Paths**: Corrected binary paths to use D: drive mapping instead of raw Linux paths for SD card installs
- **Proton Fallback Logic**: Enhanced fallback when user-selected Proton version is missing or invalid
- **Settings Persistence**: Improved configuration saving with verification and logging
- **System Wine Elimination**: Comprehensive audit ensures Jackify never uses system wine installations
- **Winetricks Reliability**: Fixed vcrun2022 installation failures and wine app crashes
- **Enderal Registry Injection**: Switched from launch options to registry injection approach
- **Proton Path Detection**: Uses actual Steam libraries from libraryfolders.vdf instead of hardcoded paths
### Technical Improvements
- **Self-contained Cache**: Relocated winetricks cache to jackify_data_dir for better isolation
---
## v0.1.4 - GE-Proton Support and Performance Optimization
**Release Date:** September 22, 2025
### New Features
- **GE-Proton Detection**: Automatic detection and prioritization of GE-Proton versions
- **User-selectable Proton version**: Settings dialog displays all available Proton versions with type indicators
### Engine Updates
- **jackify-engine v0.3.15**: Reads Proton configuration from config.json, adds degree symbol handling for special characters, removes Wine fallback (Proton now required)
### Technical Improvements
- **Smart Priority**: GE-Proton 10+ → Proton Experimental → Proton 10 → Proton 9 (see the sketch after this list)
- **Auto-Configuration**: Fresh installations automatically select optimal Proton version
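A minimal sketch of the priority order above, assuming detected Proton versions are plain directory names as found in `steamapps/common` or `compatibilitytools.d`; the function is illustrative:

```python
import re
from typing import List, Optional

def pick_proton(installed: List[str]) -> Optional[str]:
    """Pick a Proton by priority: GE-Proton 10+ > Proton Experimental > Proton 10 > Proton 9 (illustrative)."""
    def ge_major(name: str) -> int:
        m = re.match(r"GE-Proton(\d+)", name)
        return int(m.group(1)) if m else -1

    ge_candidates = [n for n in installed if ge_major(n) >= 10]
    if ge_candidates:
        return max(ge_candidates, key=ge_major)  # newest GE-Proton major version wins
    for preferred_prefix in ("Proton - Experimental", "Proton 10", "Proton 9"):
        for name in installed:
            if name.startswith(preferred_prefix):  # directory names as seen in steamapps/common (assumed)
                return name
    return None
```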
### Bug Fixes
- **Steam VDF Compatibility**: Fixed case-sensitivity issues with Steam shortcuts.vdf parsing for Configure Existing Modlist workflows
---
## v0.1.3 - Enhanced Proton Support and System Compatibility
**Release Date:** September 21, 2025
### New Features
- **Enhanced Proton Detection**: Automatic fallback system with priority: Experimental → Proton 10 → Proton 9
- **Guided Proton Installation**: Professional auto-install dialog with Steam protocol integration for missing Proton versions
- **Enderal Game Support**: Added Enderal to supported games list with special handling for Somnium modlist structure
- **Proton Version Leniency**: Accept any Proton version 9+ instead of requiring Experimental
### UX Improvements
- **Resolution System Overhaul**: Eliminated hardcoded 2560x1600 fallbacks across all screens
- **Steam Deck Detection**: Proper 1280x800 default resolution with 1920x1080 fallback for desktop
- **Leave Unchanged Logic**: Fixed resolution setting to actually preserve existing user configurations
### Technical Improvements
- **Resolution Utilities**: New `shared/resolution_utils.py` with centralized resolution management
- **Protontricks Detection**: Enhanced detection for both native and Flatpak protontricks installations
- **Real-time Monitoring**: Progress tracking for Proton installation with directory stability detection
### Bug Fixes
- **Somnium Support**: Automatic detection of `files/ModOrganizer.exe` structure in edge-case modlists
- **Steam Protocol Integration**: Reliable triggering of Proton installation via `steam://install/` URLs
- **Manual Fallback**: Clear instructions and recheck functionality when auto-install fails
---
## v0.1.2 - About Dialog and System Information
**Release Date:** September 16, 2025
@@ -241,6 +770,23 @@
- **Clean Architecture**: Removed obsolete service imports, initializations, and cleanup methods
- **Code Quality**: Eliminated "tombstone comments" and unused service references
### Deferred Features (Available in Future Release)
#### OAuth 2.0 Authentication for Nexus Mods
**Status:** Fully implemented but disabled pending Nexus Mods approval
The OAuth 2.0 authentication system has been fully developed and tested, but is temporarily disabled in v0.1.8 as we await approval from Nexus Mods for our OAuth application. The backend code remains intact and will be re-enabled immediately upon approval.
**Features (ready for deployment):**
- **Secure OAuth 2.0 + PKCE Flow**: Modern authentication to replace API key dependency
- **Encrypted Token Storage**: Tokens stored using Fernet encryption with automatic refresh
- **GUI Integration**: Clean status display on Install Modlist screen with authorize/revoke functionality
- **CLI Integration**: OAuth menu in Additional Tasks for command-line users
- **API Key Fallback**: Optional legacy API key support (configurable in Settings)
- **Unified Auth Service**: Single authentication layer supporting both OAuth and API key methods
**Current Limitation:** Awaiting Nexus approval for `jackify://oauth/callback` custom URI. Once approved, OAuth will be enabled as the primary authentication method with API key as optional fallback.
### Technical Details
- **Single Shortcut Creation Path**: All workflows now use `run_working_workflow()` → `create_shortcut_with_native_service()`
- **Service Layer Cleanup**: Removed dual codepath architecture in favor of proven automated workflows

View File

@@ -77,6 +77,9 @@ Currently, there are two main functions that Jackify will perform at this stage
- **FUSE** (required for AppImage execution)
- Pre-installed on most Linux distributions
- If AppImage fails to run, install FUSE using your distribution's package manager
- **Ubuntu/Debian only**: Qt platform plugin library
- `sudo apt install libxcb-cursor-dev`
- Required for Qt GUI to initialize properly
### Installation

View File

@@ -5,4 +5,4 @@ This package provides both CLI and GUI interfaces for managing
Wabbajack modlists natively on Linux systems.
"""
__version__ = "0.1.2"
__version__ = "0.2.1.1"

View File

@@ -23,6 +23,46 @@ from jackify.backend.handlers.config_handler import ConfigHandler
# UI Colors already imported above
def _get_user_proton_version():
"""Get user's preferred Proton version from config, with fallback to auto-detection"""
try:
from jackify.backend.handlers.config_handler import ConfigHandler
from jackify.backend.handlers.wine_utils import WineUtils
config_handler = ConfigHandler()
# Use Install Proton (not Game Proton) for installation/texture processing
# get_proton_path() returns the Install Proton path
user_proton_path = config_handler.get_proton_path()
if not user_proton_path or user_proton_path == 'auto':
# Use enhanced fallback logic with GE-Proton preference
logging.info("User selected auto-detect, using GE-Proton → Experimental → Proton precedence")
return WineUtils.select_best_proton()
else:
# User has selected a specific Proton version
# Use the exact directory name for Steam config.vdf
try:
proton_version = os.path.basename(user_proton_path)
# GE-Proton uses exact directory name, Valve Proton needs lowercase conversion
if proton_version.startswith('GE-Proton'):
# Keep GE-Proton name exactly as-is
steam_proton_name = proton_version
else:
# Convert Valve Proton names to Steam's format
steam_proton_name = proton_version.lower().replace(' - ', '_').replace(' ', '_').replace('-', '_')
if not steam_proton_name.startswith('proton'):
steam_proton_name = f"proton_{steam_proton_name}"
logging.info(f"Using user-selected Proton: {steam_proton_name}")
return steam_proton_name
except Exception as e:
logging.warning(f"Invalid user Proton path '{user_proton_path}', falling back to auto: {e}")
return WineUtils.select_best_proton()
except Exception as e:
logging.error(f"Failed to get user Proton preference, using default: {e}")
return "proton_experimental"
# Attempt to import readline for tab completion
READLINE_AVAILABLE = False
try:
@@ -52,15 +92,16 @@ def get_jackify_engine_path():
logger.debug(f"Using engine from environment variable: {env_engine_path}")
return env_engine_path
# Priority 2: PyInstaller bundle (most specific detection)
if getattr(sys, 'frozen', False) and hasattr(sys, '_MEIPASS'):
# Running in a PyInstaller bundle
# Engine is expected at <bundle_root>/jackify/engine/jackify-engine
engine_path = os.path.join(sys._MEIPASS, 'jackify', 'engine', 'jackify-engine')
# Priority 2: AppImage bundle (most specific detection)
appdir = os.environ.get('APPDIR')
if appdir:
# Running inside AppImage
# Engine is expected at <appdir>/opt/jackify/engine/jackify-engine
engine_path = os.path.join(appdir, 'opt', 'jackify', 'engine', 'jackify-engine')
if os.path.exists(engine_path):
return engine_path
# Fallback: log warning but continue to other detection methods
logger.warning(f"PyInstaller engine not found at expected path: {engine_path}")
logger.warning(f"AppImage engine not found at expected path: {engine_path}")
# Priority 3: Check if THIS process is actually running from Jackify AppImage
# (not just inheriting APPDIR from another AppImage like Cursor)
@@ -85,7 +126,6 @@ def get_jackify_engine_path():
# If all else fails, log error and return the source path anyway
logger.error(f"jackify-engine not found in any expected location. Tried:")
logger.error(f" PyInstaller: {getattr(sys, '_MEIPASS', 'N/A')}/jackify/engine/jackify-engine")
logger.error(f" AppImage: {appdir or 'N/A'}/opt/jackify/engine/jackify-engine")
logger.error(f" Source: {engine_path}")
logger.error("This will likely cause installation failures.")
@@ -118,7 +158,7 @@ class ModlistInstallCLI:
from ..models.configuration import SystemInfo
self.system_info = SystemInfo(is_steamdeck=steamdeck)
self.protontricks_handler = ProtontricksHandler(steamdeck=self.steamdeck)
self.protontricks_handler = ProtontricksHandler(self.steamdeck)
self.shortcut_handler = ShortcutHandler(steamdeck=self.steamdeck)
self.context = {}
# Use standard logging (no file handler)
@@ -443,53 +483,78 @@ class ModlistInstallCLI:
self.context['download_dir'] = download_dir_path
self.logger.debug(f"Download directory context set to: {self.context['download_dir']}")
# 5. Prompt for Nexus API key (skip if in context and valid)
# 5. Get Nexus authentication (OAuth or API key)
if 'nexus_api_key' not in self.context or not self.context.get('nexus_api_key'):
from jackify.backend.services.api_key_service import APIKeyService
api_key_service = APIKeyService()
saved_key = api_key_service.get_saved_api_key()
api_key = None
if saved_key:
print("\n" + "-" * 28)
print(f"{COLOR_INFO}A Nexus API Key is already saved.{COLOR_RESET}")
use_saved = input(f"{COLOR_PROMPT}Use the saved API key? [Y/n]: {COLOR_RESET}").strip().lower()
if use_saved in ('', 'y', 'yes'):
api_key = saved_key
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
# Get current auth status
authenticated, method, username = auth_service.get_auth_status()
if authenticated:
# Already authenticated - use existing auth
if method == 'oauth':
print("\n" + "-" * 28)
print(f"{COLOR_SUCCESS}Nexus Authentication: Authorized via OAuth{COLOR_RESET}")
if username:
print(f"{COLOR_INFO}Logged in as: {username}{COLOR_RESET}")
elif method == 'api_key':
print("\n" + "-" * 28)
print(f"{COLOR_INFO}Nexus Authentication: Using API Key (Legacy){COLOR_RESET}")
# Get valid token/key and OAuth state for engine auto-refresh
api_key, oauth_info = auth_service.get_auth_for_engine()
if api_key:
self.context['nexus_api_key'] = api_key
self.context['nexus_oauth_info'] = oauth_info # For engine auto-refresh
else:
new_key = input(f"{COLOR_PROMPT}Enter a new Nexus API Key (or press Enter to keep the saved one): {COLOR_RESET}").strip()
if new_key:
api_key = new_key
replace = input(f"{COLOR_PROMPT}Replace the saved key with this one? [y/N]: {COLOR_RESET}").strip().lower()
if replace == 'y':
if api_key_service.save_api_key(api_key):
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
else:
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
# Auth expired or invalid - prompt to set up
print(f"\n{COLOR_WARNING}Your authentication has expired or is invalid.{COLOR_RESET}")
authenticated = False
if not authenticated:
# Not authenticated - offer to set up OAuth
print("\n" + "-" * 28)
print(f"{COLOR_WARNING}Nexus Mods authentication is required for downloading mods.{COLOR_RESET}")
print(f"\n{COLOR_PROMPT}Would you like to authorize with Nexus now?{COLOR_RESET}")
print(f"{COLOR_INFO}This will open your browser for secure OAuth authorization.{COLOR_RESET}")
authorize = input(f"{COLOR_PROMPT}Authorize now? [Y/n]: {COLOR_RESET}").strip().lower()
if authorize in ('', 'y', 'yes'):
# Launch OAuth authorization
print(f"\n{COLOR_INFO}Starting OAuth authorization...{COLOR_RESET}")
print(f"{COLOR_WARNING}Your browser will open shortly.{COLOR_RESET}")
print(f"{COLOR_INFO}Note: You may see a security warning about a self-signed certificate.{COLOR_RESET}")
print(f"{COLOR_INFO}This is normal - click 'Advanced' and 'Proceed' to continue.{COLOR_RESET}")
def show_message(msg):
print(f"\n{COLOR_INFO}{msg}{COLOR_RESET}")
success = auth_service.authorize_oauth(show_browser_message_callback=show_message)
if success:
print(f"\n{COLOR_SUCCESS}OAuth authorization successful!{COLOR_RESET}")
_, _, username = auth_service.get_auth_status()
if username:
print(f"{COLOR_INFO}Authorized as: {username}{COLOR_RESET}")
api_key, oauth_info = auth_service.get_auth_for_engine()
if api_key:
self.context['nexus_api_key'] = api_key
self.context['nexus_oauth_info'] = oauth_info # For engine auto-refresh
else:
print(f"{COLOR_INFO}Using new key for this session only. Saved key unchanged.{COLOR_RESET}")
print(f"{COLOR_ERROR}Failed to retrieve auth token after authorization.{COLOR_RESET}")
return None
else:
api_key = saved_key
else:
print("\n" + "-" * 28)
print(f"{COLOR_INFO}A Nexus Mods API key is required for downloading mods.{COLOR_RESET}")
print(f"{COLOR_INFO}You can get your personal key at: {COLOR_SELECTION}https://www.nexusmods.com/users/myaccount?tab=api{COLOR_RESET}")
print(f"{COLOR_WARNING}Your API Key is NOT saved locally. It is used only for this session unless you choose to save it.{COLOR_RESET}")
api_key = input(f"{COLOR_PROMPT}Enter Nexus API Key (or 'q' to cancel): {COLOR_RESET}").strip()
if not api_key or api_key.lower() == 'q':
self.logger.info("User cancelled or provided no API key.")
return None
save = input(f"{COLOR_PROMPT}Would you like to save this API key for future use? [y/N]: {COLOR_RESET}").strip().lower()
if save == 'y':
if api_key_service.save_api_key(api_key):
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
else:
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
print(f"\n{COLOR_ERROR}OAuth authorization failed.{COLOR_RESET}")
return None
else:
print(f"{COLOR_INFO}Using API key for this session only. It will not be saved.{COLOR_RESET}")
# Set the API key in context regardless of which path was taken
self.context['nexus_api_key'] = api_key
self.logger.debug(f"NEXUS_API_KEY is set in environment for engine (presence check).")
# User declined OAuth - cancelled
print(f"\n{COLOR_INFO}Authorization required to proceed. Installation cancelled.{COLOR_RESET}")
self.logger.info("User declined Nexus authorization.")
return None
self.logger.debug(f"Nexus authentication configured for engine.")
# Display summary and confirm
self._display_summary() # Ensure this method exists or implement it
@@ -584,11 +649,23 @@ class ModlistInstallCLI:
if isinstance(download_dir_display, tuple):
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
print(f"Download Directory: {download_dir_display}")
if self.context.get('nexus_api_key'):
print(f"Nexus API Key: [SET]")
# Show authentication method
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
authenticated, method, username = auth_service.get_auth_status()
if method == 'oauth':
auth_display = f"Nexus Authentication: OAuth"
if username:
auth_display += f" ({username})"
elif method == 'api_key':
auth_display = "Nexus Authentication: API Key (Legacy)"
else:
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
# Should never reach here since we validate auth before getting to summary
auth_display = "Nexus Authentication: Unknown"
print(auth_display)
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
def configuration_phase(self):
@@ -605,7 +682,8 @@ class ModlistInstallCLI:
start_time = time.time()
# --- BEGIN: TEE LOGGING SETUP & LOG ROTATION ---
log_dir = Path.home() / "Jackify" / "logs"
from jackify.shared.paths import get_jackify_logs_dir
log_dir = get_jackify_logs_dir()
log_dir.mkdir(parents=True, exist_ok=True)
workflow_log_path = log_dir / "Modlist_Install_workflow.log"
# Log rotation: keep last 3 logs, 1MB each (adjust as needed)
@@ -661,7 +739,17 @@ class ModlistInstallCLI:
modlist_arg = self.context.get('modlist_value') or self.context.get('machineid')
machineid = self.context.get('machineid')
api_key = self.context.get('nexus_api_key')
# CRITICAL: Re-check authentication right before launching engine
# This ensures we use current auth state, not stale cached values from context
# (e.g., if user revoked OAuth after context was created)
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
current_api_key, current_oauth_info = auth_service.get_auth_for_engine()
# Use current auth state, fallback to context values only if current check failed
api_key = current_api_key or self.context.get('nexus_api_key')
oauth_info = current_oauth_info or self.context.get('nexus_oauth_info')
# Path to the engine binary
engine_path = get_jackify_engine_path()
@@ -681,7 +769,7 @@ class ModlistInstallCLI:
# --- End Patch ---
# Build command
cmd = [engine_path, 'install']
cmd = [engine_path, 'install', '--show-file-progress']
# Determine if this is a local .wabbajack file or an online modlist
modlist_value = self.context.get('modlist_value')
if modlist_value and modlist_value.endswith('.wabbajack') and os.path.isfile(modlist_value):
@@ -692,27 +780,54 @@ class ModlistInstallCLI:
cmd += ['-m', self.context['machineid']]
cmd += ['-o', install_dir_str, '-d', download_dir_str]
# Add debug flag if debug mode is enabled
from jackify.backend.handlers.config_handler import ConfigHandler
config_handler = ConfigHandler()
debug_mode = config_handler.get('debug_mode', False)
if debug_mode:
cmd.append('--debug')
self.logger.info("Adding --debug flag to jackify-engine")
# Store original environment values to restore later
original_env_values = {
'NEXUS_API_KEY': os.environ.get('NEXUS_API_KEY'),
'NEXUS_OAUTH_INFO': os.environ.get('NEXUS_OAUTH_INFO'),
'DOTNET_SYSTEM_GLOBALIZATION_INVARIANT': os.environ.get('DOTNET_SYSTEM_GLOBALIZATION_INVARIANT')
}
try:
# Temporarily modify current process's environment
if api_key:
# Prefer NEXUS_OAUTH_INFO (supports auto-refresh) over NEXUS_API_KEY (legacy)
if oauth_info:
os.environ['NEXUS_OAUTH_INFO'] = oauth_info
# CRITICAL: Set client_id so engine can refresh tokens with correct client_id
# Engine's RefreshToken method reads this to use our "jackify" client_id instead of hardcoded "wabbajack"
from jackify.backend.services.nexus_oauth_service import NexusOAuthService
os.environ['NEXUS_OAUTH_CLIENT_ID'] = NexusOAuthService.CLIENT_ID
self.logger.debug(f"Set NEXUS_OAUTH_INFO and NEXUS_OAUTH_CLIENT_ID={NexusOAuthService.CLIENT_ID} for engine (supports auto-refresh)")
# Also set NEXUS_API_KEY for backward compatibility
if api_key:
os.environ['NEXUS_API_KEY'] = api_key
elif api_key:
# No OAuth info, use API key only (no auto-refresh support)
os.environ['NEXUS_API_KEY'] = api_key
self.logger.debug(f"Temporarily set os.environ['NEXUS_API_KEY'] for engine call using session-provided key.")
elif 'NEXUS_API_KEY' in os.environ: # api_key is None/empty, but a system key might exist
self.logger.debug(f"Session API key not provided. Temporarily removing inherited NEXUS_API_KEY ('{'[REDACTED]' if os.environ.get('NEXUS_API_KEY') else 'None'}') from os.environ for engine call to ensure it is not used.")
del os.environ['NEXUS_API_KEY']
# If api_key is None and NEXUS_API_KEY was not in os.environ, it remains unset, which is correct.
self.logger.debug(f"Set NEXUS_API_KEY for engine (no auto-refresh)")
else:
# No auth available, clear any inherited values
if 'NEXUS_API_KEY' in os.environ:
del os.environ['NEXUS_API_KEY']
if 'NEXUS_OAUTH_INFO' in os.environ:
del os.environ['NEXUS_OAUTH_INFO']
if 'NEXUS_OAUTH_CLIENT_ID' in os.environ:
del os.environ['NEXUS_OAUTH_CLIENT_ID']
self.logger.debug(f"No Nexus auth available, cleared inherited env vars")
os.environ['DOTNET_SYSTEM_GLOBALIZATION_INVARIANT'] = "1"
self.logger.debug(f"Temporarily set os.environ['DOTNET_SYSTEM_GLOBALIZATION_INVARIANT'] = '1' for engine call.")
self.logger.info("Environment prepared for jackify-engine install process by modifying os.environ.")
self.logger.debug(f"NEXUS_API_KEY in os.environ (pre-call): {'[SET]' if os.environ.get('NEXUS_API_KEY') else '[NOT SET]'}")
self.logger.debug(f"NEXUS_OAUTH_INFO in os.environ (pre-call): {'[SET]' if os.environ.get('NEXUS_OAUTH_INFO') else '[NOT SET]'}")
pretty_cmd = ' '.join([f'"{arg}"' if ' ' in arg else arg for arg in cmd])
print(f"{COLOR_INFO}Launching Jackify Install Engine with command:{COLOR_RESET} {pretty_cmd}")
@@ -725,9 +840,11 @@ class ModlistInstallCLI:
else:
self.logger.warning(f"File descriptor limit: {message}")
# Popen now inherits the modified os.environ because env=None
# Use cleaned environment to prevent AppImage variable inheritance
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
clean_env = get_clean_subprocess_env()
# Store process reference for cleanup
self._current_process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)
self._current_process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=clean_env, cwd=engine_dir)
proc = self._current_process
# Read output in binary mode to properly handle carriage returns
@@ -742,11 +859,29 @@ class ModlistInstallCLI:
if chunk == b'\n':
# Complete line - decode and print
line = buffer.decode('utf-8', errors='replace')
# Filter FILE_PROGRESS spam but keep the status line before it
if '[FILE_PROGRESS]' in line:
parts = line.split('[FILE_PROGRESS]', 1)
if parts[0].strip():
line = parts[0].rstrip()
else:
# Skip this line entirely if it's only FILE_PROGRESS
buffer = b''
continue
print(line, end='')
buffer = b''
elif chunk == b'\r':
# Carriage return - decode and print without newline
line = buffer.decode('utf-8', errors='replace')
# Filter FILE_PROGRESS spam but keep the status line before it
if '[FILE_PROGRESS]' in line:
parts = line.split('[FILE_PROGRESS]', 1)
if parts[0].strip():
line = parts[0].rstrip()
else:
# Skip this line entirely if it's only FILE_PROGRESS
buffer = b''
continue
print(line, end='')
sys.stdout.flush()
buffer = b''
@@ -754,7 +889,16 @@ class ModlistInstallCLI:
# Print any remaining buffer content
if buffer:
line = buffer.decode('utf-8', errors='replace')
print(line, end='')
# Filter FILE_PROGRESS spam but keep the status line before it
if '[FILE_PROGRESS]' in line:
parts = line.split('[FILE_PROGRESS]', 1)
if parts[0].strip():
line = parts[0].rstrip()
else:
# Skip this line entirely if it's only FILE_PROGRESS
line = ''
if line:
print(line, end='')
proc.wait()
# Clear process reference after completion
@@ -1248,13 +1392,16 @@ class ModlistInstallCLI:
from jackify.backend.services.native_steam_service import NativeSteamService
steam_service = NativeSteamService()
# Get user's preferred Proton version
proton_version = _get_user_proton_version()
success, app_id = steam_service.create_shortcut_with_proton(
app_name=config_context['name'],
exe_path=config_context['mo2_exe_path'],
start_dir=os.path.dirname(config_context['mo2_exe_path']),
launch_options="%command%",
tags=["Jackify"],
proton_version="proton_experimental"
proton_version=proton_version
)
if not success or not app_id:
@@ -1463,9 +1610,21 @@ class ModlistInstallCLI:
if isinstance(download_dir_display, tuple):
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
print(f"Download Directory: {download_dir_display}")
if self.context.get('nexus_api_key'):
print(f"Nexus API Key: [SET]")
# Show authentication method
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
authenticated, method, username = auth_service.get_auth_status()
if method == 'oauth':
auth_display = f"Nexus Authentication: OAuth"
if username:
auth_display += f" ({username})"
elif method == 'api_key':
auth_display = "Nexus Authentication: API Key (Legacy)"
else:
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
# Should never reach here since we validate auth before getting to summary
auth_display = "Nexus Authentication: Unknown"
print(auth_display)
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")

View File

@@ -0,0 +1,3 @@
"""
Data package for static configuration and reference data.
"""

View File

@@ -0,0 +1,46 @@
"""
TTW-Compatible Modlists Configuration
Defines which Fallout New Vegas modlists support Tale of Two Wastelands.
This whitelist determines when Jackify should offer TTW installation after
a successful modlist installation.
"""
TTW_COMPATIBLE_MODLISTS = {
# Exact modlist names that support/require TTW
"exact_matches": [
"Begin Again",
"Uranium Fever",
"The Badlands",
"Wild Card TTW",
],
# Pattern matching for modlist names (regex)
"patterns": [
r".*TTW.*", # Any modlist with TTW in name
r".*Tale.*Two.*Wastelands.*",
]
}
def is_ttw_compatible(modlist_name: str) -> bool:
"""Check if modlist name matches TTW compatibility criteria
Args:
modlist_name: Name of the modlist to check
Returns:
bool: True if modlist is TTW-compatible, False otherwise
"""
import re
# Check exact matches
if modlist_name in TTW_COMPATIBLE_MODLISTS['exact_matches']:
return True
# Check pattern matches
for pattern in TTW_COMPATIBLE_MODLISTS['patterns']:
if re.match(pattern, modlist_name, re.IGNORECASE):
return True
return False
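A brief usage example for `is_ttw_compatible` (hypothetical modlist names; not part of the diff above):

```python
# Hypothetical names to show the two match paths and a non-match
print(is_ttw_compatible("Begin Again"))         # True  - exact whitelist entry
print(is_ttw_compatible("Some TTW Adventure"))  # True  - matches the ".*TTW.*" pattern
print(is_ttw_compatible("Viva New Vegas"))      # False - no exact or pattern match
```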

View File

@@ -6,12 +6,15 @@ Handles application settings and configuration
"""
import os
import sys
import json
import logging
import shutil
import re
import base64
import hashlib
from pathlib import Path
from typing import Optional
# Initialize logger
logger = logging.getLogger(__name__)
@@ -20,14 +23,27 @@ logger = logging.getLogger(__name__)
class ConfigHandler:
"""
Handles application configuration and settings
Singleton pattern ensures all code shares the same instance
"""
_instance = None
_initialized = False
def __new__(cls):
if cls._instance is None:
cls._instance = super(ConfigHandler, cls).__new__(cls)
return cls._instance
def __init__(self):
"""Initialize configuration handler with default settings"""
# Only initialize once (singleton pattern)
if ConfigHandler._initialized:
return
ConfigHandler._initialized = True
self.config_dir = os.path.expanduser("~/.config/jackify")
self.config_file = os.path.join(self.config_dir, "config.json")
self.settings = {
"version": "0.0.5",
"version": "0.2.0",
"last_selected_modlist": None,
"steam_libraries": [],
"resolution": None,
@@ -38,17 +54,31 @@ class ConfigHandler:
"default_download_parent_dir": None, # Parent directory for downloads
"modlist_install_base_dir": os.path.expanduser("~/Games"), # Configurable base directory for modlist installations
"modlist_downloads_base_dir": os.path.expanduser("~/Games/Modlist_Downloads"), # Configurable base directory for downloads
"jackify_data_dir": None # Configurable Jackify data directory (default: ~/Jackify)
"jackify_data_dir": None, # Configurable Jackify data directory (default: ~/Jackify)
"use_winetricks_for_components": True, # DEPRECATED: Migrated to component_installation_method. Kept for backward compatibility.
"component_installation_method": "winetricks", # "winetricks" (default) or "system_protontricks"
"game_proton_path": None, # Proton version for game shortcuts (can be any Proton 9+), separate from install proton
"proton_path": None, # Install Proton path (for jackify-engine) - None means auto-detect
"proton_version": None, # Install Proton version name - None means auto-detect
"steam_restart_strategy": "jackify", # "jackify" (default) or "nak_simple"
"window_width": None, # Saved window width (None = use dynamic sizing)
"window_height": None # Saved window height (None = use dynamic sizing)
}
# Load configuration if exists
self._load_config()
# Perform version migrations
self._migrate_config()
# If steam_path is not set, detect it
if not self.settings["steam_path"]:
self.settings["steam_path"] = self._detect_steam_path()
# Save the updated settings
self.save_config()
# Auto-detect and set Proton version ONLY on first run (config file doesn't exist)
# Do NOT overwrite user's saved settings!
if not os.path.exists(self.config_file) and not self.settings.get("proton_path"):
self._auto_detect_proton()
# If jackify_data_dir is not set, initialize it to default
if not self.settings.get("jackify_data_dir"):
@@ -82,7 +112,8 @@ class ConfigHandler:
libraryfolders_vdf_paths = [
os.path.expanduser("~/.steam/steam/config/libraryfolders.vdf"),
os.path.expanduser("~/.local/share/Steam/config/libraryfolders.vdf"),
os.path.expanduser("~/.steam/root/config/libraryfolders.vdf")
os.path.expanduser("~/.steam/root/config/libraryfolders.vdf"),
os.path.expanduser("~/.var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf") # Flatpak
]
for vdf_path in libraryfolders_vdf_paths:
@@ -96,7 +127,10 @@ class ConfigHandler:
return None
def _load_config(self):
"""Load configuration from file"""
"""
Load configuration from file and update in-memory cache.
For legacy compatibility with initialization code.
"""
try:
if os.path.exists(self.config_file):
with open(self.config_file, 'r') as f:
@@ -109,6 +143,84 @@ class ConfigHandler:
self._create_config_dir()
except Exception as e:
logger.error(f"Error loading configuration: {e}")
def _migrate_config(self):
"""
Migrate configuration between versions
Handles breaking changes and data format updates
"""
current_version = self.settings.get("version", "0.0.0")
target_version = "0.2.0"
if current_version == target_version:
return
logger.info(f"Migrating config from {current_version} to {target_version}")
# Migration: v0.0.x -> v0.2.0
# Encryption changed from cryptography (Fernet) to pycryptodome (AES-GCM)
# Old encrypted API keys cannot be decrypted, must be re-entered
from packaging import version
if version.parse(current_version) < version.parse("0.2.0"):
# Clear old encrypted credentials
if self.settings.get("nexus_api_key"):
logger.warning("Clearing saved API key due to encryption format change")
logger.warning("Please re-enter your Nexus API key in Settings")
self.settings["nexus_api_key"] = None
# Clear OAuth token file (different encryption format)
oauth_token_file = Path(self.config_dir) / "nexus-oauth.json"
if oauth_token_file.exists():
logger.warning("Clearing saved OAuth token due to encryption format change")
logger.warning("Please re-authorize with Nexus Mods")
try:
oauth_token_file.unlink()
except Exception as e:
logger.error(f"Failed to remove old OAuth token: {e}")
# Remove obsolete keys
obsolete_keys = [
"hoolamike_install_path",
"hoolamike_version",
"api_key_fallback_enabled",
"proton_version", # Display string only, path stored in proton_path
"game_proton_version" # Display string only, path stored in game_proton_path
]
removed_count = 0
for key in obsolete_keys:
if key in self.settings:
del self.settings[key]
removed_count += 1
if removed_count > 0:
logger.info(f"Removed {removed_count} obsolete config keys")
# Update version
self.settings["version"] = target_version
self.save_config()
logger.info("Config migration completed")
def _read_config_from_disk(self):
"""
Read configuration directly from disk without caching.
Returns merged config (defaults + saved values).
"""
try:
config = self.settings.copy() # Start with defaults
if os.path.exists(self.config_file):
with open(self.config_file, 'r') as f:
saved_config = json.load(f)
config.update(saved_config)
return config
except Exception as e:
# Don't use logger here - can cause recursion if logger tries to access config
print(f"Warning: Error reading configuration from disk: {e}", file=sys.stderr)
return self.settings.copy()
def reload_config(self):
"""Reload configuration from disk to pick up external changes"""
self._load_config()
def _create_config_dir(self):
"""Create configuration directory if it doesn't exist"""
@@ -131,8 +243,12 @@ class ConfigHandler:
return False
def get(self, key, default=None):
"""Get a configuration value by key"""
return self.settings.get(key, default)
"""
Get a configuration value by key.
Always reads fresh from disk to avoid stale data.
"""
config = self._read_config_from_disk()
return config.get(key, default)
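As a quick usage sketch (not part of the diff), the fresh-read behaviour of get() means an edit written to config.json by another Jackify instance is visible without calling reload_config(); ConfigHandler is assumed to be importable from this module:
# Hedged sketch: get() re-reads config.json on every call.
cfg = ConfigHandler()
print(cfg.get("resolution"))                 # current on-disk value (or the default, None)
print(cfg.get("missing_key", "fallback"))    # defaults still apply for absent keys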
def set(self, key, value):
"""Set a configuration value"""
@@ -191,48 +307,192 @@ class ConfigHandler:
"""Get the path to protontricks executable"""
return self.settings.get("protontricks_path")
def _get_encryption_key(self) -> bytes:
"""
Generate the encryption key for API key storage using the same machine-identity-derived method as the OAuth token store
Returns:
Urlsafe base64-encoded key (Fernet-compatible format; also used to derive the AES-GCM key)
"""
import socket
import getpass
try:
hostname = socket.gethostname()
username = getpass.getuser()
# Try to get machine ID
machine_id = None
try:
with open('/etc/machine-id', 'r') as f:
machine_id = f.read().strip()
except:
try:
with open('/var/lib/dbus/machine-id', 'r') as f:
machine_id = f.read().strip()
except:
pass
if machine_id:
key_material = f"{hostname}:{username}:{machine_id}:jackify"
else:
key_material = f"{hostname}:{username}:jackify"
except Exception as e:
logger.warning(f"Failed to get machine info for encryption: {e}")
key_material = "jackify:default:key"
# Generate Fernet-compatible key
key_bytes = hashlib.sha256(key_material.encode('utf-8')).digest()
return base64.urlsafe_b64encode(key_bytes)
def _encrypt_api_key(self, api_key: str) -> str:
"""
Encrypt API key using AES-GCM
Args:
api_key: Plain text API key
Returns:
Encrypted API key string
"""
try:
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
# Derive 32-byte AES key
key = base64.urlsafe_b64decode(self._get_encryption_key())
# Generate random nonce
nonce = get_random_bytes(12)
# Encrypt with AES-GCM
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
ciphertext, tag = cipher.encrypt_and_digest(api_key.encode('utf-8'))
# Combine and encode
combined = nonce + ciphertext + tag
return base64.b64encode(combined).decode('utf-8')
except ImportError:
# Fallback to base64 if pycryptodome not available
logger.warning("pycryptodome not available, using base64 encoding (less secure)")
return base64.b64encode(api_key.encode('utf-8')).decode('utf-8')
except Exception as e:
logger.error(f"Error encrypting API key: {e}")
return ""
def _decrypt_api_key(self, encrypted_key: str) -> Optional[str]:
"""
Decrypt API key using AES-GCM
Args:
encrypted_key: Encrypted API key string
Returns:
Decrypted API key or None on failure
"""
try:
from Crypto.Cipher import AES
# Check if MODE_GCM is available (pycryptodome has it, old pycrypto doesn't)
if not hasattr(AES, 'MODE_GCM'):
# Fallback to base64 decode if old pycrypto is installed
try:
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
except:
return None
# Derive 32-byte AES key
key = base64.urlsafe_b64decode(self._get_encryption_key())
# Decode and split
combined = base64.b64decode(encrypted_key.encode('utf-8'))
nonce = combined[:12]
tag = combined[-16:]
ciphertext = combined[12:-16]
# Decrypt with AES-GCM
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
plaintext = cipher.decrypt_and_verify(ciphertext, tag)
return plaintext.decode('utf-8')
except ImportError:
# Fallback to base64 decode
try:
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
except:
return None
except AttributeError:
# Old pycrypto doesn't have MODE_GCM, fallback to base64
try:
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
except:
return None
except Exception as e:
# Might be old base64-only format, try decoding
try:
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
except:
logger.error(f"Error decrypting API key: {e}")
return None
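A minimal round-trip sketch (not part of the diff) of the nonce|ciphertext|tag layout that _encrypt_api_key writes and _decrypt_api_key splits, assuming pycryptodome is installed; the random key below stands in for the machine-derived key:
# Hedged sketch of the stored payload layout, assuming pycryptodome is available.
import base64
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(32)                                # stand-in for the derived key
cipher = AES.new(key, AES.MODE_GCM, nonce=get_random_bytes(12))
ciphertext, tag = cipher.encrypt_and_digest(b"NEXUS-API-KEY")
blob = base64.b64encode(cipher.nonce + ciphertext + tag)  # value stored in config.json

raw = base64.b64decode(blob)                              # decrypt side: split 12-byte nonce, 16-byte tag
plain = AES.new(key, AES.MODE_GCM, nonce=raw[:12]).decrypt_and_verify(raw[12:-16], raw[-16:])
assert plain == b"NEXUS-API-KEY"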
def save_api_key(self, api_key):
"""
Save Nexus API key with base64 encoding
Save Nexus API key with AES-GCM encryption
Args:
api_key (str): Plain text API key
Returns:
bool: True if saved successfully, False otherwise
"""
try:
if api_key:
# Encode the API key using base64
encoded_key = base64.b64encode(api_key.encode('utf-8')).decode('utf-8')
self.settings["nexus_api_key"] = encoded_key
logger.debug("API key saved successfully")
# Encrypt the API key using AES-GCM
encrypted_key = self._encrypt_api_key(api_key)
if not encrypted_key:
logger.error("Failed to encrypt API key")
return False
self.settings["nexus_api_key"] = encrypted_key
logger.debug("API key encrypted and saved successfully")
else:
# Clear the API key if empty
self.settings["nexus_api_key"] = None
logger.debug("API key cleared")
return self.save_config()
result = self.save_config()
# Set restrictive permissions on config file
if result:
try:
os.chmod(self.config_file, 0o600)
except Exception as e:
logger.warning(f"Could not set restrictive permissions on config: {e}")
return result
except Exception as e:
logger.error(f"Error saving API key: {e}")
return False
def get_api_key(self):
"""
Retrieve and decode the saved Nexus API key
Always reads fresh from disk to pick up changes from other instances
Retrieve and decrypt the saved Nexus API key.
Always reads fresh from disk.
Returns:
str: Decoded API key or None if not saved
str: Decrypted API key or None if not saved
"""
try:
# Reload config from disk to pick up changes from Settings dialog
self._load_config()
encoded_key = self.settings.get("nexus_api_key")
if encoded_key:
# Decode the base64 encoded key
decoded_key = base64.b64decode(encoded_key.encode('utf-8')).decode('utf-8')
return decoded_key
config = self._read_config_from_disk()
encrypted_key = config.get("nexus_api_key")
if encrypted_key:
# Decrypt the API key
decrypted_key = self._decrypt_api_key(encrypted_key)
return decrypted_key
return None
except Exception as e:
logger.error(f"Error retrieving API key: {e}")
@@ -240,15 +500,14 @@ class ConfigHandler:
def has_saved_api_key(self):
"""
Check if an API key is saved in configuration
Always reads fresh from disk to pick up changes from other instances
Check if an API key is saved in configuration.
Always reads fresh from disk.
Returns:
bool: True if API key exists, False otherwise
"""
# Reload config from disk to pick up changes from Settings dialog
self._load_config()
return self.settings.get("nexus_api_key") is not None
config = self._read_config_from_disk()
return config.get("nexus_api_key") is not None
def clear_api_key(self):
"""
@@ -494,4 +753,98 @@ class ConfigHandler:
logger.error(f"Error saving modlist downloads base directory: {e}")
return False
def get_proton_path(self):
"""
Retrieve the saved Install Proton path from configuration (for jackify-engine).
Always reads fresh from disk.
Returns:
str: Saved Install Proton path, or None if not set (indicates auto-detect mode)
"""
try:
config = self._read_config_from_disk()
proton_path = config.get("proton_path")
# Return None if missing/None/empty string - don't default to "auto"
if not proton_path:
logger.debug("proton_path not set in config - will use auto-detection")
return None
logger.debug(f"Retrieved fresh install proton_path from config: {proton_path}")
return proton_path
except Exception as e:
logger.error(f"Error retrieving install proton_path: {e}")
return None
def get_game_proton_path(self):
"""
Retrieve the saved Game Proton path from configuration (for game shortcuts).
Falls back to install Proton path if game Proton not set.
Always reads fresh from disk.
Returns:
str: Saved Game Proton path, Install Proton path, or None if not saved (indicates auto-detect mode)
"""
try:
config = self._read_config_from_disk()
game_proton_path = config.get("game_proton_path")
# If game proton not set or set to same_as_install, use install proton
if not game_proton_path or game_proton_path == "same_as_install":
game_proton_path = config.get("proton_path") # Returns None if not set
# Return None if missing/None/empty string
if not game_proton_path:
logger.debug("game_proton_path not set in config - will use auto-detection")
return None
logger.debug(f"Retrieved fresh game proton_path from config: {game_proton_path}")
return game_proton_path
except Exception as e:
logger.error(f"Error retrieving game proton_path: {e}")
return "auto"
def get_proton_version(self):
"""
Retrieve the saved Proton version from configuration.
Always reads fresh from disk.
Returns:
str: Saved Proton version or 'auto' if not saved
"""
try:
config = self._read_config_from_disk()
proton_version = config.get("proton_version", "auto")
logger.debug(f"Retrieved fresh proton_version from config: {proton_version}")
return proton_version
except Exception as e:
logger.error(f"Error retrieving proton_version: {e}")
return "auto"
def _auto_detect_proton(self):
"""Auto-detect and set best Proton version (includes GE-Proton and Valve Proton)"""
try:
from .wine_utils import WineUtils
best_proton = WineUtils.select_best_proton()
if best_proton:
self.settings["proton_path"] = str(best_proton['path'])
self.settings["proton_version"] = best_proton['name']
proton_type = best_proton.get('type', 'Unknown')
logger.info(f"Auto-detected Proton: {best_proton['name']} ({proton_type})")
self.save_config()
else:
# Set proton_path to None (will appear as null in JSON) so jackify-engine doesn't get invalid path
# Code will auto-detect on each run when proton_path is None
self.settings["proton_path"] = None
self.settings["proton_version"] = None
logger.warning("No compatible Proton versions found - proton_path set to null in config.json")
logger.info("Jackify will auto-detect Proton on each run until a valid version is found")
self.save_config()
except Exception as e:
logger.error(f"Failed to auto-detect Proton: {e}")
# Set proton_path to None (will appear as null in JSON)
self.settings["proton_path"] = None
self.settings["proton_version"] = None
logger.warning("proton_path set to null in config.json due to auto-detection failure")
self.save_config()
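For reference, a hedged illustration (not part of the diff) of the two states this can leave in config.json; the GE-Proton path below is a hypothetical example:
# Hedged illustration - hypothetical values only.
detected = {
    "proton_path": "/home/user/.steam/steam/compatibilitytools.d/GE-Proton9-20",
    "proton_version": "GE-Proton9-20",
}
not_detected = {
    "proton_path": None,     # serialised as null; auto-detection runs again on each launch
    "proton_version": None,
}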

View File

@@ -0,0 +1,317 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
ENB Handler Module
Handles ENB detection and Linux compatibility configuration for modlists.
"""
import logging
import configparser
import shutil
from pathlib import Path
from typing import Dict, Any, Optional, Tuple
logger = logging.getLogger(__name__)
class ENBHandler:
"""
Handles ENB detection and configuration for Linux compatibility.
Detects ENB components in modlist installations and ensures enblocal.ini
has the required LinuxVersion=true setting in the [GLOBAL] section.
"""
def __init__(self):
"""Initialize ENB handler."""
self.logger = logger
def detect_enb_in_modlist(self, modlist_path: Path) -> Dict[str, Any]:
"""
Detect ENB components in modlist installation.
Searches for ENB configuration files:
- enbseries.ini, enblocal.ini (ENB configuration files)
Note: Does NOT check for DLL files (d3d9.dll, d3d11.dll, dxgi.dll) as these
are used by many other mods (ReShade, other graphics mods) and are not
reliable indicators of ENB presence.
Args:
modlist_path: Path to modlist installation directory
Returns:
Dict with detection results:
- has_enb: bool - True if ENB config files found
- enblocal_ini: str or None - Path to enblocal.ini if found
- enbseries_ini: str or None - Path to enbseries.ini if found
- d3d9_dll: str or None - Always None (not checked)
- d3d11_dll: str or None - Always None (not checked)
- dxgi_dll: str or None - Always None (not checked)
"""
enb_info = {
'has_enb': False,
'enblocal_ini': None,
'enbseries_ini': None,
'd3d9_dll': None,
'd3d11_dll': None,
'dxgi_dll': None
}
if not modlist_path.exists():
self.logger.warning(f"Modlist path does not exist: {modlist_path}")
return enb_info
# Search for ENB indicator files
# IMPORTANT: Only check for ENB config files (enbseries.ini, enblocal.ini)
# Do NOT check for DLL files (d3d9.dll, d3d11.dll, dxgi.dll) as these are used
# by many other mods (ReShade, other graphics mods) and are not reliable ENB indicators
enb_config_patterns = [
('**/enbseries.ini', 'enbseries_ini'),
('**/enblocal.ini', 'enblocal_ini')
]
for pattern, key in enb_config_patterns:
for file_path in modlist_path.glob(pattern):
# Skip backups and plugin data directories
if "Backup" in str(file_path) or "plugins/data" in str(file_path):
continue
enb_info['has_enb'] = True
if not enb_info[key]: # Store first match
enb_info[key] = str(file_path)
# If we detected ENB config but didn't find enblocal.ini via glob,
# use the priority-based finder
if enb_info['has_enb'] and not enb_info['enblocal_ini']:
found_ini = self.find_enblocal_ini(modlist_path)
if found_ini:
enb_info['enblocal_ini'] = str(found_ini)
return enb_info
def find_enblocal_ini(self, modlist_path: Path) -> Optional[Path]:
"""
Find enblocal.ini in modlist installation using priority-based search.
Search order (highest priority first):
1. Stock Game/Game Root directories (active locations)
2. Mods folder with Root/root subfolder (most common pattern)
3. Direct in mods/fixes folders
4. Fallback recursive search (excluding backups)
Args:
modlist_path: Path to modlist installation directory
Returns:
Path to enblocal.ini if found, None otherwise
"""
if not modlist_path.exists():
return None
# Priority 1: Stock Game/Game Root (active locations)
stock_game_names = [
"Stock Game",
"Game Root",
"STOCK GAME",
"Stock Game Folder",
"Stock Folder",
"Skyrim Stock"
]
for name in stock_game_names:
candidate = modlist_path / name / "enblocal.ini"
if candidate.exists():
self.logger.debug(f"Found enblocal.ini in Stock Game location: {candidate}")
return candidate
# Priority 2: Mods folder with Root/root subfolder
mods_dir = modlist_path / "mods"
if mods_dir.exists():
# Search for Root/root subfolders
for root_dir in mods_dir.rglob("Root"):
candidate = root_dir / "enblocal.ini"
if candidate.exists():
self.logger.debug(f"Found enblocal.ini in mods/Root: {candidate}")
return candidate
for root_dir in mods_dir.rglob("root"):
candidate = root_dir / "enblocal.ini"
if candidate.exists():
self.logger.debug(f"Found enblocal.ini in mods/root: {candidate}")
return candidate
# Priority 3: Direct in mods/fixes folders
for search_dir in [modlist_path / "mods", modlist_path / "fixes"]:
if search_dir.exists():
for enb_file in search_dir.rglob("enblocal.ini"):
# Skip backups and plugin data
if "Backup" not in str(enb_file) and "plugins/data" not in str(enb_file):
self.logger.debug(f"Found enblocal.ini in {search_dir.name}: {enb_file}")
return enb_file
# Priority 4: Fallback recursive search (exclude backups)
for enb_file in modlist_path.rglob("enblocal.ini"):
if "Backup" not in str(enb_file) and "plugins/data" not in str(enb_file):
self.logger.debug(f"Found enblocal.ini via recursive search: {enb_file}")
return enb_file
return None
def ensure_linux_version_setting(self, enblocal_ini_path: Path) -> bool:
"""
Safely ensure [GLOBAL] section exists with LinuxVersion=true in enblocal.ini.
Safety features:
- Verifies file exists before attempting modification
- Checks if [GLOBAL] section exists before adding (prevents duplicates)
- Creates backup before any write operation
- Only writes if changes are actually needed
- Handles encoding issues gracefully
- Preserves existing file structure and comments
Args:
enblocal_ini_path: Path to enblocal.ini file
Returns:
bool: True if successful or no changes needed, False on error
"""
try:
# Safety check: file must exist
if not enblocal_ini_path.exists():
self.logger.warning(f"enblocal.ini not found at: {enblocal_ini_path}")
return False
# Read existing INI with same settings as modlist_handler.py
config = configparser.ConfigParser(
allow_no_value=True,
delimiters=['=']
)
config.optionxform = str # Preserve case sensitivity
# Read with encoding handling (same pattern as modlist_handler.py)
try:
with open(enblocal_ini_path, 'r', encoding='utf-8-sig') as f:
config.read_file(f)
except UnicodeDecodeError:
with open(enblocal_ini_path, 'r', encoding='latin-1') as f:
config.read_file(f)
except configparser.DuplicateSectionError as e:
# If file has duplicate [GLOBAL] sections, log warning and skip
self.logger.warning(f"enblocal.ini has duplicate sections: {e}. Skipping modification.")
return False
# Check if [GLOBAL] section exists (case-insensitive check)
global_section_exists = False
global_section_name = None
# Find existing [GLOBAL] section (case-insensitive)
for section_name in config.sections():
if section_name.upper() == 'GLOBAL':
global_section_exists = True
global_section_name = section_name # Use actual case
break
# Check current LinuxVersion value
needs_update = False
if global_section_exists:
# Section exists - check if LinuxVersion needs updating
current_value = config.get(global_section_name, 'LinuxVersion', fallback=None)
if current_value is None or current_value.lower() != 'true':
needs_update = True
else:
# Section doesn't exist - we need to add it
needs_update = True
# If no changes needed, return success
if not needs_update:
self.logger.debug(f"enblocal.ini already has LinuxVersion=true in [GLOBAL] section")
return True
# Changes needed - create backup first
backup_path = enblocal_ini_path.with_suffix('.ini.jackify_backup')
try:
if not backup_path.exists():
shutil.copy2(enblocal_ini_path, backup_path)
self.logger.debug(f"Created backup: {backup_path}")
except Exception as e:
self.logger.warning(f"Failed to create backup: {e}. Proceeding anyway.")
# Make changes
if not global_section_exists:
# Add [GLOBAL] section (configparser will use exact case 'GLOBAL')
config.add_section('GLOBAL')
global_section_name = 'GLOBAL'
self.logger.debug("Added [GLOBAL] section to enblocal.ini")
# Set LinuxVersion=true
config.set(global_section_name, 'LinuxVersion', 'true')
self.logger.debug(f"Set LinuxVersion=true in [GLOBAL] section")
# Write back to file
with open(enblocal_ini_path, 'w', encoding='utf-8') as f:
config.write(f, space_around_delimiters=False)
self.logger.info(f"Successfully configured enblocal.ini: {enblocal_ini_path}")
return True
except configparser.DuplicateSectionError as e:
# Handle duplicate sections gracefully
self.logger.error(f"enblocal.ini has duplicate [GLOBAL] sections: {e}")
return False
except configparser.Error as e:
# Handle other configparser errors
self.logger.error(f"ConfigParser error reading enblocal.ini: {e}")
return False
except Exception as e:
# Handle any other errors
self.logger.error(f"Unexpected error configuring enblocal.ini: {e}", exc_info=True)
return False
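A small verification sketch (not part of the diff), assuming ENBHandler is importable and using a hypothetical temporary INI path, showing the end state the method guarantees:
# Hedged sketch - hypothetical path; ENBHandler assumed importable from this module.
import configparser
from pathlib import Path

ini = Path("/tmp/enblocal.ini")
ini.write_text("[PROXY]\nEnableProxyLibrary=false\n")     # file without a [GLOBAL] section

ENBHandler().ensure_linux_version_setting(ini)             # adds [GLOBAL] with LinuxVersion=true

check = configparser.ConfigParser(allow_no_value=True, delimiters=["="])
check.read(ini)
assert check.get("GLOBAL", "LinuxVersion") == "true"
print(ini.with_suffix(".ini.jackify_backup").exists())     # one-time backup created first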
def configure_enb_for_linux(self, modlist_path: Path) -> Tuple[bool, Optional[str], bool]:
"""
Main entry point: detect ENB and configure enblocal.ini.
Safe for modlists without ENB - returns success with no message.
Args:
modlist_path: Path to modlist installation directory
Returns:
Tuple[bool, Optional[str], bool]: (success, message, enb_detected)
- success: True if successful or no ENB detected, False on error
- message: Human-readable message (None if no action taken)
- enb_detected: True if ENB was detected, False otherwise
"""
try:
# Step 1: Detect ENB (safe - just searches for files)
enb_info = self.detect_enb_in_modlist(modlist_path)
enb_detected = enb_info.get('has_enb', False)
# Step 2: If no ENB detected, return success (no action needed)
if not enb_detected:
return (True, None, False) # Safe: no ENB, nothing to do
# Step 3: Find enblocal.ini
enblocal_path = enb_info.get('enblocal_ini')
if not enblocal_path:
# ENB detected but no enblocal.ini found - this is unusual but not an error
self.logger.warning("ENB detected but enblocal.ini not found - may be configured elsewhere")
return (True, None, True) # ENB detected but no config file
# Step 4: Configure enblocal.ini (safe method with all checks)
enblocal_path_obj = Path(enblocal_path)
success = self.ensure_linux_version_setting(enblocal_path_obj)
if success:
return (True, "ENB configured for Linux compatibility", True)
else:
# Non-blocking: log error but don't fail workflow
return (False, "Failed to configure ENB (see logs for details)", True)
except Exception as e:
# Catch-all error handling - never break the workflow
self.logger.error(f"Error in ENB configuration: {e}", exc_info=True)
return (False, "ENB configuration error (see logs)", False)
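A short caller-side sketch (not part of the diff) of the (success, message, enb_detected) contract, assuming ENBHandler is importable; the modlist path is hypothetical:
# Hedged usage sketch - hypothetical modlist path.
from pathlib import Path

success, message, enb_detected = ENBHandler().configure_enb_for_linux(
    Path("~/Games/MyModlist").expanduser()
)
if enb_detected and not success:
    print(f"ENB found but not configured: {message}")   # non-blocking per the method contract
elif message:
    print(message)                                       # "ENB configured for Linux compatibility"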

View File

@@ -604,6 +604,11 @@ class FileSystemHandler:
"""
Create required directories for a game modlist
This includes both Linux home directories and Wine prefix directories.
Creating the Wine prefix Documents directories is critical for USVFS
to work properly on first launch - USVFS needs the target directory
to exist before it can virtualize profile INI files.
Args:
game_name: Name of the game (e.g., skyrimse, fallout4)
appid: Steam AppID of the modlist
@@ -614,13 +619,24 @@ class FileSystemHandler:
try:
# Define base paths
home_dir = os.path.expanduser("~")
# Game-specific Documents directory names (for both Linux home and Wine prefix)
game_docs_dirs = {
"skyrimse": "Skyrim Special Edition",
"fallout4": "Fallout4",
"falloutnv": "FalloutNV",
"oblivion": "Oblivion",
"enderal": "Enderal Special Edition",
"enderalse": "Enderal Special Edition"
}
game_dirs = {
# Common directories needed across all games
"common": [
os.path.join(home_dir, ".local", "share", "Steam", "steamapps", "compatdata", appid, "pfx"),
os.path.join(home_dir, ".steam", "steam", "steamapps", "compatdata", appid, "pfx")
],
# Game-specific directories
# Game-specific directories in Linux home (legacy, may not be needed)
"skyrimse": [
os.path.join(home_dir, "Documents", "My Games", "Skyrim Special Edition"),
],
@@ -635,18 +651,52 @@ class FileSystemHandler:
]
}
# Create common directories
# Create common directories (compatdata pfx paths)
for dir_path in game_dirs["common"]:
if dir_path and os.path.exists(os.path.dirname(dir_path)):
os.makedirs(dir_path, exist_ok=True)
self.logger.debug(f"Created directory: {dir_path}")
# Create game-specific directories
# Create game-specific directories in Linux home (legacy support)
if game_name in game_dirs:
for dir_path in game_dirs[game_name]:
os.makedirs(dir_path, exist_ok=True)
self.logger.debug(f"Created game-specific directory: {dir_path}")
# CRITICAL: Create game-specific Documents directories in Wine prefix
# This is required for USVFS to virtualize profile INI files on first launch
if game_name in game_docs_dirs:
docs_dir_name = game_docs_dirs[game_name]
# Find compatdata path for this AppID
from ..handlers.path_handler import PathHandler
path_handler = PathHandler()
compatdata_path = path_handler.find_compat_data(appid)
if compatdata_path:
# Create Documents/My Games/{GameName} in Wine prefix
wine_docs_path = os.path.join(
str(compatdata_path),
"pfx",
"drive_c",
"users",
"steamuser",
"Documents",
"My Games",
docs_dir_name
)
try:
os.makedirs(wine_docs_path, exist_ok=True)
self.logger.info(f"Created Wine prefix Documents directory for USVFS: {wine_docs_path}")
self.logger.debug(f"This allows USVFS to virtualize profile INI files on first launch")
except Exception as e:
self.logger.warning(f"Could not create Wine prefix Documents directory {wine_docs_path}: {e}")
# Don't fail completely - this is a first-launch optimization
else:
self.logger.warning(f"Could not find compatdata path for AppID {appid}, skipping Wine prefix Documents directory creation")
self.logger.debug("Wine prefix Documents directories will be created when game runs for first time")
return True
except Exception as e:
self.logger.error(f"Error creating required directories: {e}")
@@ -671,59 +721,75 @@ class FileSystemHandler:
return True
@staticmethod
def set_ownership_and_permissions_sudo(path: Path, status_callback=None) -> bool:
"""Change ownership and permissions using sudo (robust, with timeout and re-prompt)."""
def verify_ownership_and_permissions(path: Path) -> tuple[bool, str]:
"""
Verify and fix ownership/permissions for modlist directory.
Returns (success, error_message).
Logic:
- If files NOT owned by user: Can't fix without sudo, return error with instructions
- If files owned by user: Try to fix permissions ourselves with chmod
"""
if not path.exists():
logger.error(f"Path does not exist: {path}")
return False
# Check if all files/dirs are already owned by the user
if FileSystemHandler.all_owned_by_user(path):
logger.info(f"All files in {path} are already owned by the current user. Skipping sudo chown/chmod.")
return True
return False, f"Path does not exist: {path}"
# Check if all files/dirs are owned by the user
if not FileSystemHandler.all_owned_by_user(path):
# Files not owned by us - need sudo to fix
try:
user_name = pwd.getpwuid(os.geteuid()).pw_name
group_name = grp.getgrgid(os.geteuid()).gr_name
except KeyError:
logger.error("Could not determine current user or group name.")
return False, "Could not determine current user or group name."
logger.error(f"Ownership issue detected: Some files in {path} are not owned by {user_name}")
error_msg = (
f"\nOwnership Issue Detected\n"
f"Some files in the modlist directory are not owned by your user account.\n"
f"This can happen if the modlist was copied from another location or installed by a different user.\n\n"
f"To fix this, open a terminal and run:\n\n"
f" sudo chown -R {user_name}:{group_name} \"{path}\"\n"
f" sudo chmod -R 755 \"{path}\"\n\n"
f"After running these commands, retry the configuration process."
)
return False, error_msg
# Files are owned by us - try to fix permissions ourselves
logger.info(f"Files in {path} are owned by current user, verifying permissions...")
try:
user_name = pwd.getpwuid(os.geteuid()).pw_name
group_name = grp.getgrgid(os.geteuid()).gr_name
except KeyError:
logger.error("Could not determine current user or group name.")
return False
result = subprocess.run(
['chmod', '-R', '755', str(path)],
capture_output=True,
text=True,
check=False
)
if result.returncode == 0:
logger.info(f"Permissions set successfully for {path}")
return True, ""
else:
logger.warning(f"chmod returned non-zero but we'll continue: {result.stderr}")
# Non-critical if chmod fails on our own files, might be read-only filesystem or similar
return True, ""
except Exception as e:
logger.warning(f"Error running chmod: {e}, continuing anyway")
# Non-critical error, we own the files so proceed
return True, ""
log_msg = f"Applying ownership/permissions for {path} (user: {user_name}, group: {group_name}) via sudo."
logger.info(log_msg)
if status_callback:
status_callback(f"Setting ownership/permissions for {os.path.basename(str(path))}...")
else:
print(f'\n{COLOR_PROMPT}Adjusting permissions for {path} (may require sudo password)...{COLOR_RESET}')
def run_sudo_with_retries(cmd, desc, max_retries=3, timeout=300):
for attempt in range(max_retries):
try:
logger.info(f"Running sudo command (attempt {attempt+1}/{max_retries}): {' '.join(cmd)}")
result = subprocess.run(cmd, capture_output=True, text=True, check=False, timeout=timeout)
if result.returncode == 0:
return True
else:
logger.error(f"sudo {desc} failed. Error: {result.stderr.strip()}")
print(f"Error: Failed to {desc}. Check logs.")
return False
except subprocess.TimeoutExpired:
logger.error(f"sudo {desc} timed out (attempt {attempt+1}/{max_retries}).")
print(f"\nSudo prompt timed out after {timeout} seconds. Please try again.")
# Flush input if possible, then retry
print(f"Failed to {desc} after {max_retries} attempts. Aborting.")
return False
# Run chown with retries
chown_command = ['sudo', 'chown', '-R', f'{user_name}:{group_name}', str(path)]
if not run_sudo_with_retries(chown_command, "change ownership"):
return False
print()
# Run chmod with retries
chmod_command = ['sudo', 'chmod', '-R', '755', str(path)]
if not run_sudo_with_retries(chmod_command, "set permissions"):
return False
print()
logger.info("Permissions set successfully.")
return True
@staticmethod
def set_ownership_and_permissions_sudo(path: Path, status_callback=None) -> bool:
"""
DEPRECATED: Use verify_ownership_and_permissions() instead.
This method is kept for backwards compatibility but no longer executes sudo.
"""
logger.warning("set_ownership_and_permissions_sudo() is deprecated - use verify_ownership_and_permissions()")
success, error_msg = FileSystemHandler.verify_ownership_and_permissions(path)
if not success:
logger.error(error_msg)
print(error_msg)
return success
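A caller-side sketch (not part of the diff) of the new tuple contract, assuming FileSystemHandler is importable; the path is hypothetical:
# Hedged sketch - hypothetical path; returns (success, error_message).
from pathlib import Path

ok, error_msg = FileSystemHandler.verify_ownership_and_permissions(
    Path("~/Games/MyModlist").expanduser()
)
if not ok:
    print(error_msg)    # includes the exact sudo chown/chmod commands to run manually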
def download_file(self, url: str, destination_path: Path, overwrite: bool = False, quiet: bool = False) -> bool:
"""Downloads a file from a URL to a destination path."""
@@ -784,7 +850,8 @@ class FileSystemHandler:
possible_vdf_paths = [
Path.home() / ".steam/steam/config/libraryfolders.vdf",
Path.home() / ".local/share/Steam/config/libraryfolders.vdf",
Path.home() / ".steam/root/config/libraryfolders.vdf"
Path.home() / ".steam/root/config/libraryfolders.vdf",
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf" # Flatpak
]
libraryfolders_vdf_path: Optional[Path] = None

View File

@@ -1,994 +0,0 @@
import logging
import os
import subprocess
import zipfile
import tarfile
from pathlib import Path
import yaml # Assuming PyYAML is installed
from typing import Dict, Optional, List
import requests
# Import necessary handlers from the current Jackify structure
from .path_handler import PathHandler
from .vdf_handler import VDFHandler # Keeping just in case
from .filesystem_handler import FileSystemHandler
from .config_handler import ConfigHandler
# Import color constants needed for print statements in this module
from .ui_colors import COLOR_ERROR, COLOR_SUCCESS, COLOR_WARNING, COLOR_RESET, COLOR_INFO, COLOR_PROMPT, COLOR_SELECTION
# Standard logging (no file handler) - LoggingHandler import removed
from .status_utils import show_status, clear_status
from .subprocess_utils import get_clean_subprocess_env
logger = logging.getLogger(__name__)
# Define default Hoolamike AppIDs for relevant games
TARGET_GAME_APPIDS = {
'Fallout 3': '22370', # GOTY Edition
'Fallout New Vegas': '22380', # Base game
'Skyrim Special Edition': '489830',
'Oblivion': '22330', # GOTY Edition
'Fallout 4': '377160'
}
# Define the expected name of the native Hoolamike executable
HOOLAMIKE_EXECUTABLE_NAME = "hoolamike" # Assuming this is the binary name
# Keep consistent with logs directory - use ~/Jackify/ for user-visible managed components
JACKIFY_BASE_DIR = Path.home() / "Jackify"
# Use Jackify base directory for ALL Hoolamike-related files to centralize management
DEFAULT_HOOLAMIKE_APP_INSTALL_DIR = JACKIFY_BASE_DIR / "Hoolamike"
HOOLAMIKE_CONFIG_DIR = DEFAULT_HOOLAMIKE_APP_INSTALL_DIR
HOOLAMIKE_CONFIG_FILENAME = "hoolamike.yaml"
# Default dirs for other components
DEFAULT_HOOLAMIKE_DOWNLOADS_DIR = JACKIFY_BASE_DIR / "Mod_Downloads"
DEFAULT_MODLIST_INSTALL_BASE_DIR = Path.home() / "ModdedGames"
class HoolamikeHandler:
"""Handles discovery, configuration, and execution of Hoolamike tasks.
Assumes Hoolamike is a native Linux CLI application.
"""
def __init__(self, steamdeck: bool, verbose: bool, filesystem_handler: FileSystemHandler, config_handler: ConfigHandler, menu_handler=None):
"""Initialize the handler and perform initial discovery."""
self.steamdeck = steamdeck
self.verbose = verbose
self.path_handler = PathHandler()
self.filesystem_handler = filesystem_handler
self.config_handler = config_handler
self.menu_handler = menu_handler
# Use standard logging (no file handler)
self.logger = logging.getLogger(__name__)
# --- Discovered/Managed State ---
self.game_install_paths: Dict[str, Path] = {}
# Allow user override for Hoolamike app install path later
self.hoolamike_app_install_path: Path = DEFAULT_HOOLAMIKE_APP_INSTALL_DIR
self.hoolamike_executable_path: Optional[Path] = None # Path to the binary
self.hoolamike_installed: bool = False
self.hoolamike_config_path: Path = HOOLAMIKE_CONFIG_DIR / HOOLAMIKE_CONFIG_FILENAME
self.hoolamike_config: Optional[Dict] = None
# Load Hoolamike install path from Jackify config if it exists
saved_path_str = self.config_handler.get('hoolamike_install_path')
if saved_path_str and Path(saved_path_str).is_dir(): # Basic check if path exists
self.hoolamike_app_install_path = Path(saved_path_str)
self.logger.info(f"Loaded Hoolamike install path from Jackify config: {self.hoolamike_app_install_path}")
self._load_hoolamike_config()
self._run_discovery()
def _ensure_hoolamike_dirs_exist(self):
"""Ensure base directories for Hoolamike exist."""
try:
HOOLAMIKE_CONFIG_DIR.mkdir(parents=True, exist_ok=True) # Separate Hoolamike config
self.hoolamike_app_install_path.mkdir(parents=True, exist_ok=True) # Install dir (~/Jackify/Hoolamike)
# Default downloads dir also needs to exist if we reference it
DEFAULT_HOOLAMIKE_DOWNLOADS_DIR.mkdir(parents=True, exist_ok=True)
except OSError as e:
self.logger.error(f"Error creating Hoolamike directories: {e}", exc_info=True)
# Decide how to handle this - maybe raise an exception?
def _check_hoolamike_installation(self):
"""Check if Hoolamike executable exists at the expected location.
Prioritizes path stored in config if available.
"""
potential_exe_path = self.hoolamike_app_install_path / HOOLAMIKE_EXECUTABLE_NAME
check_path = None
if potential_exe_path.is_file() and os.access(potential_exe_path, os.X_OK):
check_path = potential_exe_path
self.logger.info(f"Found Hoolamike at current path: {check_path}")
else:
self.logger.info(f"Hoolamike executable ({HOOLAMIKE_EXECUTABLE_NAME}) not found or not executable at current path {self.hoolamike_app_install_path}.")
# Update state based on whether we found a valid path
if check_path:
self.hoolamike_installed = True
self.hoolamike_executable_path = check_path
else:
self.hoolamike_installed = False
self.hoolamike_executable_path = None
def _generate_default_config(self) -> Dict:
"""Generates the default configuration dictionary."""
self.logger.info("Generating default Hoolamike config structure.")
# Detection is now handled separately after loading config
detected_paths = self.path_handler.find_game_install_paths(TARGET_GAME_APPIDS)
config = {
"downloaders": {
"downloads_directory": str(DEFAULT_HOOLAMIKE_DOWNLOADS_DIR),
"nexus": {"api_key": "YOUR_API_KEY_HERE"}
},
"installation": {
"wabbajack_file_path": "", # Placeholder, set per-run
"installation_path": "" # Placeholder, set per-run
},
"games": { # Only include detected games with consistent formatting (no spaces)
self._format_game_name(game_name): {"root_directory": str(path)}
for game_name, path in detected_paths.items()
},
"fixup": {
"game_resolution": "1920x1080"
},
"extras": {
"tale_of_two_wastelands": {
"path_to_ttw_mpi_file": "", # Placeholder
"variables": {
"DESTINATION": "" # Placeholder
}
}
}
}
# Add comment if no games detected
if not detected_paths:
# This won't appear in YAML, logic adjusted below
pass
return config
def _format_game_name(self, game_name: str) -> str:
"""Formats game name for Hoolamike configuration (removes spaces).
Hoolamike expects game names without spaces like: Fallout3, FalloutNewVegas, SkyrimSpecialEdition
"""
# Handle specific game name formats that Hoolamike expects
game_name_map = {
"Fallout 3": "Fallout3",
"Fallout New Vegas": "FalloutNewVegas",
"Skyrim Special Edition": "SkyrimSpecialEdition",
"Fallout 4": "Fallout4",
"Oblivion": "Oblivion" # No change needed
}
# Use predefined mapping if available
if game_name in game_name_map:
return game_name_map[game_name]
# Otherwise, just remove spaces as fallback
return game_name.replace(" ", "")
def _load_hoolamike_config(self):
"""Load hoolamike.yaml if it exists, or generate a default one."""
self._ensure_hoolamike_dirs_exist() # Ensure parent dir exists
if self.hoolamike_config_path.is_file():
self.logger.info(f"Found existing hoolamike.yaml at {self.hoolamike_config_path}. Loading...")
try:
with open(self.hoolamike_config_path, 'r', encoding='utf-8') as f:
self.hoolamike_config = yaml.safe_load(f)
if not isinstance(self.hoolamike_config, dict):
self.logger.warning(f"Failed to parse hoolamike.yaml as a dictionary. Generating default.")
self.hoolamike_config = self._generate_default_config()
self.save_hoolamike_config() # Save the newly generated default
else:
self.logger.info("Successfully loaded hoolamike.yaml configuration.")
# Game path merging is handled in _run_discovery now
except yaml.YAMLError as e:
self.logger.error(f"Error parsing hoolamike.yaml: {e}. The file may be corrupted.")
# Don't automatically overwrite - let user decide
self.hoolamike_config = None
return False
except Exception as e:
self.logger.error(f"Error reading hoolamike.yaml: {e}.", exc_info=True)
# Don't automatically overwrite - let user decide
self.hoolamike_config = None
return False
else:
self.logger.info(f"hoolamike.yaml not found at {self.hoolamike_config_path}. Generating default configuration.")
self.hoolamike_config = self._generate_default_config()
self.save_hoolamike_config()
return True
def save_hoolamike_config(self):
"""Saves the current configuration dictionary to hoolamike.yaml."""
if self.hoolamike_config is None:
self.logger.error("Cannot save config, internal config dictionary is None.")
return False
self._ensure_hoolamike_dirs_exist() # Ensure parent dir exists
self.logger.info(f"Saving configuration to {self.hoolamike_config_path}")
try:
with open(self.hoolamike_config_path, 'w', encoding='utf-8') as f:
# Add comments conditionally
f.write("# Configuration file created or updated by Jackify\n")
if not self.hoolamike_config.get("games"):
f.write("# No games were detected by Jackify. Add game paths manually if needed.\n")
# Dump the actual YAML
yaml.dump(self.hoolamike_config, f, default_flow_style=False, sort_keys=False)
self.logger.info("Configuration saved successfully.")
return True
except Exception as e:
self.logger.error(f"Error saving hoolamike.yaml: {e}", exc_info=True)
return False
def _run_discovery(self):
"""Execute all discovery steps."""
self.logger.info("Starting Hoolamike feature discovery phase...")
# Detect game paths and update internal state + config
self._detect_and_update_game_paths()
self.logger.info("Hoolamike discovery phase complete.")
def _detect_and_update_game_paths(self):
"""Detect game install paths and update state and config."""
self.logger.info("Detecting game install paths...")
# Always run detection
detected_paths = self.path_handler.find_game_install_paths(TARGET_GAME_APPIDS)
self.game_install_paths = detected_paths # Update internal state
self.logger.info(f"Detected game paths: {detected_paths}")
# Update the loaded config if it exists
if self.hoolamike_config is not None:
self.logger.debug("Updating loaded hoolamike.yaml with detected game paths.")
if "games" not in self.hoolamike_config or not isinstance(self.hoolamike_config.get("games"), dict):
self.hoolamike_config["games"] = {} # Ensure games section exists
# Define a unified format for game names in config - no spaces
# Clear existing entries first to avoid duplicates
self.hoolamike_config["games"] = {}
# Add detected paths with proper formatting - no spaces
for game_name, detected_path in detected_paths.items():
formatted_name = self._format_game_name(game_name)
self.hoolamike_config["games"][formatted_name] = {"root_directory": str(detected_path)}
self.logger.info(f"Updated config with {len(detected_paths)} game paths using correct naming format (no spaces)")
else:
self.logger.warning("Cannot update game paths in config because config is not loaded.")
# --- Methods for Hoolamike Tasks (To be implemented later) ---
# TODO: Update these methods to accept necessary parameters and update/save config
def install_update_hoolamike(self, context=None) -> bool:
"""Install or update Hoolamike application.
Returns:
bool: True if installation/update was successful or process was properly cancelled,
False if a critical error occurred.
"""
self.logger.info("Starting Hoolamike Installation/Update...")
print("\nStarting Hoolamike Installation/Update...")
# 1. Prompt user to install/reinstall/update
try:
# Check if Hoolamike is already installed at the expected path
self._check_hoolamike_installation()
if self.hoolamike_installed:
self.logger.info(f"Hoolamike appears to be installed at: {self.hoolamike_executable_path}")
print(f"{COLOR_INFO}Hoolamike is already installed at:{COLOR_RESET}")
print(f" {self.hoolamike_executable_path}")
# Use a menu-style prompt for reinstall/update
print(f"\n{COLOR_PROMPT}Choose an action for Hoolamike:{COLOR_RESET}")
print(f" 1. Reinstall/Update Hoolamike")
print(f" 2. Keep existing installation (return to menu)")
while True:
choice = input(f"Select an option [1-2]: ").strip()
if choice == '1':
self.logger.info("User chose to reinstall/update Hoolamike.")
break
elif choice == '2' or choice.lower() == 'q':
self.logger.info("User chose to keep existing Hoolamike installation.")
print("Skipping Hoolamike installation/update.")
return True
else:
print(f"{COLOR_WARNING}Invalid choice. Please enter 1 or 2.{COLOR_RESET}")
# 2. Get installation directory from user (allow override)
self.logger.info(f"Default install path: {self.hoolamike_app_install_path}")
print("\nHoolamike Installation Directory:")
print(f"Default: {self.hoolamike_app_install_path}")
install_dir = self.menu_handler.get_directory_path(
prompt_message=f"Specify where to install Hoolamike (or press Enter for default)",
default_path=self.hoolamike_app_install_path,
create_if_missing=True,
no_header=True
)
if install_dir is None:
self.logger.warning("User cancelled Hoolamike installation path selection.")
print("Installation cancelled.")
return True
# Check if hoolamike already exists at this specific path
potential_existing_exe = install_dir / HOOLAMIKE_EXECUTABLE_NAME
if potential_existing_exe.is_file() and os.access(potential_existing_exe, os.X_OK):
self.logger.info(f"Hoolamike executable found at the chosen path: {potential_existing_exe}")
print(f"{COLOR_INFO}Hoolamike appears to already be installed at:{COLOR_RESET}")
print(f" {install_dir}")
# Use menu-style prompt for overwrite
print(f"{COLOR_PROMPT}Choose an action for the existing installation:{COLOR_RESET}")
print(f" 1. Download and overwrite (update)")
print(f" 2. Keep existing installation (return to menu)")
while True:
overwrite_choice = input(f"Select an option [1-2]: ").strip()
if overwrite_choice == '1':
self.logger.info("User chose to update (overwrite) existing Hoolamike installation.")
break
elif overwrite_choice == '2' or overwrite_choice.lower() == 'q':
self.logger.info("User chose to keep existing Hoolamike installation at chosen path.")
print("Update cancelled. Using existing installation for this session.")
self.hoolamike_app_install_path = install_dir
self.hoolamike_executable_path = potential_existing_exe
self.hoolamike_installed = True
return True
else:
print(f"{COLOR_WARNING}Invalid choice. Please enter 1 or 2.{COLOR_RESET}")
# Proceed with install/update
self.logger.info(f"Proceeding with installation to directory: {install_dir}")
self.hoolamike_app_install_path = install_dir
# Get latest release info from GitHub
release_url = "https://api.github.com/repos/Niedzwiedzw/hoolamike/releases/latest"
download_url = None
asset_name = None
try:
self.logger.info(f"Fetching latest release info from {release_url}")
show_status("Fetching latest Hoolamike release info...")
response = requests.get(release_url, timeout=15, verify=True)
response.raise_for_status()
release_data = response.json()
self.logger.debug(f"GitHub Release Data: {release_data}")
linux_tar_asset = None
linux_zip_asset = None
for asset in release_data.get('assets', []):
name = asset.get('name', '').lower()
self.logger.debug(f"Checking asset: {name}")
is_linux = 'linux' in name
is_x64 = 'x86_64' in name or 'amd64' in name
is_incompatible_arch = 'arm' in name or 'aarch64' in name or 'darwin' in name
if is_linux and is_x64 and not is_incompatible_arch:
if name.endswith(('.tar.gz', '.tgz')):
linux_tar_asset = asset
self.logger.debug(f"Found potential tar asset: {name}")
break
elif name.endswith('.zip') and not linux_tar_asset:
linux_zip_asset = asset
self.logger.debug(f"Found potential zip asset: {name}")
chosen_asset = linux_tar_asset or linux_zip_asset
if not chosen_asset:
clear_status()
self.logger.error("Could not find a suitable Linux x86_64 download asset (tar.gz/zip) in the latest release.")
print(f"{COLOR_ERROR}Error: Could not find a linux x86_64 download asset in the latest Hoolamike release.{COLOR_RESET}")
return False
download_url = chosen_asset.get('browser_download_url')
asset_name = chosen_asset.get('name')
if not download_url or not asset_name:
clear_status()
self.logger.error(f"Chosen asset is missing URL or name: {chosen_asset}")
print(f"{COLOR_ERROR}Error: Found asset but could not get download details.{COLOR_RESET}")
return False
self.logger.info(f"Found asset '{asset_name}' for download: {download_url}")
clear_status()
except requests.exceptions.RequestException as e:
clear_status()
self.logger.error(f"Failed to fetch release info from GitHub: {e}")
print(f"Error: Failed to contact GitHub to check for Hoolamike updates: {e}")
return False
except Exception as e:
clear_status()
self.logger.error(f"Error parsing release info: {e}", exc_info=True)
print("Error: Failed to understand release information from GitHub.")
return False
# Download the asset
show_status(f"Downloading {asset_name}...")
temp_download_path = self.hoolamike_app_install_path / asset_name
if not self.filesystem_handler.download_file(download_url, temp_download_path, overwrite=True, quiet=True):
clear_status()
self.logger.error(f"Failed to download {asset_name} from {download_url}")
print(f"{COLOR_ERROR}Error: Failed to download Hoolamike asset.{COLOR_RESET}")
return False
clear_status()
self.logger.info(f"Downloaded {asset_name} successfully to {temp_download_path}")
show_status("Extracting Hoolamike archive...")
# Extract the asset
try:
if asset_name.lower().endswith(('.tar.gz', '.tgz')):
self.logger.debug(f"Extracting tar file: {temp_download_path}")
with tarfile.open(temp_download_path, 'r:*') as tar:
tar.extractall(path=self.hoolamike_app_install_path)
self.logger.info("Extracted tar file successfully.")
elif asset_name.lower().endswith('.zip'):
self.logger.debug(f"Extracting zip file: {temp_download_path}")
with zipfile.ZipFile(temp_download_path, 'r') as zip_ref:
zip_ref.extractall(self.hoolamike_app_install_path)
self.logger.info("Extracted zip file successfully.")
else:
clear_status()
self.logger.error(f"Unknown archive format for asset: {asset_name}")
print(f"{COLOR_ERROR}Error: Unknown file type '{asset_name}'. Cannot extract.{COLOR_RESET}")
return False
clear_status()
print("Extraction complete. Setting permissions...")
except (tarfile.TarError, zipfile.BadZipFile, EOFError) as e:
clear_status()
self.logger.error(f"Failed to extract archive {temp_download_path}: {e}", exc_info=True)
print(f"{COLOR_ERROR}Error: Failed to extract downloaded file: {e}{COLOR_RESET}")
return False
except Exception as e:
clear_status()
self.logger.error(f"An unexpected error occurred during extraction: {e}", exc_info=True)
print(f"{COLOR_ERROR}An unexpected error occurred during extraction.{COLOR_RESET}")
return False
finally:
# Clean up downloaded archive
if temp_download_path.exists():
try:
temp_download_path.unlink()
self.logger.debug(f"Removed temporary download file: {temp_download_path}")
except OSError as e:
self.logger.warning(f"Could not remove temporary download file {temp_download_path}: {e}")
# Set execute permissions on the binary
executable_path = self.hoolamike_app_install_path / HOOLAMIKE_EXECUTABLE_NAME
if executable_path.is_file():
try:
show_status("Setting permissions on Hoolamike executable...")
os.chmod(executable_path, 0o755)
self.logger.info(f"Set execute permissions (+x) on {executable_path}")
clear_status()
print("Permissions set successfully.")
except OSError as e:
clear_status()
self.logger.error(f"Failed to set execute permission on {executable_path}: {e}")
print(f"{COLOR_ERROR}Error: Could not set execute permission on Hoolamike executable.{COLOR_RESET}")
else:
clear_status()
self.logger.error(f"Hoolamike executable not found after extraction at {executable_path}")
print(f"{COLOR_ERROR}Error: Hoolamike executable missing after extraction!{COLOR_RESET}")
return False
# Update self.hoolamike_installed and self.hoolamike_executable_path state
self.logger.info("Refreshing Hoolamike installation status...")
self._check_hoolamike_installation()
if not self.hoolamike_installed:
self.logger.error("Hoolamike check failed after apparent successful install/extract.")
print(f"{COLOR_ERROR}Error: Installation completed, but failed final verification check.{COLOR_RESET}")
return False
# Save install path to Jackify config
self.logger.info(f"Saving Hoolamike install path to Jackify config: {self.hoolamike_app_install_path}")
self.config_handler.set('hoolamike_install_path', str(self.hoolamike_app_install_path))
if not self.config_handler.save_config():
self.logger.warning("Failed to save Jackify config file after updating Hoolamike path.")
# Non-fatal, but warn user?
print(f"{COLOR_WARNING}Warning: Could not save installation path to main Jackify config file.{COLOR_RESET}")
print(f"{COLOR_SUCCESS}Hoolamike installation/update successful!{COLOR_RESET}")
self.logger.info("Hoolamike install/update process completed successfully.")
return True
except Exception as e:
self.logger.error(f"Error during Hoolamike installation/update: {e}", exc_info=True)
print(f"{COLOR_ERROR}Error: An unexpected error occurred during Hoolamike installation/update: {e}{COLOR_RESET}")
return False
def install_modlist(self, wabbajack_path=None, install_path=None, downloads_path=None, premium=False, api_key=None, game_resolution=None, context=None):
"""
Install a Wabbajack modlist using Hoolamike, following Jackify's Discovery/Configuration/Confirmation pattern.
"""
self.logger.info("Starting Hoolamike modlist install (Discovery Phase)")
self._check_hoolamike_installation()
menu = self.menu_handler
print(f"\n{'='*60}")
print(f"{COLOR_INFO}Hoolamike Modlist Installation{COLOR_RESET}")
print(f"{'='*60}\n")
# --- Discovery Phase ---
# 1. Auto-detect games (robust, multi-library)
detected_games = self.path_handler.find_vanilla_game_paths()
# 2. Prompt for .wabbajack file (custom prompt, only accept .wabbajack, q to exit, with tab-completion)
print()
while not wabbajack_path:
print(f"{COLOR_WARNING}This option requires a Nexus Mods Premium account for automatic downloads.{COLOR_RESET}")
print(f"If you don't have a premium account, please use the '{COLOR_SELECTION}Non-Premium Installation{COLOR_RESET}' option from the previous menu instead.\n")
print(f"Before continuing, you'll need a .wabbajack file. You can usually find these at:")
print(f" 1. {COLOR_INFO}https://build.wabbajack.org/authored_files{COLOR_RESET} - Official Wabbajack modlist repository")
print(f" 2. {COLOR_INFO}https://www.nexusmods.com/{COLOR_RESET} - Some modlist authors publish on Nexus Mods")
print(f" 3. Various Discord communities for specific modlists\n")
print(f"{COLOR_WARNING}NOTE: Download the .wabbajack file first, then continue. Enter 'q' to exit.{COLOR_RESET}\n")
# Use menu.get_existing_file_path for tab-completion
candidate = menu.get_existing_file_path(
prompt_message="Enter the path to your .wabbajack file (or 'q' to cancel):",
extension_filter=".wabbajack",
no_header=True
)
if candidate is None:
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
# If user literally typed 'q', treat as cancel
if str(candidate).strip().lower() == 'q':
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
wabbajack_path = candidate
# 3. Prompt for install directory
print()
while True:
install_path_result = menu.get_directory_path(
prompt_message="Select the directory where the modlist should be installed:",
default_path=DEFAULT_MODLIST_INSTALL_BASE_DIR / wabbajack_path.stem,
create_if_missing=True,
no_header=False
)
if not install_path_result:
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
# Handle tuple (path, should_create)
if isinstance(install_path_result, tuple):
install_path, install_should_create = install_path_result
else:
install_path, install_should_create = install_path_result, False
# Check if directory exists and is not empty
if install_path.exists() and any(install_path.iterdir()):
print(f"{COLOR_WARNING}Warning: The selected directory '{install_path}' already exists and is not empty. Its contents may be overwritten!{COLOR_RESET}")
confirm = input(f"{COLOR_PROMPT}This directory is not empty and may be overwritten. Proceed? (y/N): {COLOR_RESET}").strip().lower()
if not confirm.startswith('y'):
print(f"{COLOR_INFO}Please select a different directory.\n{COLOR_RESET}")
continue
break
# 4. Prompt for downloads directory
print()
if not downloads_path:
downloads_path_result = menu.get_directory_path(
prompt_message="Select the directory for mod downloads:",
default_path=DEFAULT_HOOLAMIKE_DOWNLOADS_DIR,
create_if_missing=True,
no_header=False
)
if not downloads_path_result:
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
# Handle tuple (path, should_create)
if isinstance(downloads_path_result, tuple):
downloads_path, downloads_should_create = downloads_path_result
else:
downloads_path, downloads_should_create = downloads_path_result, False
else:
downloads_should_create = False
# 5. Nexus API key
print()
current_api_key = self.hoolamike_config.get('downloaders', {}).get('nexus', {}).get('api_key') if self.hoolamike_config else None
if not current_api_key or current_api_key == 'YOUR_API_KEY_HERE':
api_key = menu.get_nexus_api_key(current_api_key)
if not api_key:
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
else:
api_key = current_api_key
# --- Summary & Confirmation ---
print(f"\n{'-'*60}")
print(f"{COLOR_INFO}Summary of configuration:{COLOR_RESET}")
print(f"- Wabbajack file: {wabbajack_path}")
print(f"- Install directory: {install_path}")
print(f"- Downloads directory: {downloads_path}")
print(f"- Nexus API key: [{'Set' if api_key else 'Not Set'}]")
print("- Games:")
for game in ["Fallout 3", "Fallout New Vegas", "Skyrim Special Edition", "Oblivion", "Fallout 4"]:
found = detected_games.get(game)
print(f" {game}: {found if found else 'Not Found'}")
print(f"{'-'*60}")
print(f"{COLOR_WARNING}Proceed with these settings and start Hoolamike install? (Warning: This can take MANY HOURS){COLOR_RESET}")
confirm = input(f"{COLOR_PROMPT}[Y/n]: {COLOR_RESET}").strip().lower()
if confirm and not confirm.startswith('y'):
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
# --- Actually create directories if needed ---
if install_should_create and not install_path.exists():
try:
install_path.mkdir(parents=True, exist_ok=True)
print(f"{COLOR_SUCCESS}Install directory created: {install_path}{COLOR_RESET}")
except Exception as e:
print(f"{COLOR_ERROR}Failed to create install directory: {e}{COLOR_RESET}")
return False
if downloads_should_create and not downloads_path.exists():
try:
downloads_path.mkdir(parents=True, exist_ok=True)
print(f"{COLOR_SUCCESS}Downloads directory created: {downloads_path}{COLOR_RESET}")
except Exception as e:
print(f"{COLOR_ERROR}Failed to create downloads directory: {e}{COLOR_RESET}")
return False
# --- Configuration Phase ---
# Prepare config dict
config = {
"downloaders": {
"downloads_directory": str(downloads_path),
"nexus": {"api_key": api_key}
},
"installation": {
"wabbajack_file_path": str(wabbajack_path),
"installation_path": str(install_path)
},
"games": {
self._format_game_name(game): {"root_directory": str(path)}
for game, path in detected_games.items()
},
"fixup": {
"game_resolution": "1920x1080"
},
# Default game_resolution is set via the fixup section above; no separate resolution key is written here
# "extras": {},
# No 'jackify_managed' key here
}
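# Illustrative only (not the authoritative schema): once saved, the dict above
# serializes to a hoolamike.yaml shaped roughly like:
#   downloaders:
#     downloads_directory: /path/to/downloads       # example path
#     nexus: {api_key: "<your key>"}
#   installation:
#     wabbajack_file_path: /path/to/list.wabbajack   # example path
#     installation_path: /path/to/install            # example path
#   games:
#     <formatted game name>: {root_directory: /path/to/game}
#   fixup:
#     game_resolution: 1920x1080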
self.hoolamike_config = config
if not self.save_hoolamike_config():
print(f"{COLOR_ERROR}Failed to save hoolamike.yaml. Aborting.{COLOR_RESET}")
return False
# --- Run Hoolamike ---
print(f"\n{COLOR_INFO}Starting Hoolamike...{COLOR_RESET}")
print(f"{COLOR_INFO}Streaming output below. Press Ctrl+C to cancel and return to Jackify menu.{COLOR_RESET}\n")
# Defensive: Ensure executable path is set and valid
if not self.hoolamike_executable_path or not Path(self.hoolamike_executable_path).is_file():
print(f"{COLOR_ERROR}Error: Hoolamike executable not found or not set. Please (re)install Hoolamike from the menu before continuing.{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return False
try:
cmd = [str(self.hoolamike_executable_path), "install"]
ret = subprocess.call(cmd, cwd=str(self.hoolamike_app_install_path), env=get_clean_subprocess_env())
if ret == 0:
print(f"\n{COLOR_SUCCESS}Hoolamike completed successfully!{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return True
else:
print(f"\n{COLOR_ERROR}Hoolamike process failed with exit code {ret}.{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return False
except KeyboardInterrupt:
print(f"\n{COLOR_WARNING}Hoolamike install interrupted by user. Returning to menu.{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return False
except Exception as e:
print(f"\n{COLOR_ERROR}Error running Hoolamike: {e}{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return False
def install_ttw(self, ttw_mpi_path=None, ttw_output_path=None, context=None):
"""Install Tale of Two Wastelands (TTW) using Hoolamike.
Args:
ttw_mpi_path: Path to the TTW installer .mpi file
ttw_output_path: Target installation directory for TTW
context: Optional workflow context (not used directly in this method)
Returns:
bool: True if successful, False otherwise
"""
self.logger.info(f"Starting Tale of Two Wastelands installation via Hoolamike")
self._check_hoolamike_installation()
menu = self.menu_handler
print(f"\n{'='*60}")
print(f"{COLOR_INFO}Hoolamike: Tale of Two Wastelands Installation{COLOR_RESET}")
print(f"{'='*60}\n")
print(f"This feature will install Tale of Two Wastelands (TTW) using Hoolamike.")
print(f"Requirements:")
print(f" • Fallout 3 and Fallout New Vegas must be installed and detected.")
print(f" • You must provide the path to your TTW .mpi installer file.")
print(f" • You must select an output directory for the TTW install.\n")
# Ensure config is loaded
if self.hoolamike_config is None:
loaded = self._load_hoolamike_config()
if not loaded or self.hoolamike_config is None:
self.logger.error("Failed to load or generate hoolamike.yaml configuration.")
print(f"{COLOR_ERROR}Error: Could not load or generate Hoolamike configuration. Aborting TTW install.{COLOR_RESET}")
return False
# Verify required games are in configuration
required_games = ['Fallout 3', 'Fallout New Vegas']
detected_games = self.path_handler.find_vanilla_game_paths()
missing_games = [game for game in required_games if game not in detected_games]
if missing_games:
self.logger.error(f"Missing required games for TTW installation: {', '.join(missing_games)}")
print(f"{COLOR_ERROR}Error: The following required games were not found: {', '.join(missing_games)}{COLOR_RESET}")
print("TTW requires both Fallout 3 and Fallout New Vegas to be installed.")
return False
# Prompt for TTW .mpi file
print(f"{COLOR_INFO}Please provide the path to your TTW .mpi installer file.{COLOR_RESET}")
print(f"You can download this from: {COLOR_INFO}https://mod.pub/ttw/133/files{COLOR_RESET}")
print(f"(Extract the .mpi file from the downloaded archive.)\n")
while not ttw_mpi_path:
candidate = menu.get_existing_file_path(
prompt_message="Enter the path to your TTW .mpi file (or 'q' to cancel):",
extension_filter=".mpi",
no_header=True
)
if candidate is None:
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
if str(candidate).strip().lower() == 'q':
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
ttw_mpi_path = candidate
# Prompt for output directory
print(f"\n{COLOR_INFO}Please select the output directory where TTW will be installed.{COLOR_RESET}")
print(f"(This should be an empty or new directory.)\n")
while not ttw_output_path:
ttw_output_path = menu.get_directory_path(
prompt_message="Select the TTW output directory:",
default_path=self.hoolamike_app_install_path / "TTW_Output",
create_if_missing=True,
no_header=False
)
if not ttw_output_path:
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
if ttw_output_path.exists() and any(ttw_output_path.iterdir()):
print(f"{COLOR_WARNING}Warning: The selected directory '{ttw_output_path}' already exists and is not empty. Its contents may be overwritten!{COLOR_RESET}")
confirm = input(f"{COLOR_PROMPT}This directory is not empty and may be overwritten. Proceed? (y/N): {COLOR_RESET}").strip().lower()
if not confirm.startswith('y'):
print(f"{COLOR_INFO}Please select a different directory.\n{COLOR_RESET}")
ttw_output_path = None
continue
# --- Summary & Confirmation ---
print(f"\n{'-'*60}")
print(f"{COLOR_INFO}Summary of configuration:{COLOR_RESET}")
print(f"- TTW .mpi file: {ttw_mpi_path}")
print(f"- Output directory: {ttw_output_path}")
print("- Games:")
for game in required_games:
found = detected_games.get(game)
print(f" {game}: {found if found else 'Not Found'}")
print(f"{'-'*60}")
print(f"{COLOR_WARNING}Proceed with these settings and start TTW installation? (This can take MANY HOURS){COLOR_RESET}")
confirm = input(f"{COLOR_PROMPT}[Y/n]: {COLOR_RESET}").strip().lower()
if confirm and not confirm.startswith('y'):
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
return False
# --- Always re-detect games before updating config ---
detected_games = self.path_handler.find_vanilla_game_paths()
if not detected_games:
print(f"{COLOR_ERROR}No supported games were detected on your system. TTW requires Fallout 3 and Fallout New Vegas to be installed.{COLOR_RESET}")
return False
# Update the games section with correct keys
if self.hoolamike_config is None:
self.hoolamike_config = {}
self.hoolamike_config['games'] = {
self._format_game_name(game): {"root_directory": str(path)}
for game, path in detected_games.items()
}
# Update TTW configuration
self._update_hoolamike_config_for_ttw(ttw_mpi_path, ttw_output_path)
if not self.save_hoolamike_config():
self.logger.error("Failed to save hoolamike.yaml configuration.")
print(f"{COLOR_ERROR}Error: Failed to save Hoolamike configuration.{COLOR_RESET}")
print("Attempting to continue anyway...")
# Construct command to execute
cmd = [
str(self.hoolamike_executable_path),
"tale-of-two-wastelands"
]
self.logger.info(f"Executing Hoolamike command: {' '.join(cmd)}")
print(f"\n{COLOR_INFO}Executing Hoolamike for TTW Installation...{COLOR_RESET}")
print(f"Command: {' '.join(cmd)}")
print(f"{COLOR_INFO}Streaming output below. Press Ctrl+C to cancel and return to Jackify menu.{COLOR_RESET}\n")
try:
ret = subprocess.call(cmd, cwd=str(self.hoolamike_app_install_path), env=get_clean_subprocess_env())
if ret == 0:
self.logger.info("TTW installation completed successfully.")
print(f"\n{COLOR_SUCCESS}TTW installation completed successfully!{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return True
else:
self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
print(f"\n{COLOR_ERROR}Error: TTW installation failed with exit code {ret}.{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return False
except Exception as e:
self.logger.error(f"Error executing Hoolamike TTW installation: {e}", exc_info=True)
print(f"\n{COLOR_ERROR}Error executing Hoolamike TTW installation: {e}{COLOR_RESET}")
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
return False
def _update_hoolamike_config_for_ttw(self, ttw_mpi_path: Path, ttw_output_path: Path):
"""Update the Hoolamike configuration with settings for TTW installation."""
# Ensure extras and TTW sections exist
if "extras" not in self.hoolamike_config:
self.hoolamike_config["extras"] = {}
if "tale_of_two_wastelands" not in self.hoolamike_config["extras"]:
self.hoolamike_config["extras"]["tale_of_two_wastelands"] = {
"variables": {}
}
# Update TTW configuration
ttw_config = self.hoolamike_config["extras"]["tale_of_two_wastelands"]
ttw_config["path_to_ttw_mpi_file"] = str(ttw_mpi_path)
# Ensure variables section exists
if "variables" not in ttw_config:
ttw_config["variables"] = {}
# Set destination variable
ttw_config["variables"]["DESTINATION"] = str(ttw_output_path)
# Set USERPROFILE to a Jackify-managed directory for TTW
userprofile_path = str(self.hoolamike_app_install_path / "USERPROFILE")
if "variables" not in self.hoolamike_config["extras"]["tale_of_two_wastelands"]:
self.hoolamike_config["extras"]["tale_of_two_wastelands"]["variables"] = {}
self.hoolamike_config["extras"]["tale_of_two_wastelands"]["variables"]["USERPROFILE"] = userprofile_path
# Make sure game paths are set correctly
for game in ['Fallout 3', 'Fallout New Vegas']:
if game in self.game_install_paths:
# Use the same formatted key as install_ttw so both code paths write to the same section
game_key = self._format_game_name(game)
if "games" not in self.hoolamike_config:
self.hoolamike_config["games"] = {}
if game_key not in self.hoolamike_config["games"]:
self.hoolamike_config["games"][game_key] = {}
self.hoolamike_config["games"][game_key]["root_directory"] = str(self.game_install_paths[game])
self.logger.info("Updated Hoolamike configuration with TTW settings.")
def reset_config(self):
"""Resets the hoolamike.yaml to default settings, backing up any existing file."""
if self.hoolamike_config_path.is_file():
# Create a backup with timestamp
import datetime
timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
backup_path = self.hoolamike_config_path.with_suffix(f".{timestamp}.bak")
try:
import shutil
shutil.copy2(self.hoolamike_config_path, backup_path)
self.logger.info(f"Created backup of existing config at {backup_path}")
print(f"{COLOR_INFO}Created backup of existing config at {backup_path}{COLOR_RESET}")
except Exception as e:
self.logger.error(f"Failed to create backup of config: {e}")
print(f"{COLOR_WARNING}Warning: Failed to create backup of config: {e}{COLOR_RESET}")
# Generate and save a fresh default config
self.logger.info("Generating new default configuration")
self.hoolamike_config = self._generate_default_config()
if self.save_hoolamike_config():
self.logger.info("Successfully reset config to defaults")
print(f"{COLOR_SUCCESS}Successfully reset configuration to defaults.{COLOR_RESET}")
return True
else:
self.logger.error("Failed to save new default config")
print(f"{COLOR_ERROR}Failed to save new default configuration.{COLOR_RESET}")
return False
def edit_hoolamike_config(self):
"""Opens the hoolamike.yaml file in a chosen editor, with a 0 option to return to menu."""
self.logger.info("Task: Edit Hoolamike Config started...")
self._check_hoolamike_installation()
if not self.hoolamike_installed:
self.logger.warning("Cannot edit config - Hoolamike not installed")
print(f"\n{COLOR_WARNING}Hoolamike is not installed through Jackify yet.{COLOR_RESET}")
print(f"Please use option 1 from the Hoolamike menu to install Hoolamike first.")
print(f"This will ensure that Jackify can properly manage the Hoolamike configuration.")
return False
if self.hoolamike_config is None:
self.logger.warning("Config is not loaded properly. Will attempt to fix or create.")
print(f"\n{COLOR_WARNING}Configuration file may be corrupted or not accessible.{COLOR_RESET}")
print("Options:")
print("1. Reset to default configuration (backup will be created)")
print("2. Try to edit the file anyway (may be corrupted)")
print("0. Cancel and return to menu")
choice = input("\nEnter your choice (0-2): ").strip()
if choice == "1":
if not self.reset_config():
self.logger.error("Failed to reset configuration")
print(f"{COLOR_ERROR}Failed to reset configuration. See logs for details.{COLOR_RESET}")
return
elif choice == "2":
self.logger.warning("User chose to edit potentially corrupted config")
# Continue to editing
elif choice == "0":
self.logger.info("User cancelled editing corrupted config")
print("Edit cancelled.")
return
else:
self.logger.info("User cancelled editing corrupted config")
print("Edit cancelled.")
return
if not self.hoolamike_config_path.exists():
self.logger.warning(f"Hoolamike config file does not exist at {self.hoolamike_config_path}. Generating default before editing.")
self.hoolamike_config = self._generate_default_config()
self.save_hoolamike_config()
if not self.hoolamike_config_path.exists():
self.logger.error("Failed to create config file for editing.")
print("Error: Could not create configuration file.")
return
available_editors = ["nano", "vim", "vi", "gedit", "kate", "micro"]
preferred_editor = os.environ.get("EDITOR")
found_editors = {}
import shutil
for editor_name in available_editors:
editor_path = shutil.which(editor_name)
if editor_path and editor_path not in found_editors.values():
found_editors[editor_name] = editor_path
if preferred_editor:
preferred_editor_path = shutil.which(preferred_editor)
if preferred_editor_path and preferred_editor_path not in found_editors.values():
display_name = os.path.basename(preferred_editor) if '/' in preferred_editor else preferred_editor
if display_name not in found_editors:
found_editors[display_name] = preferred_editor_path
if not found_editors:
self.logger.error("No suitable text editors found on the system.")
print(f"{COLOR_ERROR}Error: No common text editors (nano, vim, gedit, kate, micro) found.{COLOR_RESET}")
return
sorted_editor_names = sorted(found_editors.keys())
print("\nSelect an editor to open the configuration file:")
print(f"(System default EDITOR is: {preferred_editor if preferred_editor else 'Not set'})")
for i, name in enumerate(sorted_editor_names):
print(f" {i + 1}. {name}")
print(f" 0. Return to Hoolamike Menu")
while True:
try:
choice = input(f"Enter choice (0-{len(sorted_editor_names)}): ").strip()
if choice == "0":
print("Edit cancelled.")
return
choice_index = int(choice) - 1
if 0 <= choice_index < len(sorted_editor_names):
chosen_name = sorted_editor_names[choice_index]
editor_to_use_path = found_editors[chosen_name]
break
else:
print("Invalid choice.")
except ValueError:
print("Invalid input. Please enter a number.")
except KeyboardInterrupt:
print("\nEdit cancelled.")
return
if editor_to_use_path:
self.logger.info(f"Launching editor '{editor_to_use_path}' for {self.hoolamike_config_path}")
try:
process = subprocess.Popen([editor_to_use_path, str(self.hoolamike_config_path)])
process.wait()
self.logger.info(f"Editor '{editor_to_use_path}' closed. Reloading config...")
if not self._load_hoolamike_config():
self.logger.error("Failed to load config after editing. It may still be corrupted.")
print(f"{COLOR_ERROR}Warning: The configuration file could not be parsed after editing.{COLOR_RESET}")
print("You may need to fix it manually or reset it to defaults.")
return False
else:
self.logger.info("Successfully reloaded config after editing.")
print(f"{COLOR_SUCCESS}Configuration file successfully updated.{COLOR_RESET}")
return True
except FileNotFoundError:
self.logger.error(f"Editor '{editor_to_use_path}' not found unexpectedly.")
print(f"{COLOR_ERROR}Error: Editor command '{editor_to_use_path}' not found.{COLOR_RESET}")
except Exception as e:
self.logger.error(f"Error launching or waiting for editor: {e}")
print(f"{COLOR_ERROR}An error occurred while launching the editor: {e}{COLOR_RESET}")
# Example usage (for testing, remove later)
if __name__ == '__main__':
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
print("Running HoolamikeHandler discovery...")
handler = HoolamikeHandler(steamdeck=False, verbose=True)
print("\n--- Discovery Results ---")
print(f"Game Paths: {handler.game_install_paths}")
print(f"Hoolamike App Install Path: {handler.hoolamike_app_install_path}")
print(f"Hoolamike Executable: {handler.hoolamike_executable_path}")
print(f"Hoolamike Installed: {handler.hoolamike_installed}")
print(f"Hoolamike Config Path: {handler.hoolamike_config_path}")
config_loaded = isinstance(handler.hoolamike_config, dict)
print(f"Hoolamike Config Loaded: {config_loaded}")
if config_loaded:
print(f" Downloads Dir: {handler.hoolamike_config.get('downloaders', {}).get('downloads_directory')}")
print(f" API Key Set: {'Yes' if handler.hoolamike_config.get('downloaders', {}).get('nexus', {}).get('api_key') != 'YOUR_API_KEY_HERE' else 'No'}")
print("-------------------------")
# Test edit config (example)
# handler.edit_hoolamike_config()

View File

@@ -1196,7 +1196,8 @@ class InstallWabbajackHandler:
"""Displays the final success message and next steps."""
# Basic log file path (assuming standard location)
# TODO: Get log file path more reliably if needed
log_path = Path.home() / "Jackify" / "logs" / "jackify-cli.log"
from jackify.shared.paths import get_jackify_logs_dir
log_path = get_jackify_logs_dir() / "jackify-cli.log"
print("\n───────────────────────────────────────────────────────────────────")
print(f"{COLOR_INFO}Wabbajack Installation Completed Successfully!{COLOR_RESET}")

View File

@@ -14,14 +14,15 @@ import shutil
class LoggingHandler:
"""
Central logging handler for Jackify.
- Uses ~/Jackify/logs/ as the log directory.
- Uses configured Jackify data directory for logs (default: ~/Jackify/logs/).
- Supports per-function log files (e.g., jackify-install-wabbajack.log).
- Handles log rotation and log directory creation.
Usage:
logger = LoggingHandler().setup_logger('install_wabbajack', 'jackify-install-wabbajack.log')
"""
def __init__(self):
self.log_dir = Path.home() / "Jackify" / "logs"
from jackify.shared.paths import get_jackify_logs_dir
self.log_dir = get_jackify_logs_dir()
self.ensure_log_directory()
def ensure_log_directory(self) -> None:
@@ -185,5 +186,5 @@ class LoggingHandler:
return stats
def get_general_logger(self):
"""Get the general CLI logger (~/Jackify/logs/jackify-cli.log)."""
"""Get the general CLI logger ({jackify_data_dir}/logs/jackify-cli.log)."""
return self.setup_logger('jackify_cli', is_general=True)

View File

@@ -152,8 +152,10 @@ class ModlistMenuHandler:
self.path_handler = PathHandler()
self.vdf_handler = VDFHandler()
# Determine Steam Deck status (already done by ConfigHandler, use it)
self.steamdeck = config_handler.settings.get('steamdeck', False)
# Determine Steam Deck status using centralized detection
from ..services.platform_detection_service import PlatformDetectionService
platform_service = PlatformDetectionService.get_instance()
self.steamdeck = platform_service.is_steamdeck
# Create the resolution handler
self.resolution_handler = ResolutionHandler()
@@ -178,7 +180,13 @@ class ModlistMenuHandler:
self.logger.error(f"Error initializing ModlistMenuHandler: {e}")
# Initialize with defaults/empty to prevent errors
self.filesystem_handler = FileSystemHandler()
self.steamdeck = False
# Use centralized detection even in fallback
try:
from ..services.platform_detection_service import PlatformDetectionService
platform_service = PlatformDetectionService.get_instance()
self.steamdeck = platform_service.is_steamdeck
except Exception:
self.steamdeck = False # Final fallback
self.modlist_handler = None
def show_modlist_menu(self):
@@ -563,15 +571,19 @@ class ModlistMenuHandler:
self.logger.warning(f"[DEBUG] Could not find AppID for {context['name']} with exe {context['mo2_exe_path']}")
set_modlist_result = self.modlist_handler.set_modlist(context)
self.logger.debug(f"[DEBUG] set_modlist returned: {set_modlist_result}")
# Check GUI mode early to avoid input() calls in GUI context
import os
gui_mode = os.environ.get('JACKIFY_GUI_MODE') == '1'
if not set_modlist_result:
print(f"{COLOR_ERROR}\nError setting up context for configuration.{COLOR_RESET}")
self.logger.error(f"set_modlist failed for {context.get('name')}")
input(f"\n{COLOR_PROMPT}Press Enter to continue...{COLOR_RESET}")
if not gui_mode:
input(f"\n{COLOR_PROMPT}Press Enter to continue...{COLOR_RESET}")
return False
# --- Resolution selection logic for GUI mode ---
import os
gui_mode = os.environ.get('JACKIFY_GUI_MODE') == '1'
selected_resolution = context.get('resolution', None)
if gui_mode:
# If resolution is provided, set it and do not prompt
@@ -632,6 +644,29 @@ class ModlistMenuHandler:
if status_line:
print()
# Configure ENB for Linux compatibility (non-blocking, same as GUI)
enb_detected = False
try:
from ..handlers.enb_handler import ENBHandler
from pathlib import Path
enb_handler = ENBHandler()
install_dir = Path(context.get('path', ''))
if install_dir.exists():
enb_success, enb_message, enb_detected = enb_handler.configure_enb_for_linux(install_dir)
if enb_message:
if enb_success:
self.logger.info(enb_message)
update_status(enb_message)
else:
self.logger.warning(enb_message)
# Non-blocking: continue workflow even if ENB config fails
except Exception as e:
self.logger.warning(f"ENB configuration skipped due to error: {e}")
# Continue workflow - ENB config is optional
print("")
print("")
print("") # Extra blank line before completion
@@ -642,7 +677,27 @@ class ModlistMenuHandler:
print("Modlist Install and Configuration complete!")
print(f"• You should now be able to Launch '{context.get('name')}' through Steam")
print("• Congratulations and enjoy the game!")
print("Detailed log available at: ~/Jackify/logs/Configure_New_Modlist_workflow.log")
print("")
# Show ENB-specific warning if ENB was detected (replaces generic note)
if enb_detected:
print(f"{COLOR_WARNING}⚠️ ENB DETECTED{COLOR_RESET}")
print("")
print("If you plan on using ENB as part of this modlist, you will need to use")
print("one of the following Proton versions, otherwise you will have issues:")
print("")
print(" (in order of recommendation)")
print(f" {COLOR_SUCCESS}• Proton-CachyOS{COLOR_RESET}")
print(f" {COLOR_INFO}• GE-Proton 10-14 or lower{COLOR_RESET}")
print(f" {COLOR_WARNING}• Proton 9 from Valve{COLOR_RESET}")
print("")
print(f"{COLOR_WARNING}Note: Valve's Proton 10 has known ENB compatibility issues.{COLOR_RESET}")
print("")
else:
# No ENB detected - no warning needed
pass
from jackify.shared.paths import get_jackify_logs_dir
print(f"Detailed log available at: {get_jackify_logs_dir()}/Configure_New_Modlist_workflow.log")
# Only wait for input in CLI mode, not GUI mode
if not gui_mode:
input(f"{COLOR_PROMPT}Press Enter to return to the menu...{COLOR_RESET}")
@@ -851,60 +906,6 @@ class MenuHandler:
self.logger.debug("_clear_screen: Clearing screen for POSIX by printing 100 newlines.")
print("\n" * 100, flush=True)
def show_hoolamike_menu(self, cli_instance):
"""Show the Hoolamike Modlist Management menu"""
if not hasattr(cli_instance, 'hoolamike_handler') or cli_instance.hoolamike_handler is None:
try:
from .hoolamike_handler import HoolamikeHandler
cli_instance.hoolamike_handler = HoolamikeHandler(
steamdeck=getattr(cli_instance, 'steamdeck', False),
verbose=getattr(cli_instance, 'verbose', False),
filesystem_handler=getattr(cli_instance, 'filesystem_handler', None),
config_handler=getattr(cli_instance, 'config_handler', None),
menu_handler=self
)
except Exception as e:
self.logger.error(f"Failed to initialize Hoolamike features: {e}", exc_info=True)
print(f"{COLOR_ERROR}Error: Failed to initialize Hoolamike features. Check logs.{COLOR_RESET}")
input("\nPress Enter to return to the main menu...")
return # Exit this menu if handler fails
while True:
self._clear_screen()
# Banner display handled by frontend
# Use print_section_header for consistency if available, otherwise manual with COLOR_SELECTION
if hasattr(self, 'print_section_header'): # Check if method exists (it's from ui_utils)
print_section_header("Hoolamike Modlist Management")
else: # Fallback if not imported or available directly on self
print(f"{COLOR_SELECTION}Hoolamike Modlist Management{COLOR_RESET}")
print(f"{COLOR_SELECTION}{'-'*30}{COLOR_RESET}")
print(f"{COLOR_SELECTION}1.{COLOR_RESET} Install or Update Hoolamike App")
print(f"{COLOR_SELECTION}2.{COLOR_RESET} Install Modlist (Nexus Premium)")
print(f"{COLOR_SELECTION}3.{COLOR_RESET} Install Modlist (Non-Premium) {COLOR_DISABLED}(Not Implemented){COLOR_RESET}")
print(f"{COLOR_SELECTION}4.{COLOR_RESET} Install Tale of Two Wastelands (TTW)")
print(f"{COLOR_SELECTION}5.{COLOR_RESET} Edit Hoolamike Configuration")
print(f"{COLOR_SELECTION}0.{COLOR_RESET} Return to Main Menu")
selection = input(f"\n{COLOR_PROMPT}Enter your selection (0-5): {COLOR_RESET}").strip()
if selection.lower() == 'q': # Allow 'q' to re-display menu
continue
if selection == "1":
cli_instance.hoolamike_handler.install_update_hoolamike()
elif selection == "2":
cli_instance.hoolamike_handler.install_modlist(premium=True)
elif selection == "3":
print(f"{COLOR_INFO}Install Modlist (Non-Premium) is not yet implemented.{COLOR_RESET}")
input("\nPress Enter to return to the Hoolamike menu...")
elif selection == "4":
cli_instance.hoolamike_handler.install_ttw()
elif selection == "5":
cli_instance.hoolamike_handler.edit_hoolamike_config()
elif selection == "0":
break
else:
print("Invalid selection. Please try again.")
time.sleep(1)

View File

@@ -71,15 +71,19 @@ class ModlistHandler:
}
# Canonical mapping of modlist-specific Wine components (from omni-guides.sh)
# NOTE: dotnet4.x components disabled in v0.1.6.2 - replaced with universal registry fixes
MODLIST_WINE_COMPONENTS = {
"wildlander": ["dotnet472"],
"librum": ["dotnet40", "dotnet8"],
"apostasy": ["dotnet40", "dotnet8"],
"nordicsouls": ["dotnet40"],
"livingskyrim": ["dotnet40"],
"lsiv": ["dotnet40"],
"ls4": ["dotnet40"],
"lostlegacy": ["dotnet48"],
# "wildlander": ["dotnet472"], # DISABLED: Universal registry fixes replace dotnet472 installation
# "librum": ["dotnet40", "dotnet8"], # PARTIAL DISABLE: Keep dotnet8, remove dotnet40
"librum": ["dotnet8"], # dotnet40 replaced with universal registry fixes
# "apostasy": ["dotnet40", "dotnet8"], # PARTIAL DISABLE: Keep dotnet8, remove dotnet40
"apostasy": ["dotnet8"], # dotnet40 replaced with universal registry fixes
# "nordicsouls": ["dotnet40"], # DISABLED: Universal registry fixes replace dotnet40 installation
# "livingskyrim": ["dotnet40"], # DISABLED: Universal registry fixes replace dotnet40 installation
# "lsiv": ["dotnet40"], # DISABLED: Universal registry fixes replace dotnet40 installation
# "ls4": ["dotnet40"], # DISABLED: Universal registry fixes replace dotnet40 installation
# "lorerim": ["dotnet40"], # DISABLED: Universal registry fixes replace dotnet40 installation
# "lostlegacy": ["dotnet40"], # DISABLED: Universal registry fixes replace dotnet40 installation
}
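# Minimal illustration: MODLIST_WINE_COMPONENTS.get("librum") now yields ["dotnet8"];
# modlists without an entry receive no extra winetricks verbs and instead rely on the
# universal registry fixes applied later in the workflow.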
def __init__(self, steam_path_or_config: Union[Dict, str, Path, None] = None,
@@ -101,10 +105,16 @@ class ModlistHandler:
verbose: Boolean indicating if verbose output is desired.
filesystem_handler: Optional FileSystemHandler instance to use instead of creating a new one.
"""
# Use standard logging (no file handler)
# Use standard logging (propagate to root logger so messages appear in logs)
self.logger = logging.getLogger(__name__)
self.logger.propagate = False
self.logger.propagate = True
self.steamdeck = steamdeck
# DEBUG: Log ModlistHandler instantiation details for SD card path debugging
import traceback
caller_info = traceback.extract_stack()[-2] # Get caller info
self.logger.debug(f"[SD_CARD_DEBUG] ModlistHandler created: id={id(self)}, steamdeck={steamdeck}")
self.logger.debug(f"[SD_CARD_DEBUG] Created from: {caller_info.filename}:{caller_info.lineno} in {caller_info.name}()")
self.steam_path: Optional[Path] = None
self.verbose = verbose # Store verbose flag
self.mo2_path: Optional[Path] = None
@@ -158,7 +168,10 @@ class ModlistHandler:
self.stock_game_path = None
# Initialize Handlers (should happen regardless of how paths were provided)
self.protontricks_handler = ProtontricksHandler(steamdeck=self.steamdeck, logger=self.logger)
self.protontricks_handler = ProtontricksHandler(self.steamdeck, logger=self.logger)
# Initialize winetricks handler for wine component installation
from .winetricks_handler import WinetricksHandler
self.winetricks_handler = WinetricksHandler(logger=self.logger)
self.shortcut_handler = ShortcutHandler(steamdeck=self.steamdeck, verbose=self.verbose)
self.filesystem_handler = filesystem_handler if filesystem_handler else FileSystemHandler()
self.resolution_handler = ResolutionHandler()
@@ -224,44 +237,41 @@ class ModlistHandler:
discovered_modlists_info = []
try:
# 1. Get ALL non-Steam shortcuts from Protontricks
# Now calls the renamed method without filtering
protontricks_shortcuts = self.protontricks_handler.list_non_steam_shortcuts()
if not protontricks_shortcuts:
self.logger.warning("Protontricks did not list any non-Steam shortcuts.")
return []
self.logger.debug(f"Protontricks non-Steam shortcuts found: {protontricks_shortcuts}")
# 2. Get shortcuts pointing to the executable from shortcuts.vdf
# Get shortcuts pointing to the executable from shortcuts.vdf
matching_vdf_shortcuts = self.shortcut_handler.find_shortcuts_by_exe(executable_name)
if not matching_vdf_shortcuts:
self.logger.debug(f"No shortcuts found pointing to '{executable_name}' in shortcuts.vdf.")
return []
self.logger.debug(f"Shortcuts matching executable '{executable_name}' in VDF: {matching_vdf_shortcuts}")
# 3. Correlate the two lists and extract required info
# Process each matching shortcut and convert signed AppID to unsigned
for vdf_shortcut in matching_vdf_shortcuts:
app_name = vdf_shortcut.get('AppName')
start_dir = vdf_shortcut.get('StartDir')
signed_appid = vdf_shortcut.get('appid')
if not app_name or not start_dir:
self.logger.warning(f"Skipping VDF shortcut due to missing AppName or StartDir: {vdf_shortcut}")
continue
if app_name in protontricks_shortcuts:
app_id = protontricks_shortcuts[app_name]
# Append dictionary with all necessary info
modlist_info = {
'name': app_name,
'appid': app_id,
'path': start_dir
}
discovered_modlists_info.append(modlist_info)
self.logger.info(f"Validated shortcut: '{app_name}' (AppID: {app_id}, Path: {start_dir})")
if signed_appid is None:
self.logger.warning(f"Skipping VDF shortcut due to missing appid: {vdf_shortcut}")
continue
# Convert signed AppID to unsigned AppID (the format used by Steam prefixes)
if signed_appid < 0:
unsigned_appid = signed_appid + (2**32)
else:
# Downgraded from WARNING to INFO
self.logger.info(f"Shortcut '{app_name}' found in VDF but not listed by protontricks. Skipping.")
unsigned_appid = signed_appid
# Append dictionary with all necessary info using unsigned AppID
modlist_info = {
'name': app_name,
'appid': unsigned_appid,
'path': start_dir
}
discovered_modlists_info.append(modlist_info)
self.logger.info(f"Discovered shortcut: '{app_name}' (Signed: {signed_appid} → Unsigned: {unsigned_appid}, Path: {start_dir})")
except Exception as e:
self.logger.error(f"Error discovering executable shortcuts: {e}", exc_info=True)
@@ -315,13 +325,22 @@ class ModlistHandler:
self.modlist_dir = Path(modlist_dir_path_str)
self.modlist_ini = modlist_ini_path
# Determine if modlist is on SD card
# Use str() for startswith check
if str(self.modlist_dir).startswith("/run/media") or str(self.modlist_dir).startswith("/media"):
# Determine if modlist is on SD card (Steam Deck only)
# On non-Steam Deck systems, /media mounts should use Z: drive, not D: drive
is_on_sdcard_path = str(self.modlist_dir).startswith("/run/media") or str(self.modlist_dir).startswith("/media")
# Log SD card detection for debugging
self.logger.debug(f"SD card detection: modlist_dir={self.modlist_dir}, is_sdcard_path={is_on_sdcard_path}, steamdeck={self.steamdeck}")
if is_on_sdcard_path and self.steamdeck:
self.modlist_sdcard = True
self.logger.info("Modlist appears to be on an SD card.")
self.logger.info("Modlist appears to be on an SD card (Steam Deck).")
self.logger.debug(f"Set modlist_sdcard=True")
else:
self.modlist_sdcard = False
self.logger.debug(f"Set modlist_sdcard=False (is_on_sdcard_path={is_on_sdcard_path}, steamdeck={self.steamdeck})")
if is_on_sdcard_path and not self.steamdeck:
self.logger.info("Modlist on /media mount detected on non-Steam Deck system - using Z: drive mapping.")
# Find and set compatdata path now that we have appid
# Ensure PathHandler is available (should be initialized in __init__)
@@ -345,7 +364,8 @@ class ModlistHandler:
# Store engine_installed flag for conditional path manipulation
self.engine_installed = modlist_info.get('engine_installed', False)
self.logger.debug(f" Engine Installed: {self.engine_installed}")
# Call internal detection methods to populate more state
if not self._detect_game_variables():
self.logger.warning("Failed to auto-detect game type after setting context.")
@@ -551,15 +571,19 @@ class ModlistHandler:
status_callback (callable, optional): A function to call with status updates during configuration.
manual_steps_completed (bool): If True, skip the manual steps prompt (used for new modlist flow).
"""
# Store status_callback for Configuration Summary
self._current_status_callback = status_callback
self.logger.info("Executing configuration steps...")
# Ensure required context is set
if not all([self.modlist_dir, self.appid, self.game_var, self.steamdeck is not None]):
self.logger.error("Cannot execute configuration steps: Missing required context (modlist_dir, appid, game_var, steamdeck status).")
print("Error: Missing required information to start configuration.")
try:
# Store status_callback for Configuration Summary
self._current_status_callback = status_callback
self.logger.info("Executing configuration steps...")
# Ensure required context is set
if not all([self.modlist_dir, self.appid, self.game_var, self.steamdeck is not None]):
self.logger.error("Cannot execute configuration steps: Missing required context (modlist_dir, appid, game_var, steamdeck status).")
print("Error: Missing required information to start configuration.")
return False
except Exception as e:
self.logger.error(f"Exception in _execute_configuration_steps initialization: {e}", exc_info=True)
return False
# Step 1: Set protontricks permissions
@@ -685,23 +709,115 @@ class ModlistHandler:
# All modlists now use their own AppID for wine components
target_appid = self.appid
if not self.protontricks_handler.install_wine_components(target_appid, self.game_var_full, specific_components=components):
self.logger.error("Failed to install Wine components. Configuration aborted.")
# Use user's preferred component installation method (respects settings toggle)
self.logger.debug(f"Getting WINEPREFIX for AppID {target_appid}...")
wineprefix = self.protontricks_handler.get_wine_prefix_path(target_appid)
if not wineprefix:
self.logger.error("Failed to get WINEPREFIX path for component installation.")
print("Error: Could not determine wine prefix location.")
return False
self.logger.debug(f"WINEPREFIX obtained: {wineprefix}")
# Use the winetricks handler which respects the user's toggle setting
try:
self.logger.info("Installing Wine components using user's preferred method...")
self.logger.debug(f"Calling winetricks_handler.install_wine_components with wineprefix={wineprefix}, game_var={self.game_var_full}, components={components}")
success = self.winetricks_handler.install_wine_components(wineprefix, self.game_var_full, specific_components=components, status_callback=status_callback)
if success:
self.logger.info("Wine component installation completed successfully")
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Wine components verified and installed successfully")
else:
self.logger.error("Wine component installation failed")
print("Error: Failed to install necessary Wine components.")
return False
except Exception as e:
self.logger.error(f"Wine component installation failed with exception: {e}")
print("Error: Failed to install necessary Wine components.")
return False # Abort on failure
return False
self.logger.info("Step 4: Installing Wine components... Done")
# Step 5: Ensure permissions of Modlist directory
# Step 4.5: Apply universal dotnet4.x compatibility registry fixes AFTER wine components
# This ensures the fixes are not overwritten by component installation processes
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Setting ownership and permissions for modlist directory")
self.logger.info("Step 5: Setting ownership and permissions for modlist directory...")
status_callback(f"{self._get_progress_timestamp()} Applying universal dotnet4.x compatibility fixes")
self.logger.info("Step 4.5: Applying universal dotnet4.x compatibility registry fixes...")
registry_success = False
try:
registry_success = self._apply_universal_dotnet_fixes()
except Exception as e:
error_msg = f"CRITICAL: Registry fixes failed - modlist may have .NET compatibility issues: {e}"
self.logger.error(error_msg)
if status_callback:
status_callback(f"{self._get_progress_timestamp()} ERROR: {error_msg}")
registry_success = False
if not registry_success:
failure_msg = "WARNING: Universal dotnet4.x registry fixes FAILED! This modlist may experience .NET Framework compatibility issues."
self.logger.error("=" * 80)
self.logger.error(failure_msg)
self.logger.error("Consider manually setting mscoree=native in winecfg if problems occur.")
self.logger.error("=" * 80)
if status_callback:
status_callback(f"{self._get_progress_timestamp()} {failure_msg}")
# Continue but user should be aware of potential issues
# Step 4.6: Enable dotfiles visibility for Wine prefix
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Enabling dotfiles visibility")
self.logger.info("Step 4.6: Enabling dotfiles visibility in Wine prefix...")
try:
if self.protontricks_handler.enable_dotfiles(self.appid):
self.logger.info("Dotfiles visibility enabled successfully")
else:
self.logger.warning("Failed to enable dotfiles visibility (non-critical, continuing)")
except Exception as e:
self.logger.warning(f"Error enabling dotfiles visibility: {e} (non-critical, continuing)")
self.logger.info("Step 4.6: Enabling dotfiles visibility... Done")
# Step 4.7: Create Wine prefix Documents directories for USVFS
# This is critical for USVFS to virtualize profile INI files on first launch
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Creating Wine prefix Documents directories for USVFS")
self.logger.info("Step 4.7: Creating Wine prefix Documents directories for USVFS...")
try:
if self.appid and self.game_var:
# Map game_var to game_name for create_required_dirs
game_name_map = {
"skyrimspecialedition": "skyrimse",
"fallout4": "fallout4",
"falloutnv": "falloutnv",
"oblivion": "oblivion",
"enderalspecialedition": "enderalse"
}
game_name = game_name_map.get(self.game_var.lower(), None)
if game_name:
appid_str = str(self.appid)
if self.filesystem_handler.create_required_dirs(game_name, appid_str):
self.logger.info("Wine prefix Documents directories created successfully for USVFS")
else:
self.logger.warning("Failed to create Wine prefix Documents directories (non-critical, continuing)")
else:
self.logger.debug(f"Game {self.game_var} not in directory creation map, skipping")
else:
self.logger.warning("AppID or game_var not available, skipping Wine prefix Documents directory creation")
except Exception as e:
self.logger.warning(f"Error creating Wine prefix Documents directories: {e} (non-critical, continuing)")
self.logger.info("Step 4.7: Creating Wine prefix Documents directories... Done")
# Step 5: Verify ownership of Modlist directory
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Verifying modlist directory ownership")
self.logger.info("Step 5: Verifying ownership of modlist directory...")
# Convert modlist_dir string to Path object for the method
modlist_path_obj = Path(self.modlist_dir)
if not self.filesystem_handler.set_ownership_and_permissions_sudo(modlist_path_obj):
self.logger.error("Failed to set ownership/permissions for modlist directory. Configuration aborted.")
print("Error: Failed to set permissions for the modlist directory.")
success, error_msg = self.filesystem_handler.verify_ownership_and_permissions(modlist_path_obj)
if not success:
self.logger.error("Ownership verification failed for modlist directory. Configuration aborted.")
print(f"\n{COLOR_ERROR}{error_msg}{COLOR_RESET}")
return False # Abort on failure
self.logger.info("Step 5: Setting ownership and permissions... Done")
self.logger.info("Step 5: Ownership verification... Done")
# Step 6: Backup ModOrganizer.ini
if status_callback:
@@ -716,6 +832,14 @@ class ModlistHandler:
self.logger.info(f"ModOrganizer.ini backed up to: {backup_path}")
self.logger.info("Step 6: Backing up ModOrganizer.ini... Done")
# Step 6.5: Handle symlinked downloads directory
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Checking for symlinked downloads directory")
self.logger.info("Step 6.5: Checking for symlinked downloads directory...")
if not self._handle_symlinked_downloads():
self.logger.warning("Warning during symlink handling (non-critical)")
self.logger.info("Step 6.5: Checking for symlinked downloads directory... Done")
# Step 7a: Detect Stock Game/Game Root path
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Detecting stock game path")
@@ -758,9 +882,19 @@ class ModlistHandler:
self.logger.info("No stock game path found, skipping gamePath update - edit_binary_working_paths will handle all path updates.")
self.logger.info("Using unified path manipulation to avoid duplicate processing.")
# Conditionally update binary and working directory paths
# Conditionally update binary and working directory paths
# Skip for jackify-engine workflows since paths are already correct
if not getattr(self, 'engine_installed', False):
# Exception: Always run for SD card installs to fix Z:/run/media/... to D:/... paths
# DEBUG: Add comprehensive logging to identify Steam Deck SD card path manipulation issues
engine_installed = getattr(self, 'engine_installed', False)
self.logger.debug(f"[SD_CARD_DEBUG] ModlistHandler instance: id={id(self)}")
self.logger.debug(f"[SD_CARD_DEBUG] engine_installed: {engine_installed}")
self.logger.debug(f"[SD_CARD_DEBUG] modlist_sdcard: {self.modlist_sdcard}")
self.logger.debug(f"[SD_CARD_DEBUG] steamdeck parameter passed to constructor: {getattr(self, 'steamdeck', 'NOT_SET')}")
self.logger.debug(f"[SD_CARD_DEBUG] Path manipulation condition: not {engine_installed} or {self.modlist_sdcard} = {not engine_installed or self.modlist_sdcard}")
if not getattr(self, 'engine_installed', False) or self.modlist_sdcard:
# Convert steamapps/common path to library root path
steam_libraries = None
if self.steam_library:
@@ -779,7 +913,8 @@ class ModlistHandler:
print("Error: Failed to update binary and working directory paths in ModOrganizer.ini.")
return False # Abort on failure
else:
self.logger.debug("Skipping path manipulation - jackify-engine already set correct paths in ModOrganizer.ini")
self.logger.debug("[SD_CARD_DEBUG] Skipping path manipulation - jackify-engine already set correct paths in ModOrganizer.ini")
self.logger.debug(f"[SD_CARD_DEBUG] SKIPPED because: engine_installed={engine_installed} and modlist_sdcard={self.modlist_sdcard}")
self.logger.info("Step 8: Updating ModOrganizer.ini paths... Done")
# Step 9: Update Resolution Settings (if applicable)
@@ -792,10 +927,10 @@ class ModlistHandler:
vanilla_game_dir = None
if self.steam_library and self.game_var_full:
vanilla_game_dir = str(Path(self.steam_library) / "steamapps" / "common" / self.game_var_full)
if not self.resolution_handler.update_ini_resolution(
modlist_dir=self.modlist_dir,
game_var=self.game_var_full,
if not ResolutionHandler.update_ini_resolution(
modlist_dir=self.modlist_dir,
game_var=self.game_var_full,
set_res=self.selected_resolution,
vanilla_game_dir=vanilla_game_dir
):
@@ -829,33 +964,59 @@ class ModlistHandler:
if self.steam_library and self.game_var_full:
vanilla_game_dir = str(Path(self.steam_library) / "steamapps" / "common" / self.game_var_full)
if not self.path_handler.create_dxvk_conf(
dxvk_created = self.path_handler.create_dxvk_conf(
modlist_dir=self.modlist_dir,
modlist_sdcard=self.modlist_sdcard,
steam_library=str(self.steam_library) if self.steam_library else None, # Pass as string or None
basegame_sdcard=self.basegame_sdcard,
game_var_full=self.game_var_full,
vanilla_game_dir=vanilla_game_dir
):
self.logger.warning("Failed to create dxvk.conf file.")
print("Warning: Failed to create dxvk.conf file.")
vanilla_game_dir=vanilla_game_dir,
stock_game_path=self.stock_game_path
)
dxvk_verified = self.path_handler.verify_dxvk_conf_exists(
modlist_dir=self.modlist_dir,
steam_library=str(self.steam_library) if self.steam_library else None,
game_var_full=self.game_var_full,
vanilla_game_dir=vanilla_game_dir,
stock_game_path=self.stock_game_path
)
if not dxvk_created or not dxvk_verified:
self.logger.warning("DXVK configuration file is missing or incomplete after post-install steps.")
print("Warning: Failed to verify dxvk.conf file (required for AMD GPUs).")
self.logger.info("Step 10: Creating dxvk.conf... Done")
# Step 11a: Small Tasks - Delete Plugin
# Step 11a: Small Tasks - Delete Incompatible Plugins
if status_callback:
status_callback(f"{self._get_progress_timestamp()} Deleting incompatible MO2 plugin")
self.logger.info("Step 11a: Deleting incompatible MO2 plugin (FixGameRegKey.py)...")
plugin_path = Path(self.modlist_dir) / "plugins" / "FixGameRegKey.py"
if plugin_path.exists():
status_callback(f"{self._get_progress_timestamp()} Deleting incompatible MO2 plugins")
self.logger.info("Step 11a: Deleting incompatible MO2 plugins...")
# Delete FixGameRegKey.py plugin
fixgamereg_path = Path(self.modlist_dir) / "plugins" / "FixGameRegKey.py"
if fixgamereg_path.exists():
try:
plugin_path.unlink()
fixgamereg_path.unlink()
self.logger.info("FixGameRegKey.py plugin deleted successfully.")
except Exception as e:
self.logger.warning(f"Failed to delete FixGameRegKey.py plugin: {e}")
print("Warning: Failed to delete incompatible plugin file.")
print("Warning: Failed to delete FixGameRegKey.py plugin file.")
else:
self.logger.debug("FixGameRegKey.py plugin not found (this is normal).")
self.logger.info("Step 11a: Plugin deletion check complete.")
# Delete PageFileManager plugin directory (Linux has no PageFile)
pagefilemgr_path = Path(self.modlist_dir) / "plugins" / "PageFileManager"
if pagefilemgr_path.exists():
try:
import shutil
shutil.rmtree(pagefilemgr_path)
self.logger.info("PageFileManager plugin directory deleted successfully.")
except Exception as e:
self.logger.warning(f"Failed to delete PageFileManager plugin directory: {e}")
print("Warning: Failed to delete PageFileManager plugin directory.")
else:
self.logger.debug("PageFileManager plugin not found (this is normal).")
self.logger.info("Step 11a: Incompatible plugin deletion check complete.")
# Step 11b: Download Font
if status_callback:
@@ -863,7 +1024,7 @@ class ModlistHandler:
prefix_path_str = self.path_handler.find_compat_data(str(self.appid))
if prefix_path_str:
prefix_path = Path(prefix_path_str)
fonts_dir = prefix_path / "drive_c" / "windows" / "Fonts"
fonts_dir = prefix_path / "pfx" / "drive_c" / "windows" / "Fonts"
font_url = "https://github.com/mrbvrz/segoe-ui-linux/raw/refs/heads/master/font/seguisym.ttf"
font_dest_path = fonts_dir / "seguisym.ttf"
@@ -898,6 +1059,10 @@ class ModlistHandler:
# status_callback("Configuration completed successfully!")
self.logger.info("Configuration steps completed successfully.")
# Step 14: Re-enforce Windows 10 mode after modlist-specific configurations (matches legacy script line 1333)
self._re_enforce_windows_10_mode()
return True # Return True on success
def _detect_steam_library_info(self) -> bool:
@@ -1163,7 +1328,7 @@ class ModlistHandler:
# Determine game type
game = (game_var_full or modlist_name or "").lower().replace(" ", "")
# Add game-specific extras
if "skyrim" in game or "fallout4" in game or "starfield" in game or "oblivion_remastered" in game:
if "skyrim" in game or "fallout4" in game or "starfield" in game or "oblivion_remastered" in game or "enderal" in game:
extras += ["d3dcompiler_47", "d3dx11_43", "d3dcompiler_43", "dotnet6", "dotnet7"]
elif "falloutnewvegas" in game or "fnv" in game or "oblivion" in game:
extras += ["d3dx9_43", "d3dx9"]
@@ -1238,6 +1403,12 @@ class ModlistHandler:
# Check ModOrganizer.ini for indicators (nvse/enderal) as an early, robust signal
try:
mo2_ini = modlist_path / "ModOrganizer.ini"
# Also check Somnium's non-standard location
if not mo2_ini.exists():
somnium_mo2_ini = modlist_path / "files" / "ModOrganizer.ini"
if somnium_mo2_ini.exists():
mo2_ini = somnium_mo2_ini
if mo2_ini.exists():
try:
content = mo2_ini.read_text(errors='ignore').lower()
@@ -1294,4 +1465,327 @@ class ModlistHandler:
self.logger.debug("No special game type detected - standard workflow will be used")
return None
# (Ensure EOF is clean and no extra incorrect methods exist below)
def _re_enforce_windows_10_mode(self):
"""
Re-enforce Windows 10 mode after modlist-specific configurations.
This matches the legacy script behavior (line 1333) where Windows 10 mode
is re-applied after modlist-specific steps to ensure consistency.
"""
try:
if not hasattr(self, 'appid') or not self.appid:
self.logger.warning("Cannot re-enforce Windows 10 mode - no AppID available")
return
from ..handlers.winetricks_handler import WinetricksHandler
from ..handlers.path_handler import PathHandler
# Get prefix path for the AppID
prefix_path = PathHandler.find_compat_data(str(self.appid))
if not prefix_path:
self.logger.warning("Cannot re-enforce Windows 10 mode - prefix path not found")
return
# Use winetricks handler to set Windows 10 mode
winetricks_handler = WinetricksHandler()
wine_binary = winetricks_handler._get_wine_binary_for_prefix(str(prefix_path))
if not wine_binary:
self.logger.warning("Cannot re-enforce Windows 10 mode - wine binary not found")
return
winetricks_handler._set_windows_10_mode(str(prefix_path), wine_binary)
self.logger.info("Windows 10 mode re-enforced after modlist-specific configurations")
except Exception as e:
self.logger.warning(f"Error re-enforcing Windows 10 mode: {e}")
def _handle_symlinked_downloads(self) -> bool:
"""
Check if downloads_directory in ModOrganizer.ini points to a symlink.
If it does, comment out the line to force MO2 to use default behavior.
Returns:
bool: True on success or no action needed, False on error
"""
try:
import configparser
import os
if not self.modlist_ini or not os.path.exists(self.modlist_ini):
self.logger.warning("ModOrganizer.ini not found for symlink check")
return True # Non-critical
# Read the INI file
config = configparser.ConfigParser(allow_no_value=True, delimiters=['='])
config.optionxform = str # Preserve case sensitivity
try:
# Read file manually to handle BOM
with open(self.modlist_ini, 'r', encoding='utf-8-sig') as f:
config.read_file(f)
except UnicodeDecodeError:
with open(self.modlist_ini, 'r', encoding='latin-1') as f:
config.read_file(f)
# Check if downloads_directory or download_directory exists and is a symlink
downloads_key = None
downloads_path = None
if 'General' in config:
# Check for both possible key names
if 'downloads_directory' in config['General']:
downloads_key = 'downloads_directory'
downloads_path = config['General']['downloads_directory']
elif 'download_directory' in config['General']:
downloads_key = 'download_directory'
downloads_path = config['General']['download_directory']
if downloads_path:
if os.path.exists(downloads_path):
# Check if the path or any parent directory contains symlinks
def has_symlink_in_path(path):
"""Check if path or any parent directory is a symlink"""
check_path = Path(path)
# Walk up the path checking each component
for parent in [check_path] + list(check_path.parents):
if parent.is_symlink():
return True, str(parent)
return False, None
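# Illustrative (example paths): if /home/user/Games is a symlink to /mnt/storage/Games,
# has_symlink_in_path("/home/user/Games/Modlist/downloads") returns
# (True, "/home/user/Games") - the first symlinked component found walking up the path.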
has_symlink, symlink_path = has_symlink_in_path(downloads_path)
if has_symlink:
self.logger.info(f"Detected symlink in downloads directory path: {symlink_path} -> {downloads_path}")
self.logger.info("Commenting out downloads_directory to avoid Wine symlink issues")
# Read the file manually to preserve comments and formatting
with open(self.modlist_ini, 'r', encoding='utf-8') as f:
lines = f.readlines()
# Find and comment out the downloads directory line
modified = False
for i, line in enumerate(lines):
if line.strip().startswith(f'{downloads_key}='):
lines[i] = '#' + line # Comment out the line
modified = True
break
if modified:
# Write the modified file back
with open(self.modlist_ini, 'w', encoding='utf-8') as f:
f.writelines(lines)
self.logger.info(f"{downloads_key} line commented out successfully")
else:
self.logger.warning("downloads_directory line not found in file")
else:
self.logger.debug(f"downloads_directory is not a symlink: {downloads_path}")
else:
self.logger.debug("downloads_directory path does not exist or is empty")
else:
self.logger.debug("No downloads_directory found in ModOrganizer.ini")
return True
except Exception as e:
self.logger.error(f"Error handling symlinked downloads: {e}", exc_info=True)
return False
def _apply_universal_dotnet_fixes(self):
"""
Apply universal dotnet4.x compatibility registry fixes to ALL modlists.
Now called AFTER wine component installation to prevent overwrites.
Includes wineserver shutdown/flush to ensure persistence.
"""
try:
prefix_path = os.path.join(str(self.compat_data_path), "pfx")
if not os.path.exists(prefix_path):
self.logger.warning(f"Prefix path not found: {prefix_path}")
return False
self.logger.info("Applying universal dotnet4.x compatibility registry fixes (post-component installation)...")
# Find the appropriate Wine binary to use for registry operations
wine_binary = self._find_wine_binary_for_registry()
if not wine_binary:
self.logger.error("Could not find Wine binary for registry operations")
return False
# Find wineserver binary for flushing registry changes
wine_dir = os.path.dirname(wine_binary)
wineserver_binary = os.path.join(wine_dir, 'wineserver')
if not os.path.exists(wineserver_binary):
self.logger.warning(f"wineserver not found at {wineserver_binary}, registry flush may not work")
wineserver_binary = None
# Set environment for Wine registry operations
env = os.environ.copy()
env['WINEPREFIX'] = prefix_path
env['WINEDEBUG'] = '-all' # Suppress Wine debug output
# Shutdown any running wineserver processes to ensure clean slate
if wineserver_binary:
self.logger.debug("Shutting down wineserver before applying registry fixes...")
try:
subprocess.run([wineserver_binary, '-w'], env=env, timeout=30, capture_output=True)
self.logger.debug("Wineserver shutdown complete")
except Exception as e:
self.logger.warning(f"Wineserver shutdown failed (non-critical): {e}")
# Registry fix 1: Set *mscoree=native DLL override (asterisk for full override)
# This tells Wine to use native .NET runtime instead of Wine's implementation
self.logger.debug("Setting *mscoree=native DLL override...")
cmd1 = [
wine_binary, 'reg', 'add',
'HKEY_CURRENT_USER\\Software\\Wine\\DllOverrides',
'/v', '*mscoree', '/t', 'REG_SZ', '/d', 'native', '/f'
]
result1 = subprocess.run(cmd1, env=env, capture_output=True, text=True, errors='replace', timeout=30)
self.logger.info(f"*mscoree registry command result: returncode={result1.returncode}, stdout={result1.stdout[:200]}, stderr={result1.stderr[:200]}")
if result1.returncode == 0:
self.logger.info("Successfully applied *mscoree=native DLL override")
else:
self.logger.error(f"Failed to set *mscoree DLL override: returncode={result1.returncode}, stderr={result1.stderr}")
# Registry fix 2: Set OnlyUseLatestCLR=1
# This prevents .NET version conflicts by using the latest CLR
self.logger.debug("Setting OnlyUseLatestCLR=1 registry entry...")
cmd2 = [
wine_binary, 'reg', 'add',
'HKEY_LOCAL_MACHINE\\Software\\Microsoft\\.NETFramework',
'/v', 'OnlyUseLatestCLR', '/t', 'REG_DWORD', '/d', '1', '/f'
]
result2 = subprocess.run(cmd2, env=env, capture_output=True, text=True, errors='replace', timeout=30)
self.logger.info(f"OnlyUseLatestCLR registry command result: returncode={result2.returncode}, stdout={result2.stdout[:200]}, stderr={result2.stderr[:200]}")
if result2.returncode == 0:
self.logger.info("Successfully applied OnlyUseLatestCLR=1 registry entry")
else:
self.logger.error(f"Failed to set OnlyUseLatestCLR: returncode={result2.returncode}, stderr={result2.stderr}")
# Force wineserver to flush registry changes to disk
if wineserver_binary:
self.logger.debug("Flushing registry changes to disk via wineserver shutdown...")
try:
subprocess.run([wineserver_binary, '-w'], env=env, timeout=30, capture_output=True)
self.logger.debug("Registry changes flushed to disk")
except Exception as e:
self.logger.warning(f"Registry flush failed (non-critical): {e}")
# VERIFICATION: Confirm the registry entries persisted
self.logger.info("Verifying registry entries were applied and persisted...")
verification_passed = True
# Verify *mscoree=native
verify_cmd1 = [
wine_binary, 'reg', 'query',
'HKEY_CURRENT_USER\\Software\\Wine\\DllOverrides',
'/v', '*mscoree'
]
verify_result1 = subprocess.run(verify_cmd1, env=env, capture_output=True, text=True, errors='replace', timeout=30)
if verify_result1.returncode == 0 and 'native' in verify_result1.stdout:
self.logger.info("VERIFIED: *mscoree=native is set correctly")
else:
self.logger.error(f"VERIFICATION FAILED: *mscoree=native not found in registry. Query output: {verify_result1.stdout}")
verification_passed = False
# Verify OnlyUseLatestCLR=1
verify_cmd2 = [
wine_binary, 'reg', 'query',
'HKEY_LOCAL_MACHINE\\Software\\Microsoft\\.NETFramework',
'/v', 'OnlyUseLatestCLR'
]
verify_result2 = subprocess.run(verify_cmd2, env=env, capture_output=True, text=True, errors='replace', timeout=30)
if verify_result2.returncode == 0 and ('0x1' in verify_result2.stdout or 'REG_DWORD' in verify_result2.stdout):
self.logger.info("VERIFIED: OnlyUseLatestCLR=1 is set correctly")
else:
self.logger.error(f"VERIFICATION FAILED: OnlyUseLatestCLR=1 not found in registry. Query output: {verify_result2.stdout}")
verification_passed = False
# Both fixes applied and verified
if result1.returncode == 0 and result2.returncode == 0 and verification_passed:
self.logger.info("Universal dotnet4.x compatibility fixes applied, flushed, and verified successfully")
return True
else:
self.logger.error("Registry fixes failed verification - fixes may not persist across prefix restarts")
return False
except Exception as e:
self.logger.error(f"Failed to apply universal dotnet4.x fixes: {e}")
return False
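# Illustrative sketch (not part of the diff): the two registry fixes above, reduced to a
# standalone helper. The wine binary and prefix paths are placeholder assumptions.
import os
import subprocess

def apply_dotnet_registry_fixes(wine_binary: str, prefix_path: str) -> bool:
    """Apply *mscoree=native and OnlyUseLatestCLR=1 to a Wine prefix."""
    env = os.environ.copy()
    env['WINEPREFIX'] = prefix_path
    env['WINEDEBUG'] = '-all'
    fixes = [
        # Use the native .NET runtime instead of Wine's mscoree implementation
        ['reg', 'add', 'HKEY_CURRENT_USER\\Software\\Wine\\DllOverrides',
         '/v', '*mscoree', '/t', 'REG_SZ', '/d', 'native', '/f'],
        # Always use the latest installed CLR to avoid .NET version conflicts
        ['reg', 'add', 'HKEY_LOCAL_MACHINE\\Software\\Microsoft\\.NETFramework',
         '/v', 'OnlyUseLatestCLR', '/t', 'REG_DWORD', '/d', '1', '/f'],
    ]
    for args in fixes:
        result = subprocess.run([wine_binary] + args, env=env,
                                capture_output=True, text=True, timeout=30)
        if result.returncode != 0:
            return False
    return True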
def _find_wine_binary_for_registry(self) -> Optional[str]:
"""Find wine binary from Install Proton path"""
try:
# Use Install Proton from config (used by jackify-engine)
from ..handlers.config_handler import ConfigHandler
config_handler = ConfigHandler()
proton_path = config_handler.get_proton_path()
if proton_path:
proton_path = Path(proton_path).expanduser()
# Check both GE-Proton and Valve Proton structures
wine_candidates = [
proton_path / "files" / "bin" / "wine", # GE-Proton
proton_path / "dist" / "bin" / "wine" # Valve Proton
]
for wine_bin in wine_candidates:
if wine_bin.exists() and wine_bin.is_file():
return str(wine_bin)
# Fallback: use best detected Proton
from ..handlers.wine_utils import WineUtils
best_proton = WineUtils.select_best_proton()
if best_proton:
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
if wine_binary:
return wine_binary
return None
except Exception as e:
self.logger.error(f"Error finding Wine binary: {e}")
return None
def _search_wine_in_proton_directory(self, proton_path: Path) -> Optional[str]:
"""
Recursively search for wine binary within a Proton directory.
This handles cases where the directory structure might differ between Proton versions.
Args:
proton_path: Path to the Proton directory to search
Returns:
Path to wine binary if found, None otherwise
"""
try:
if not proton_path.exists() or not proton_path.is_dir():
return None
# Search for 'wine' executable (not 'wine64' or 'wine-preloader')
# Limit search depth to avoid scanning entire filesystem
max_depth = 5
for root, dirs, files in os.walk(proton_path, followlinks=False):
# Calculate depth relative to proton_path
depth = len(Path(root).relative_to(proton_path).parts)
if depth > max_depth:
dirs.clear() # Don't descend further
continue
# Check if 'wine' is in this directory
if 'wine' in files:
wine_path = Path(root) / 'wine'
# Verify it's actually an executable file
if wine_path.is_file() and os.access(wine_path, os.X_OK):
self.logger.debug(f"Found wine binary at: {wine_path}")
return str(wine_path)
return None
except Exception as e:
self.logger.debug(f"Error during recursive wine search in {proton_path}: {e}")
return None

View File

@@ -48,10 +48,11 @@ logger = logging.getLogger(__name__) # Standard logger init
# Helper function to get path to jackify-install-engine
def get_jackify_engine_path():
if getattr(sys, 'frozen', False) and hasattr(sys, '_MEIPASS'):
# Running in a PyInstaller bundle
# Engine is expected at <bundle_root>/jackify/engine/jackify-engine
return os.path.join(sys._MEIPASS, 'jackify', 'engine', 'jackify-engine')
appdir = os.environ.get('APPDIR')
if appdir:
# Running inside AppImage
# Engine is expected at <appdir>/opt/jackify/engine/jackify-engine
return os.path.join(appdir, 'opt', 'jackify', 'engine', 'jackify-engine')
else:
# Running in a normal Python environment from source
# Current file is in src/jackify/backend/handlers/modlist_install_cli.py
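# Hypothetical sketch of the path resolution above (the source-tree branch is truncated in
# this hunk, so the layout assumed here is an illustration, not the exact implementation):
import os

def resolve_engine_path(source_file: str) -> str:
    appdir = os.environ.get('APPDIR')
    if appdir:
        # Packaged AppImage layout: <appdir>/opt/jackify/engine/jackify-engine
        return os.path.join(appdir, 'opt', 'jackify', 'engine', 'jackify-engine')
    # Assumed source layout: src/jackify/engine/jackify-engine, reached from
    # src/jackify/backend/handlers/<this file>
    handlers_dir = os.path.dirname(os.path.abspath(source_file))
    jackify_root = os.path.dirname(os.path.dirname(handlers_dir))
    return os.path.join(jackify_root, 'engine', 'jackify-engine')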
@@ -68,7 +69,7 @@ class ModlistInstallCLI:
def __init__(self, menu_handler: MenuHandler, steamdeck: bool = False):
self.menu_handler = menu_handler
self.steamdeck = steamdeck
self.protontricks_handler = ProtontricksHandler(steamdeck=steamdeck)
self.protontricks_handler = ProtontricksHandler(steamdeck)
self.shortcut_handler = ShortcutHandler(steamdeck=steamdeck)
self.context = {}
# Use standard logging (no file handler)
@@ -408,51 +409,78 @@ class ModlistInstallCLI:
self.context['download_dir'] = download_dir_path
self.logger.debug(f"Download directory context set to: {self.context['download_dir']}")
# 5. Prompt for Nexus API key (skip if in context)
# 5. Get Nexus authentication (OAuth or API key)
if 'nexus_api_key' not in self.context:
from jackify.backend.services.api_key_service import APIKeyService
api_key_service = APIKeyService()
saved_key = api_key_service.get_saved_api_key()
api_key = None
if saved_key:
print("\n" + "-" * 28)
print(f"{COLOR_INFO}A Nexus API Key is already saved.{COLOR_RESET}")
use_saved = input(f"{COLOR_PROMPT}Use the saved API key? [Y/n]: {COLOR_RESET}").strip().lower()
if use_saved in ('', 'y', 'yes'):
api_key = saved_key
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
# Get current auth status
authenticated, method, username = auth_service.get_auth_status()
if authenticated:
# Already authenticated - use existing auth
if method == 'oauth':
print("\n" + "-" * 28)
print(f"{COLOR_SUCCESS}Nexus Authentication: Authorized via OAuth{COLOR_RESET}")
if username:
print(f"{COLOR_INFO}Logged in as: {username}{COLOR_RESET}")
elif method == 'api_key':
print("\n" + "-" * 28)
print(f"{COLOR_INFO}Nexus Authentication: Using API Key (Legacy){COLOR_RESET}")
# Get valid token/key and OAuth state for engine auto-refresh
api_key, oauth_info = auth_service.get_auth_for_engine()
if api_key:
self.context['nexus_api_key'] = api_key
self.context['nexus_oauth_info'] = oauth_info # For engine auto-refresh
else:
new_key = input(f"{COLOR_PROMPT}Enter a new Nexus API Key (or press Enter to keep the saved one): {COLOR_RESET}").strip()
if new_key:
api_key = new_key
replace = input(f"{COLOR_PROMPT}Replace the saved key with this one? [y/N]: {COLOR_RESET}").strip().lower()
if replace == 'y':
if api_key_service.save_api_key(api_key):
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
else:
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
# Auth expired or invalid - prompt to set up
print(f"\n{COLOR_WARNING}Your authentication has expired or is invalid.{COLOR_RESET}")
authenticated = False
if not authenticated:
# Not authenticated - offer to set up OAuth
print("\n" + "-" * 28)
print(f"{COLOR_WARNING}Nexus Mods authentication is required for downloading mods.{COLOR_RESET}")
print(f"\n{COLOR_PROMPT}Would you like to authorize with Nexus now?{COLOR_RESET}")
print(f"{COLOR_INFO}This will open your browser for secure OAuth authorization.{COLOR_RESET}")
authorize = input(f"{COLOR_PROMPT}Authorize now? [Y/n]: {COLOR_RESET}").strip().lower()
if authorize in ('', 'y', 'yes'):
# Launch OAuth authorization
print(f"\n{COLOR_INFO}Starting OAuth authorization...{COLOR_RESET}")
print(f"{COLOR_WARNING}Your browser will open shortly.{COLOR_RESET}")
print(f"{COLOR_INFO}Note: Your browser may ask permission to open 'xdg-open' or{COLOR_RESET}")
print(f"{COLOR_INFO}Jackify's protocol handler - please click 'Open' or 'Allow'.{COLOR_RESET}")
def show_message(msg):
print(f"\n{COLOR_INFO}{msg}{COLOR_RESET}")
success = auth_service.authorize_oauth(show_browser_message_callback=show_message)
if success:
print(f"\n{COLOR_SUCCESS}OAuth authorization successful!{COLOR_RESET}")
_, _, username = auth_service.get_auth_status()
if username:
print(f"{COLOR_INFO}Authorized as: {username}{COLOR_RESET}")
api_key, oauth_info = auth_service.get_auth_for_engine()
if api_key:
self.context['nexus_api_key'] = api_key
self.context['nexus_oauth_info'] = oauth_info # For engine auto-refresh
else:
print(f"{COLOR_INFO}Using new key for this session only. Saved key unchanged.{COLOR_RESET}")
print(f"{COLOR_ERROR}Failed to retrieve auth token after authorization.{COLOR_RESET}")
return None
else:
api_key = saved_key
else:
print("\n" + "-" * 28)
print(f"{COLOR_INFO}A Nexus Mods API key is required for downloading mods.{COLOR_RESET}")
print(f"{COLOR_INFO}You can get your personal key at: {COLOR_SELECTION}https://www.nexusmods.com/users/myaccount?tab=api{COLOR_RESET}")
print(f"{COLOR_WARNING}Your API Key is NOT saved locally. It is used only for this session unless you choose to save it.{COLOR_RESET}")
api_key = input(f"{COLOR_PROMPT}Enter Nexus API Key (or 'q' to cancel): {COLOR_RESET}").strip()
if not api_key or api_key.lower() == 'q':
self.logger.info("User cancelled or provided no API key.")
return None
save = input(f"{COLOR_PROMPT}Would you like to save this API key for future use? [y/N]: {COLOR_RESET}").strip().lower()
if save == 'y':
if api_key_service.save_api_key(api_key):
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
else:
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
print(f"\n{COLOR_ERROR}OAuth authorization failed.{COLOR_RESET}")
return None
else:
print(f"{COLOR_INFO}Using API key for this session only. It will not be saved.{COLOR_RESET}")
self.context['nexus_api_key'] = api_key
self.logger.debug(f"NEXUS_API_KEY is set in environment for engine (presence check).")
# User declined OAuth - cancelled
print(f"\n{COLOR_INFO}Authorization required to proceed. Installation cancelled.{COLOR_RESET}")
self.logger.info("User declined Nexus authorization.")
return None
self.logger.debug(f"Nexus authentication configured for engine.")
# Display summary and confirm
self._display_summary() # Ensure this method exists or implement it
@@ -501,11 +529,23 @@ class ModlistInstallCLI:
if isinstance(download_dir_display, tuple):
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
print(f"Download Directory: {download_dir_display}")
if self.context.get('nexus_api_key'):
print(f"Nexus API Key: [SET]")
# Show authentication method
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
authenticated, method, username = auth_service.get_auth_status()
if method == 'oauth':
auth_display = f"Nexus Authentication: OAuth"
if username:
auth_display += f" ({username})"
elif method == 'api_key':
auth_display = "Nexus Authentication: API Key (Legacy)"
else:
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
# Should never reach here since we validate auth before getting to summary
auth_display = "Nexus Authentication: Unknown"
print(auth_display)
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
def configuration_phase(self):
@@ -521,7 +561,8 @@ class ModlistInstallCLI:
start_time = time.time()
# --- BEGIN: TEE LOGGING SETUP & LOG ROTATION ---
log_dir = Path.home() / "Jackify" / "logs"
from jackify.shared.paths import get_jackify_logs_dir
log_dir = get_jackify_logs_dir()
log_dir.mkdir(parents=True, exist_ok=True)
workflow_log_path = log_dir / "Modlist_Install_workflow.log"
# Log rotation: keep last 3 logs, 1MB each (adjust as needed)
@@ -577,7 +618,17 @@ class ModlistInstallCLI:
modlist_arg = self.context.get('modlist_value') or self.context.get('machineid')
machineid = self.context.get('machineid')
api_key = self.context['nexus_api_key']
# CRITICAL: Re-check authentication right before launching engine
# This ensures we use current auth state, not stale cached values from context
# (e.g., if user revoked OAuth after context was created)
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
current_api_key, current_oauth_info = auth_service.get_auth_for_engine()
# Use current auth state, fallback to context values only if current check failed
api_key = current_api_key or self.context.get('nexus_api_key')
oauth_info = current_oauth_info or self.context.get('nexus_oauth_info')
# Path to the engine binary
engine_path = get_jackify_engine_path()
@@ -597,8 +648,8 @@ class ModlistInstallCLI:
# --- End Patch ---
# Build command
cmd = [engine_path, 'install']
cmd = [engine_path, 'install', '--show-file-progress']
# Check for debug mode and pass --debug to engine if needed
from jackify.backend.handlers.config_handler import ConfigHandler
config_handler = ConfigHandler()
@@ -606,7 +657,7 @@ class ModlistInstallCLI:
if debug_mode:
cmd.append('--debug')
self.logger.info("Debug mode enabled in config - passing --debug flag to jackify-engine")
# Determine if this is a local .wabbajack file or an online modlist
modlist_value = self.context.get('modlist_value')
machineid = self.context.get('machineid')
@@ -637,24 +688,43 @@ class ModlistInstallCLI:
# Store original environment values to restore later
original_env_values = {
'NEXUS_API_KEY': os.environ.get('NEXUS_API_KEY'),
'NEXUS_OAUTH_INFO': os.environ.get('NEXUS_OAUTH_INFO'),
'DOTNET_SYSTEM_GLOBALIZATION_INVARIANT': os.environ.get('DOTNET_SYSTEM_GLOBALIZATION_INVARIANT')
}
try:
# Temporarily modify current process's environment
if api_key:
# Prefer NEXUS_OAUTH_INFO (supports auto-refresh) over NEXUS_API_KEY (legacy)
if oauth_info:
os.environ['NEXUS_OAUTH_INFO'] = oauth_info
# CRITICAL: Set client_id so engine can refresh tokens with correct client_id
# Engine's RefreshToken method reads this to use our "jackify" client_id instead of hardcoded "wabbajack"
from jackify.backend.services.nexus_oauth_service import NexusOAuthService
os.environ['NEXUS_OAUTH_CLIENT_ID'] = NexusOAuthService.CLIENT_ID
self.logger.debug(f"Set NEXUS_OAUTH_INFO and NEXUS_OAUTH_CLIENT_ID={NexusOAuthService.CLIENT_ID} for engine (supports auto-refresh)")
# Also set NEXUS_API_KEY for backward compatibility
if api_key:
os.environ['NEXUS_API_KEY'] = api_key
elif api_key:
# No OAuth info, use API key only (no auto-refresh support)
os.environ['NEXUS_API_KEY'] = api_key
self.logger.debug(f"Temporarily set os.environ['NEXUS_API_KEY'] for engine call using session-provided key.")
elif 'NEXUS_API_KEY' in os.environ: # api_key is None/empty, but a system key might exist
self.logger.debug(f"Session API key not provided. Temporarily removing inherited NEXUS_API_KEY ('{'[REDACTED]' if os.environ.get('NEXUS_API_KEY') else 'None'}') from os.environ for engine call to ensure it is not used.")
del os.environ['NEXUS_API_KEY']
# If api_key is None and NEXUS_API_KEY was not in os.environ, it remains unset, which is correct.
self.logger.debug(f"Set NEXUS_API_KEY for engine (no auto-refresh)")
else:
# No auth available, clear any inherited values
if 'NEXUS_API_KEY' in os.environ:
del os.environ['NEXUS_API_KEY']
if 'NEXUS_OAUTH_INFO' in os.environ:
del os.environ['NEXUS_OAUTH_INFO']
if 'NEXUS_OAUTH_CLIENT_ID' in os.environ:
del os.environ['NEXUS_OAUTH_CLIENT_ID']
self.logger.debug(f"No Nexus auth available, cleared inherited env vars")
os.environ['DOTNET_SYSTEM_GLOBALIZATION_INVARIANT'] = "1"
self.logger.debug(f"Temporarily set os.environ['DOTNET_SYSTEM_GLOBALIZATION_INVARIANT'] = '1' for engine call.")
self.logger.info("Environment prepared for jackify-engine install process by modifying os.environ.")
self.logger.debug(f"NEXUS_API_KEY in os.environ (pre-call): {'[SET]' if os.environ.get('NEXUS_API_KEY') else '[NOT SET]'}")
self.logger.debug(f"NEXUS_OAUTH_INFO in os.environ (pre-call): {'[SET]' if os.environ.get('NEXUS_OAUTH_INFO') else '[NOT SET]'}")
pretty_cmd = ' '.join([f'"{arg}"' if ' ' in arg else arg for arg in cmd])
print(f"{COLOR_INFO}Launching Jackify Install Engine with command:{COLOR_RESET} {pretty_cmd}")
@@ -667,8 +737,10 @@ class ModlistInstallCLI:
else:
self.logger.warning(f"File descriptor limit: {message}")
# Popen now inherits the modified os.environ because env=None
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)
# Use cleaned environment to prevent AppImage variable inheritance
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
clean_env = get_clean_subprocess_env()
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=clean_env, cwd=engine_dir)
# Start performance monitoring for the engine process
# Adjust monitoring based on debug mode
@@ -724,6 +796,16 @@ class ModlistInstallCLI:
if chunk == b'\n':
# Complete line - decode and print
line = buffer.decode('utf-8', errors='replace')
# Filter FILE_PROGRESS spam but keep the status line before it
if '[FILE_PROGRESS]' in line:
parts = line.split('[FILE_PROGRESS]', 1)
if parts[0].strip():
line = parts[0].rstrip()
else:
# Skip this line entirely if it's only FILE_PROGRESS
buffer = b''
last_progress_time = time.time()
continue
# Enhance Nexus download errors with modlist context
enhanced_line = self._enhance_nexus_error(line)
print(enhanced_line, end='')
@@ -732,6 +814,16 @@ class ModlistInstallCLI:
elif chunk == b'\r':
# Carriage return - decode and print without newline
line = buffer.decode('utf-8', errors='replace')
# Filter FILE_PROGRESS spam but keep the status line before it
if '[FILE_PROGRESS]' in line:
parts = line.split('[FILE_PROGRESS]', 1)
if parts[0].strip():
line = parts[0].rstrip()
else:
# Skip this line entirely if it's only FILE_PROGRESS
buffer = b''
last_progress_time = time.time()
continue
# Enhance Nexus download errors with modlist context
enhanced_line = self._enhance_nexus_error(line)
print(enhanced_line, end='')
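# Illustrative sketch of the FILE_PROGRESS filtering above, as a standalone helper:
# keep any status text preceding the marker, drop lines that are progress spam only.
from typing import Optional

def filter_file_progress(line: str) -> Optional[str]:
    if '[FILE_PROGRESS]' not in line:
        return line
    prefix = line.split('[FILE_PROGRESS]', 1)[0]
    return prefix.rstrip() if prefix.strip() else None

# filter_file_progress("Installing foo.7z [FILE_PROGRESS] 42%") -> "Installing foo.7z"
# filter_file_progress("[FILE_PROGRESS] 42%")                   -> None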
@@ -945,6 +1037,9 @@ class ModlistInstallCLI:
if configuration_success:
self.logger.info("Post-installation configuration completed successfully")
# Check for TTW integration eligibility
self._check_and_prompt_ttw_integration(install_dir_str, detected_game, modlist_name)
else:
self.logger.warning("Post-installation configuration had issues")
else:
@@ -1098,11 +1193,23 @@ class ModlistInstallCLI:
if isinstance(download_dir_display, tuple):
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
print(f"Download Directory: {download_dir_display}")
if self.context.get('nexus_api_key'):
print(f"Nexus API Key: [SET]")
# Show authentication method
from jackify.backend.services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
authenticated, method, username = auth_service.get_auth_status()
if method == 'oauth':
auth_display = f"Nexus Authentication: OAuth"
if username:
auth_display += f" ({username})"
elif method == 'api_key':
auth_display = "Nexus Authentication: API Key (Legacy)"
else:
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
# Should never reach here since we validate auth before getting to summary
auth_display = "Nexus Authentication: Unknown"
print(auth_display)
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
def _enhance_nexus_error(self, line: str) -> str:
@@ -1134,5 +1241,173 @@ class ModlistInstallCLI:
# Add URL on next line for easier debugging
return f"{line}\n Nexus URL: {mod_url}"
return line
return line
def _check_and_prompt_ttw_integration(self, install_dir: str, game_type: str, modlist_name: str):
"""Check if modlist is eligible for TTW integration and prompt user"""
try:
# Check eligibility: FNV game, TTW-compatible modlist, no existing TTW
if not self._is_ttw_eligible(install_dir, game_type, modlist_name):
return
# Prompt user for TTW installation
print(f"\n{COLOR_PROMPT}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
print(f"{COLOR_INFO}TTW Integration Available{COLOR_RESET}")
print(f"{COLOR_PROMPT}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
print(f"\nThis modlist ({modlist_name}) supports Tale of Two Wastelands (TTW).")
print(f"TTW combines Fallout 3 and New Vegas into a single game.")
print(f"\nWould you like to install TTW now?")
user_input = input(f"{COLOR_PROMPT}Install TTW? (yes/no): {COLOR_RESET}").strip().lower()
if user_input in ['yes', 'y']:
self._launch_ttw_installation(modlist_name, install_dir)
else:
print(f"{COLOR_INFO}Skipping TTW installation. You can install it later from the main menu.{COLOR_RESET}")
except Exception as e:
self.logger.error(f"Error during TTW eligibility check: {e}", exc_info=True)
def _is_ttw_eligible(self, install_dir: str, game_type: str, modlist_name: str) -> bool:
"""Check if modlist is eligible for TTW integration"""
try:
from pathlib import Path
# Check 1: Must be Fallout New Vegas
if not game_type or game_type.lower() not in ['falloutnv', 'fallout new vegas', 'fallout_new_vegas']:
return False
# Check 2: Must be on TTW compatibility whitelist
from jackify.backend.data.ttw_compatible_modlists import is_ttw_compatible
if not is_ttw_compatible(modlist_name):
return False
# Check 3: TTW must not already be installed
if self._detect_existing_ttw(install_dir):
self.logger.info(f"TTW already installed in {install_dir}, skipping prompt")
return False
return True
except Exception as e:
self.logger.error(f"Error checking TTW eligibility: {e}")
return False
def _detect_existing_ttw(self, install_dir: str) -> bool:
"""Detect if TTW is already installed in the modlist"""
try:
from pathlib import Path
install_path = Path(install_dir)
# Search for TTW indicators in common locations
search_paths = [
install_path,
install_path / "mods",
install_path / "Stock Game",
install_path / "Game Root"
]
for search_path in search_paths:
if not search_path.exists():
continue
# Look for folders containing "tale" and "two" and "wastelands"
for folder in search_path.iterdir():
if not folder.is_dir():
continue
folder_name_lower = folder.name.lower()
if all(keyword in folder_name_lower for keyword in ['tale', 'two', 'wastelands']):
# Verify it has the TTW ESM file
for file in folder.rglob('*.esm'):
if 'taleoftwowastelands' in file.name.lower():
self.logger.info(f"Found existing TTW installation: {file}")
return True
return False
except Exception as e:
self.logger.error(f"Error detecting existing TTW: {e}")
return False
def _launch_ttw_installation(self, modlist_name: str, install_dir: str):
"""Launch TTW installation workflow"""
try:
print(f"\n{COLOR_INFO}Starting TTW installation workflow...{COLOR_RESET}")
# Import TTW installation handler
from jackify.backend.handlers.ttw_installer_handler import TTWInstallerHandler
from jackify.backend.models.configuration import SystemInfo
from pathlib import Path
system_info = SystemInfo()
ttw_installer_handler = TTWInstallerHandler(
steamdeck=system_info.is_steamdeck if hasattr(system_info, 'is_steamdeck') else False,
verbose=self.verbose if hasattr(self, 'verbose') else False,
filesystem_handler=self.filesystem_handler if hasattr(self, 'filesystem_handler') else None,
config_handler=self.config_handler if hasattr(self, 'config_handler') else None
)
# Check if TTW_Linux_Installer is installed
ttw_installer_handler._check_installation()
if not ttw_installer_handler.ttw_installer_installed:
print(f"{COLOR_INFO}TTW_Linux_Installer is not installed.{COLOR_RESET}")
user_input = input(f"{COLOR_PROMPT}Install TTW_Linux_Installer? (yes/no): {COLOR_RESET}").strip().lower()
if user_input not in ['yes', 'y']:
print(f"{COLOR_INFO}TTW installation cancelled.{COLOR_RESET}")
return
# Install TTW_Linux_Installer
print(f"{COLOR_INFO}Installing TTW_Linux_Installer...{COLOR_RESET}")
success, message = ttw_installer_handler.install_ttw_installer()
if not success:
print(f"{COLOR_ERROR}Failed to install TTW_Linux_Installer: {message}{COLOR_RESET}")
return
print(f"{COLOR_INFO}TTW_Linux_Installer installed successfully.{COLOR_RESET}")
# Prompt for TTW .mpi file
print(f"\n{COLOR_PROMPT}TTW Installer File (.mpi){COLOR_RESET}")
mpi_path = input(f"{COLOR_PROMPT}Path to TTW .mpi file: {COLOR_RESET}").strip()
if not mpi_path:
print(f"{COLOR_WARNING}No .mpi file specified. Cancelling.{COLOR_RESET}")
return
mpi_path = Path(mpi_path).expanduser()
if not mpi_path.exists() or not mpi_path.is_file():
print(f"{COLOR_ERROR}TTW .mpi file not found: {mpi_path}{COLOR_RESET}")
return
# Prompt for TTW installation directory
print(f"\n{COLOR_PROMPT}TTW Installation Directory{COLOR_RESET}")
default_ttw_dir = os.path.join(install_dir, 'TTW')
print(f"Default: {default_ttw_dir}")
ttw_install_dir = input(f"{COLOR_PROMPT}TTW install directory (Enter for default): {COLOR_RESET}").strip()
if not ttw_install_dir:
ttw_install_dir = default_ttw_dir
# Run TTW installation
print(f"\n{COLOR_INFO}Installing TTW using TTW_Linux_Installer...{COLOR_RESET}")
print(f"{COLOR_INFO}This may take a while (15-30 minutes depending on your system).{COLOR_RESET}")
success, message = ttw_installer_handler.install_ttw_backend(Path(mpi_path), Path(ttw_install_dir))
if success:
print(f"\n{COLOR_INFO}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
print(f"{COLOR_INFO}TTW Installation Complete!{COLOR_RESET}")
print(f"{COLOR_PROMPT}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
print(f"\nTTW has been installed to: {ttw_install_dir}")
print(f"The modlist '{modlist_name}' is now ready to use with TTW.")
else:
print(f"\n{COLOR_ERROR}TTW installation failed. Check the logs for details.{COLOR_RESET}")
print(f"{COLOR_ERROR}Error: {message}{COLOR_RESET}")
except Exception as e:
self.logger.error(f"Error during TTW installation: {e}", exc_info=True)
print(f"{COLOR_ERROR}Error during TTW installation: {e}{COLOR_RESET}")

View File

@@ -0,0 +1,442 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
OAuth Token Handler
Handles encrypted storage and retrieval of OAuth tokens
"""
import os
import json
import base64
import hashlib
import logging
import time
from typing import Optional, Dict
from pathlib import Path
logger = logging.getLogger(__name__)
class OAuthTokenHandler:
"""
Handles OAuth token storage with simple encryption
Stores tokens in ~/.config/jackify/nexus-oauth.json
"""
def __init__(self, config_dir: Optional[str] = None):
"""
Initialize token handler
Args:
config_dir: Optional custom config directory (defaults to ~/.config/jackify)
"""
if config_dir:
self.config_dir = Path(config_dir)
else:
self.config_dir = Path.home() / ".config" / "jackify"
self.token_file = self.config_dir / "nexus-oauth.json"
# Ensure config directory exists
self.config_dir.mkdir(parents=True, exist_ok=True)
# Generate encryption key based on machine-specific data
self._encryption_key = self._generate_encryption_key()
def _generate_encryption_key(self) -> bytes:
"""
Generate a machine-specific key used for AES-GCM token encryption
Uses hostname + username + machine ID as key material, similar to the DPAPI approach.
This keeps the encryption symmetric and machine-specific without storing a key on disk.
Returns:
    base64-urlsafe-encoded 32-byte key (Fernet-style format); decoded back to raw AES-256 key bytes by _encrypt_data/_decrypt_data
"""
import socket
import getpass
try:
hostname = socket.gethostname()
username = getpass.getuser()
# Try to get machine ID for additional entropy
machine_id = None
try:
# Linux machine-id
with open('/etc/machine-id', 'r') as f:
machine_id = f.read().strip()
except OSError:
try:
# Alternative locations
with open('/var/lib/dbus/machine-id', 'r') as f:
machine_id = f.read().strip()
except OSError:
pass
# Combine multiple sources of machine-specific data
if machine_id:
key_material = f"{hostname}:{username}:{machine_id}:jackify"
else:
key_material = f"{hostname}:{username}:jackify"
except Exception as e:
logger.warning(f"Failed to get machine info for encryption: {e}")
key_material = "jackify:default:key"
# Derive a 32-byte key via SHA-256 and base64-urlsafe-encode it
# (_encrypt_data/_decrypt_data decode this back to raw AES-256 key bytes)
key_bytes = hashlib.sha256(key_material.encode('utf-8')).digest()
return base64.urlsafe_b64encode(key_bytes)
def _encrypt_data(self, data: str) -> str:
"""
Encrypt data using AES-GCM (authenticated encryption)
Uses pycryptodome for cross-platform compatibility.
AES-GCM provides authenticated encryption similar to Fernet.
Args:
data: Plain text data
Returns:
Encrypted data as base64 string (nonce:ciphertext:tag format)
"""
try:
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
# Derive 32-byte AES key from encryption_key (which is base64-encoded)
key = base64.urlsafe_b64decode(self._encryption_key)
# Generate random nonce (12 bytes for GCM)
nonce = get_random_bytes(12)
# Create AES-GCM cipher
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
# Encrypt and get authentication tag
data_bytes = data.encode('utf-8')
ciphertext, tag = cipher.encrypt_and_digest(data_bytes)
# Combine nonce:ciphertext:tag and base64 encode
combined = nonce + ciphertext + tag
return base64.b64encode(combined).decode('utf-8')
except ImportError:
logger.error("pycryptodome package not available for token encryption")
return ""
except Exception as e:
logger.error(f"Failed to encrypt data: {e}")
return ""
def _decrypt_data(self, encrypted_data: str) -> Optional[str]:
"""
Decrypt data using AES-GCM (authenticated encryption)
Args:
encrypted_data: Encrypted data string (base64-encoded nonce:ciphertext:tag)
Returns:
Decrypted plain text or None on failure
"""
try:
from Crypto.Cipher import AES
# Check if MODE_GCM is available (pycryptodome has it, old pycrypto doesn't)
if not hasattr(AES, 'MODE_GCM'):
logger.error("pycryptodome required for token decryption (pycrypto doesn't support MODE_GCM)")
return None
# Derive 32-byte AES key from encryption_key
key = base64.urlsafe_b64decode(self._encryption_key)
# Decode base64 and split nonce:ciphertext:tag
combined = base64.b64decode(encrypted_data.encode('utf-8'))
nonce = combined[:12]
tag = combined[-16:]
ciphertext = combined[12:-16]
# Create AES-GCM cipher
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
# Decrypt and verify authentication tag
plaintext = cipher.decrypt_and_verify(ciphertext, tag)
return plaintext.decode('utf-8')
except ImportError:
logger.error("pycryptodome package not available for token decryption")
return None
except AttributeError:
logger.error("pycryptodome required for token decryption (pycrypto doesn't support MODE_GCM)")
return None
except Exception as e:
logger.error(f"Failed to decrypt data: {e}")
return None
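# Illustrative round-trip for the storage layout used above: base64(nonce || ciphertext || tag)
# with a 12-byte nonce, a 16-byte tag, and AES-256-GCM. Key and payload are examples only.
import base64
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(32)                       # AES-256 key
nonce = get_random_bytes(12)
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
ciphertext, tag = cipher.encrypt_and_digest(b"example token payload")
blob = base64.b64encode(nonce + ciphertext + tag).decode('utf-8')

raw = base64.b64decode(blob)
nonce2, tag2, body = raw[:12], raw[-16:], raw[12:-16]
plain = AES.new(key, AES.MODE_GCM, nonce=nonce2).decrypt_and_verify(body, tag2)
assert plain == b"example token payload"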
def save_token(self, token_data: Dict) -> bool:
"""
Save OAuth token to encrypted file with proper permissions
Args:
token_data: Token data dict from OAuth response
Returns:
True if saved successfully
"""
try:
# Add timestamp for tracking
token_data['_saved_at'] = int(time.time())
# Convert to JSON
json_data = json.dumps(token_data, indent=2)
# Encrypt using AES-GCM (see _encrypt_data)
encrypted = self._encrypt_data(json_data)
if not encrypted:
logger.error("Encryption failed, cannot save token")
return False
# Save to file with restricted permissions
# Write to temp file first, then move (atomic operation)
import tempfile
fd, temp_path = tempfile.mkstemp(dir=self.config_dir, prefix='.oauth_tmp_')
try:
with os.fdopen(fd, 'w') as f:
json.dump({'encrypted_data': encrypted}, f, indent=2)
# Set restrictive permissions (owner read/write only)
os.chmod(temp_path, 0o600)
# Atomic move
os.replace(temp_path, self.token_file)
logger.info(f"Saved encrypted OAuth token to {self.token_file}")
return True
except Exception as e:
# Clean up temp file on error
try:
os.unlink(temp_path)
except OSError:
pass
raise e
except Exception as e:
logger.error(f"Failed to save OAuth token: {e}")
return False
def load_token(self) -> Optional[Dict]:
"""
Load OAuth token from encrypted file
Returns:
Token data dict or None if not found or invalid
"""
if not self.token_file.exists():
logger.debug("No OAuth token file found")
return None
try:
# Load encrypted data
with open(self.token_file, 'r') as f:
data = json.load(f)
encrypted = data.get('encrypted_data')
if not encrypted:
logger.error("Token file missing encrypted_data field")
return None
# Decrypt
decrypted = self._decrypt_data(encrypted)
if not decrypted:
logger.error("Failed to decrypt token data")
return None
# Parse JSON
token_data = json.loads(decrypted)
logger.debug("Successfully loaded OAuth token")
return token_data
except json.JSONDecodeError as e:
logger.error(f"Token file contains invalid JSON: {e}")
return None
except Exception as e:
logger.error(f"Failed to load OAuth token: {e}")
return None
def delete_token(self) -> bool:
"""
Delete OAuth token file
Returns:
True if deleted successfully
"""
try:
if self.token_file.exists():
self.token_file.unlink()
logger.info("Deleted OAuth token file")
return True
else:
logger.debug("No OAuth token file to delete")
return False
except Exception as e:
logger.error(f"Failed to delete OAuth token: {e}")
return False
def has_token(self) -> bool:
"""
Check if OAuth token file exists
Returns:
True if token file exists
"""
return self.token_file.exists()
def is_token_expired(self, token_data: Optional[Dict] = None, buffer_minutes: int = 5) -> bool:
"""
Check if token is expired or close to expiring
Args:
token_data: Optional token data dict (loads from file if not provided)
buffer_minutes: Minutes before expiry to consider token expired (default 5)
Returns:
True if token is expired or will expire within buffer_minutes
"""
if token_data is None:
token_data = self.load_token()
if not token_data:
return True
# Extract OAuth data if nested
oauth_data = token_data.get('oauth', token_data)
# Get expiry information
expires_in = oauth_data.get('expires_in')
saved_at = token_data.get('_saved_at')
if not expires_in or not saved_at:
logger.debug("Token missing expiry information, assuming valid")
return False # Assume token is valid if no expiry info
# Calculate expiry time
expires_at = saved_at + expires_in
buffer_seconds = buffer_minutes * 60
now = int(time.time())
# Check if expired or within buffer
is_expired = (expires_at - buffer_seconds) < now
if is_expired:
remaining = expires_at - now
if remaining < 0:
logger.debug(f"Token expired {-remaining} seconds ago")
else:
logger.debug(f"Token expires in {remaining} seconds (within buffer)")
return is_expired
def get_access_token(self) -> Optional[str]:
"""
Get access token from storage
Returns:
Access token string or None if not found or expired
"""
token_data = self.load_token()
if not token_data:
return None
# Check if expired
if self.is_token_expired(token_data):
logger.debug("Stored token is expired")
return None
# Extract access token from OAuth structure
oauth_data = token_data.get('oauth', token_data)
access_token = oauth_data.get('access_token')
if not access_token:
logger.error("Token data missing access_token field")
return None
return access_token
def get_refresh_token(self) -> Optional[str]:
"""
Get refresh token from storage
Returns:
Refresh token string or None if not found
"""
token_data = self.load_token()
if not token_data:
return None
# Extract refresh token from OAuth structure
oauth_data = token_data.get('oauth', token_data)
refresh_token = oauth_data.get('refresh_token')
return refresh_token
def get_token_info(self) -> Dict:
"""
Get diagnostic information about current token
Returns:
Dict with token status information
"""
token_data = self.load_token()
if not token_data:
return {
'has_token': False,
'error': 'No token file found'
}
oauth_data = token_data.get('oauth', token_data)
expires_in = oauth_data.get('expires_in')
saved_at = token_data.get('_saved_at')
# Check if refresh token is likely expired (30 days since last auth)
# Nexus doesn't provide refresh token expiry, so we estimate conservatively
REFRESH_TOKEN_LIFETIME_DAYS = 30
now = int(time.time())
refresh_token_age_days = (now - saved_at) / 86400 if saved_at else 0
refresh_token_likely_expired = refresh_token_age_days > REFRESH_TOKEN_LIFETIME_DAYS
if expires_in and saved_at:
expires_at = saved_at + expires_in
remaining_seconds = expires_at - now
return {
'has_token': True,
'has_refresh_token': bool(oauth_data.get('refresh_token')),
'expires_in_seconds': remaining_seconds,
'expires_in_minutes': remaining_seconds / 60,
'expires_in_hours': remaining_seconds / 3600,
'is_expired': remaining_seconds < 0,
'expires_soon_5min': remaining_seconds < 300,
'expires_soon_15min': remaining_seconds < 900,
'saved_at': saved_at,
'expires_at': expires_at,
'refresh_token_age_days': refresh_token_age_days,
'refresh_token_likely_expired': refresh_token_likely_expired,
}
else:
return {
'has_token': True,
'has_refresh_token': bool(oauth_data.get('refresh_token')),
'refresh_token_age_days': refresh_token_age_days,
'refresh_token_likely_expired': refresh_token_likely_expired,
'error': 'Token missing expiry information'
}
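# Illustrative usage sketch for the handler above; token values and printed paths are
# examples only. Because the key is machine-derived, a token file copied to another
# machine will fail to decrypt and load_token() will return None.
if __name__ == "__main__":
    handler = OAuthTokenHandler()
    sample = {
        "oauth": {
            "access_token": "example-access-token",
            "refresh_token": "example-refresh-token",
            "expires_in": 3600,  # seconds until the access token expires
        }
    }
    if handler.save_token(sample):
        print("Saved to:", handler.token_file)
        print("Access token:", handler.get_access_token())
        print("Expired?", handler.is_token_expired())
        print("Info:", handler.get_token_info())
        handler.delete_token()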

View File

@@ -12,6 +12,7 @@ import shutil
from pathlib import Path
from typing import Optional, Union, Dict, Any, List, Tuple
from datetime import datetime
import vdf
# Initialize logger
logger = logging.getLogger(__name__)
@@ -32,14 +33,21 @@ class PathHandler:
@staticmethod
def _strip_sdcard_path_prefix(path_obj: Path) -> str:
"""
Removes the '/run/media/mmcblk0p1/' prefix if present.
Removes any detected SD card mount prefix dynamically.
Handles both /run/media/mmcblk0p1 and /run/media/deck/UUID patterns.
Returns the path as a POSIX-style string (using /).
"""
path_str = path_obj.as_posix() # Work with consistent forward slashes
if path_str.lower().startswith(SDCARD_PREFIX.lower()):
# Return the part *after* the prefix, ensuring no leading slash remains unless root
relative_part = path_str[len(SDCARD_PREFIX):]
return relative_part if relative_part else "." # Return '.' if it was exactly the prefix
from .wine_utils import WineUtils
path_str = path_obj.as_posix() # Work with consistent forward slashes
# Use dynamic SD card detection from WineUtils
stripped_path = WineUtils._strip_sdcard_path(path_str)
if stripped_path != path_str:
# Path was stripped, remove leading slash for relative path
return stripped_path.lstrip('/') if stripped_path != '/' else '.'
return path_str
@staticmethod
@@ -251,7 +259,7 @@ class PathHandler:
return False
@staticmethod
def create_dxvk_conf(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full, vanilla_game_dir=None):
def create_dxvk_conf(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full, vanilla_game_dir=None, stock_game_path=None):
"""
Create dxvk.conf file in the appropriate location
@@ -262,6 +270,7 @@ class PathHandler:
basegame_sdcard (bool): Whether the base game is on an SD card
game_var_full (str): Full name of the game (e.g., "Skyrim Special Edition")
vanilla_game_dir (str): Optional path to vanilla game directory for fallback
stock_game_path (str): Direct path to detected stock game directory (if available)
Returns:
bool: True on success, False on failure
@@ -269,49 +278,45 @@ class PathHandler:
try:
logger.info("Creating dxvk.conf file...")
# Determine the location for dxvk.conf
dxvk_conf_path = None
candidate_dirs = PathHandler._build_dxvk_candidate_dirs(
modlist_dir=modlist_dir,
stock_game_path=stock_game_path,
steam_library=steam_library,
game_var_full=game_var_full,
vanilla_game_dir=vanilla_game_dir
)
# Check for common stock game directories first, then vanilla as fallback
stock_game_paths = [
os.path.join(modlist_dir, "Stock Game"),
os.path.join(modlist_dir, "Game Root"),
os.path.join(modlist_dir, "STOCK GAME"),
os.path.join(modlist_dir, "Stock Game Folder"),
os.path.join(modlist_dir, "Stock Folder"),
os.path.join(modlist_dir, "Skyrim Stock"),
os.path.join(modlist_dir, "root", "Skyrim Special Edition")
]
if not candidate_dirs:
logger.error("Could not determine location for dxvk.conf (no candidate directories found)")
return False
# Add vanilla game directory as fallback if steam_library and game_var_full are provided
if steam_library and game_var_full:
stock_game_paths.append(os.path.join(steam_library, "steamapps", "common", game_var_full))
for path in stock_game_paths:
if os.path.exists(path):
dxvk_conf_path = os.path.join(path, "dxvk.conf")
target_dir = None
for directory in candidate_dirs:
if directory.is_dir():
target_dir = directory
break
if not dxvk_conf_path:
# Fallback: Try vanilla game directory if provided
if vanilla_game_dir and os.path.exists(vanilla_game_dir):
logger.info(f"Attempting fallback to vanilla game directory: {vanilla_game_dir}")
dxvk_conf_path = os.path.join(vanilla_game_dir, "dxvk.conf")
logger.info(f"Using vanilla game directory for dxvk.conf: {dxvk_conf_path}")
if target_dir is None:
fallback_dir = Path(modlist_dir) if modlist_dir and Path(modlist_dir).is_dir() else None
if fallback_dir:
logger.warning(f"No stock/vanilla directories found; falling back to modlist directory: {fallback_dir}")
target_dir = fallback_dir
else:
logger.error("Could not determine location for dxvk.conf")
logger.error("All candidate directories for dxvk.conf are missing.")
return False
dxvk_conf_path = target_dir / "dxvk.conf"
# The required line that Jackify needs
required_line = "dxvk.enableGraphicsPipelineLibrary = False"
# Check if dxvk.conf already exists
if os.path.exists(dxvk_conf_path):
if dxvk_conf_path.exists():
logger.info(f"Found existing dxvk.conf at {dxvk_conf_path}")
# Read existing content
try:
with open(dxvk_conf_path, 'r') as f:
with open(dxvk_conf_path, 'r', encoding='utf-8') as f:
existing_content = f.read().strip()
# Check if our required line is already present
@@ -332,7 +337,7 @@ class PathHandler:
updated_content = required_line + '\n'
logger.info("Adding required DXVK setting to empty file")
with open(dxvk_conf_path, 'w') as f:
with open(dxvk_conf_path, 'w', encoding='utf-8') as f:
f.write(updated_content)
logger.info(f"dxvk.conf updated successfully at {dxvk_conf_path}")
@@ -346,7 +351,8 @@ class PathHandler:
# Create new dxvk.conf file (original behavior)
dxvk_conf_content = required_line + '\n'
with open(dxvk_conf_path, 'w') as f:
dxvk_conf_path.parent.mkdir(parents=True, exist_ok=True)
with open(dxvk_conf_path, 'w', encoding='utf-8') as f:
f.write(dxvk_conf_content)
logger.info(f"dxvk.conf created successfully at {dxvk_conf_path}")
@@ -355,6 +361,99 @@ class PathHandler:
except Exception as e:
logger.error(f"Error creating dxvk.conf: {e}")
return False
@staticmethod
def verify_dxvk_conf_exists(modlist_dir, steam_library, game_var_full, vanilla_game_dir=None, stock_game_path=None) -> bool:
"""
Verify that dxvk.conf exists in at least one of the candidate directories and contains the required setting.
"""
required_line = "dxvk.enableGraphicsPipelineLibrary = False"
candidate_dirs = PathHandler._build_dxvk_candidate_dirs(
modlist_dir=modlist_dir,
stock_game_path=stock_game_path,
steam_library=steam_library,
game_var_full=game_var_full,
vanilla_game_dir=vanilla_game_dir
)
for directory in candidate_dirs:
conf_path = directory / "dxvk.conf"
if conf_path.is_file():
try:
with open(conf_path, 'r', encoding='utf-8') as f:
content = f.read()
if required_line not in content:
logger.warning(f"dxvk.conf found at {conf_path} but missing required setting. Appending now.")
with open(conf_path, 'a', encoding='utf-8') as f:
if not content.endswith('\n'):
f.write('\n')
f.write(required_line + '\n')
logger.info(f"Verified dxvk.conf at {conf_path}")
return True
except Exception as e:
logger.warning(f"Failed to verify dxvk.conf at {conf_path}: {e}")
logger.warning("dxvk.conf verification failed - file not found in any candidate directory.")
return False
@staticmethod
def _normalize_common_library_path(steam_library: Optional[str]) -> Optional[Path]:
if not steam_library:
return None
path = Path(steam_library)
parts_lower = [part.lower() for part in path.parts]
if len(parts_lower) >= 2 and parts_lower[-2:] == ['steamapps', 'common']:
return path
if parts_lower and parts_lower[-1] == 'common':
return path
if 'steamapps' in parts_lower:
idx = parts_lower.index('steamapps')
truncated = Path(*path.parts[:idx + 1])
return truncated / 'common'
return path / 'steamapps' / 'common'
@staticmethod
def _build_dxvk_candidate_dirs(modlist_dir, stock_game_path, steam_library, game_var_full, vanilla_game_dir) -> List[Path]:
candidates: List[Path] = []
seen = set()
def add_candidate(path_obj: Optional[Path]):
if not path_obj:
return
key = path_obj.resolve() if path_obj.exists() else path_obj
if key in seen:
return
seen.add(key)
candidates.append(path_obj)
if stock_game_path:
add_candidate(Path(stock_game_path))
if modlist_dir:
base_path = Path(modlist_dir)
common_names = [
"Stock Game",
"Game Root",
"STOCK GAME",
"Stock Game Folder",
"Stock Folder",
"Skyrim Stock",
os.path.join("root", "Skyrim Special Edition")
]
for name in common_names:
add_candidate(base_path / name)
steam_common = PathHandler._normalize_common_library_path(steam_library)
if steam_common and game_var_full:
add_candidate(steam_common / game_var_full)
if vanilla_game_dir:
add_candidate(Path(vanilla_game_dir))
if modlist_dir:
add_candidate(Path(modlist_dir))
return candidates
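# Illustrative sketch (hypothetical paths): how the ordering above resolves where dxvk.conf
# lands - an explicit stock_game_path first, then known stock-game folder names under the
# modlist, then <library>/steamapps/common/<game>, then the vanilla dir, and finally the
# modlist directory itself as the last resort.
if __name__ == "__main__":
    candidates = PathHandler._build_dxvk_candidate_dirs(
        modlist_dir="/home/deck/Games/MyModlist",        # example path
        stock_game_path=None,
        steam_library="/home/deck/.local/share/Steam",   # normalized to .../steamapps/common
        game_var_full="Skyrim Special Edition",
        vanilla_game_dir=None,
    )
    for directory in candidates:
        print(directory)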
@staticmethod
def find_steam_config_vdf() -> Optional[Path]:
@@ -383,7 +482,7 @@ class PathHandler:
libraryfolders_vdf_paths = [
os.path.expanduser("~/.steam/steam/config/libraryfolders.vdf"),
os.path.expanduser("~/.local/share/Steam/config/libraryfolders.vdf"),
# Add other potential standard locations if necessary
os.path.expanduser("~/.var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf"), # Flatpak
]
# Simple backup mechanism (optional but good practice)
@@ -484,40 +583,53 @@ class PathHandler:
logger.debug(f"Searching for compatdata directory for AppID: {appid}")
# Use libraryfolders.vdf to find all Steam library paths
# Use libraryfolders.vdf to find all Steam library paths, when available
library_paths = PathHandler.get_all_steam_library_paths()
if not library_paths:
logger.error("Could not find any Steam library paths from libraryfolders.vdf")
return None
if library_paths:
logger.debug(f"Checking compatdata in {len(library_paths)} Steam libraries")
# Check each Steam library's compatdata directory
for library_path in library_paths:
compatdata_base = library_path / "steamapps" / "compatdata"
if not compatdata_base.is_dir():
logger.debug(f"Compatdata directory does not exist: {compatdata_base}")
continue
potential_path = compatdata_base / appid
if potential_path.is_dir():
logger.info(f"Found compatdata directory: {potential_path}")
return potential_path
else:
logger.debug(f"Compatdata for AppID {appid} not found in {compatdata_base}")
logger.debug(f"Checking compatdata in {len(library_paths)} Steam libraries")
# Check fallback locations only if we didn't find valid libraries
# If we have valid libraries from libraryfolders.vdf, we should NOT fall back to wrong locations
is_flatpak_steam = any('.var/app/com.valvesoftware.Steam' in str(lib) for lib in library_paths) if library_paths else False
# Check each Steam library's compatdata directory
for library_path in library_paths:
compatdata_base = library_path / "steamapps" / "compatdata"
if not compatdata_base.is_dir():
logger.debug(f"Compatdata directory does not exist: {compatdata_base}")
continue
potential_path = compatdata_base / appid
if potential_path.is_dir():
logger.info(f"Found compatdata directory: {potential_path}")
return potential_path
if not library_paths or is_flatpak_steam:
# Only check Flatpak-specific fallbacks if we have Flatpak Steam
logger.debug("Checking fallback compatdata locations...")
if is_flatpak_steam:
# For Flatpak Steam, only check Flatpak-specific locations
fallback_locations = [
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/steamapps/compatdata",
Path.home() / ".var/app/com.valvesoftware.Steam/data/Steam/steamapps/compatdata",
]
else:
logger.debug(f"Compatdata for AppID {appid} not found in {compatdata_base}")
# Fallback: Broad search (can be slow, consider if needed)
# try:
# logger.debug(f"Compatdata not found in standard locations, attempting wider search...")
# # This can be very slow and resource-intensive
# # find_output = subprocess.check_output(['find', '/', '-type', 'd', '-name', appid, '-path', '*/compatdata/*', '-print', '-quit', '2>/dev/null'], text=True).strip()
# # if find_output:
# # logger.info(f"Found compatdata via find command: {find_output}")
# # return Path(find_output)
# except Exception as e:
# logger.warning(f"Error during 'find' command for compatdata: {e}")
logger.warning(f"Compatdata directory for AppID {appid} not found.")
# For native Steam or unknown, check standard locations
fallback_locations = [
Path.home() / ".local/share/Steam/steamapps/compatdata",
Path.home() / ".steam/steam/steamapps/compatdata",
]
for compatdata_base in fallback_locations:
if compatdata_base.is_dir():
potential_path = compatdata_base / appid
if potential_path.is_dir():
logger.warning(f"Found compatdata directory in fallback location (may be from old incorrect creation): {potential_path}")
return potential_path
logger.warning(f"Compatdata directory for AppID {appid} not found in any Steam library or fallback location.")
return None
@staticmethod
@@ -610,12 +722,22 @@ class PathHandler:
if vdf_path.is_file():
logger.info(f"[DEBUG] Parsing libraryfolders.vdf: {vdf_path}")
try:
with open(vdf_path) as f:
for line in f:
m = re.search(r'"path"\s*"([^"]+)"', line)
if m:
lib_path = Path(m.group(1))
library_paths.add(lib_path)
with open(vdf_path, 'r', encoding='utf-8') as f:
data = vdf.load(f)
# libraryfolders.vdf structure: libraryfolders -> "0", "1", etc. -> "path"
libraryfolders = data.get('libraryfolders', {})
for key, lib_data in libraryfolders.items():
if isinstance(lib_data, dict) and 'path' in lib_data:
lib_path = Path(lib_data['path'])
# Resolve symlinks for consistency (mmcblk0p1 -> deck/UUID)
try:
resolved_path = lib_path.resolve()
library_paths.add(resolved_path)
logger.debug(f"[DEBUG] Found library path: {resolved_path}")
except (OSError, RuntimeError) as resolve_err:
# If resolve fails, use original path
logger.warning(f"[DEBUG] Could not resolve {lib_path}, using as-is: {resolve_err}")
library_paths.add(lib_path)
except Exception as e:
logger.error(f"[DEBUG] Failed to parse {vdf_path}: {e}")
logger.info(f"[DEBUG] All detected Steam libraries: {library_paths}")
@@ -623,47 +745,30 @@ class PathHandler:
# Moved _find_shortcuts_vdf here from ShortcutHandler
def _find_shortcuts_vdf(self) -> Optional[str]:
"""Helper to find the active shortcuts.vdf file for a user.
"""Helper to find the active shortcuts.vdf file for the current Steam user.
Iterates through userdata directories and returns the path to the
first found shortcuts.vdf file.
Uses proper multi-user detection to find the correct Steam user instead
of just taking the first found user directory.
Returns:
Optional[str]: The full path to the shortcuts.vdf file, or None if not found.
"""
# This implementation was moved from ShortcutHandler
userdata_base_paths = [
os.path.expanduser("~/.steam/steam/userdata"),
os.path.expanduser("~/.local/share/Steam/userdata"),
os.path.expanduser("~/.var/app/com.valvesoftware.Steam/.local/share/Steam/userdata")
]
found_vdf_path = None
for base_path in userdata_base_paths:
if not os.path.isdir(base_path):
logger.debug(f"Userdata base path not found or not a directory: {base_path}")
continue
logger.debug(f"Searching for user IDs in: {base_path}")
try:
for item in os.listdir(base_path):
user_path = os.path.join(base_path, item)
if os.path.isdir(user_path) and item.isdigit():
logger.debug(f"Checking user directory: {user_path}")
config_path = os.path.join(user_path, "config")
shortcuts_file = os.path.join(config_path, "shortcuts.vdf")
if os.path.isfile(shortcuts_file):
logger.info(f"Found shortcuts.vdf at: {shortcuts_file}")
found_vdf_path = shortcuts_file
break # Found it for this base path
else:
logger.debug(f"shortcuts.vdf not found in {config_path}")
except OSError as e:
logger.warning(f"Could not access directory {base_path}: {e}")
continue # Try next base path
if found_vdf_path:
break # Found it in this base path
if not found_vdf_path:
logger.error("Could not find any shortcuts.vdf file in common Steam locations.")
return found_vdf_path
try:
# Use native Steam service for proper multi-user detection
from jackify.backend.services.native_steam_service import NativeSteamService
steam_service = NativeSteamService()
shortcuts_path = steam_service.get_shortcuts_vdf_path()
if shortcuts_path:
logger.info(f"Found shortcuts.vdf using multi-user detection: {shortcuts_path}")
return str(shortcuts_path)
else:
logger.error("Could not determine shortcuts.vdf path using multi-user detection")
return None
except Exception as e:
logger.error(f"Error using multi-user detection for shortcuts.vdf: {e}")
return None
@staticmethod
def find_game_install_paths(target_appids: Dict[str, str]) -> Dict[str, Path]:
@@ -686,10 +791,10 @@ class PathHandler:
# For each library path, look for each target game
for library_path in library_paths:
# Check if the common directory exists
common_dir = library_path / "common"
# Check if the common directory exists (games are in steamapps/common)
common_dir = library_path / "steamapps" / "common"
if not common_dir.is_dir():
logger.debug(f"No 'common' directory in library: {library_path}")
logger.debug(f"No 'steamapps/common' directory in library: {library_path}")
continue
# Get subdirectories in common dir
@@ -704,8 +809,8 @@ class PathHandler:
if game_name in results:
continue # Already found this game
# Try to find by appmanifest
appmanifest_path = library_path / f"appmanifest_{app_id}.acf"
# Try to find by appmanifest (manifests are in steamapps subdirectory)
appmanifest_path = library_path / "steamapps" / f"appmanifest_{app_id}.acf"
if appmanifest_path.is_file():
# Find the installdir value
try:
@@ -737,7 +842,7 @@ class PathHandler:
try:
with open(modlist_ini_path, 'r', encoding='utf-8', errors='ignore') as f:
lines = f.readlines()
drive_letter = "D:" if modlist_sdcard else "Z:"
drive_letter = "D:\\\\" if modlist_sdcard else "Z:\\\\"
processed_path = self._strip_sdcard_path_prefix(new_game_path)
windows_style = processed_path.replace('/', '\\')
windows_style_double = windows_style.replace('\\', '\\\\')
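# Simplified sketch of the Linux-to-Windows path mapping used in this handler: pick the
# drive letter (D: for SD card, Z: otherwise), strip the SD-card mount prefix, and double
# the backslashes as ModOrganizer.ini expects. The exact ini formatting above differs
# slightly; paths are examples only.
import re

def to_mo2_ini_path(linux_path: str, on_sdcard: bool) -> str:
    drive = "D:" if on_sdcard else "Z:"
    if on_sdcard:
        # Strip either /run/media/<device> or /run/media/deck/<UUID>
        m = re.match(r'^/run/media/(?:deck/)?[^/]+(/.*)$', linux_path)
        if m:
            linux_path = m.group(1)
    return drive + linux_path.replace('/', '\\\\')

# to_mo2_ini_path("/run/media/mmcblk0p1/Games/MyList", True) yields D:\\Games\\MyList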
@@ -787,17 +892,36 @@ class PathHandler:
# Extract existing gamePath to use as source of truth for vanilla game location
existing_game_path = None
for line in lines:
gamepath_line_index = -1
for i, line in enumerate(lines):
if re.match(r'^\s*gamepath\s*=.*@ByteArray\(([^)]+)\)', line, re.IGNORECASE):
match = re.search(r'@ByteArray\(([^)]+)\)', line)
if match:
raw_path = match.group(1)
gamepath_line_index = i
# Convert Windows path back to Linux path
if raw_path.startswith(('Z:', 'D:')):
linux_path = raw_path[2:].replace('\\\\', '/').replace('\\', '/')
existing_game_path = linux_path
logger.debug(f"Extracted existing gamePath: {existing_game_path}")
break
# Special handling for gamePath in three-true scenario (engine_installed + steamdeck + sdcard)
if modlist_sdcard and existing_game_path and existing_game_path.startswith('/run/media') and gamepath_line_index != -1:
# Simple manual stripping of /run/media/deck/UUID pattern for SD card paths
# Match /run/media/deck/[UUID]/Games/... and extract just /Games/...
sdcard_pattern = r'^/run/media/deck/[^/]+(/Games/.*)$'
match = re.match(sdcard_pattern, existing_game_path)
if match:
stripped_path = match.group(1) # Just the /Games/... part
windows_path = stripped_path.replace('/', '\\\\')
new_gamepath_value = f"D:\\\\{windows_path}"
new_gamepath_line = f"gamePath = @ByteArray({new_gamepath_value})\n"
logger.info(f"Updating gamePath for SD card: {lines[gamepath_line_index].strip()} -> {new_gamepath_line.strip()}")
lines[gamepath_line_index] = new_gamepath_line
else:
logger.warning(f"SD card path doesn't match expected pattern: {existing_game_path}")
game_path_updated = False
binary_paths_updated = 0
@@ -862,10 +986,9 @@ class PathHandler:
else:
found_stock = None
for folder in STOCK_GAME_FOLDERS:
folder_pattern = f"/{folder.replace(' ', '')}".lower()
value_part_lower = value_part.replace(' ', '').lower()
if folder_pattern in value_part_lower:
idx = value_part_lower.index(folder_pattern)
folder_pattern = f"/{folder}"
if folder_pattern in value_part:
idx = value_part.index(folder_pattern)
rel_path = value_part[idx:].lstrip('/')
found_stock = folder
break
@@ -876,9 +999,10 @@ class PathHandler:
rel_path = value_part[idx:].lstrip('/')
else:
rel_path = exe_name
new_binary_path = f"{drive_prefix}/{modlist_dir_path}/{rel_path}".replace('\\', '/').replace('//', '/')
processed_modlist_path = PathHandler._strip_sdcard_path_prefix(modlist_dir_path) if modlist_sdcard else str(modlist_dir_path)
new_binary_path = f"{drive_prefix}/{processed_modlist_path}/{rel_path}".replace('\\', '/').replace('//', '/')
formatted_binary_path = PathHandler._format_binary_for_mo2(new_binary_path)
new_binary_line = f"{index}{backslash_style}binary={formatted_binary_path}"
new_binary_line = f"{index}{backslash_style}binary = {formatted_binary_path}"
logger.debug(f"Updating binary path: {line.strip()} -> {new_binary_line}")
lines[i] = new_binary_line + "\n"
binary_paths_updated += 1
@@ -893,7 +1017,7 @@ class PathHandler:
wd_path = drive_prefix + wd_path
formatted_wd_path = PathHandler._format_workingdir_for_mo2(wd_path)
key_part = f"{index}{backslash_style}workingDirectory"
new_wd_line = f"{key_part}={formatted_wd_path}"
new_wd_line = f"{key_part} = {formatted_wd_path}"
logger.debug(f"Updating working directory: {wd_line.strip()} -> {new_wd_line}")
lines[j] = new_wd_line + "\n"
working_dirs_updated += 1
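For context, the gamePath and binary rewrites above boil down to: strip the SD-card mount prefix, pick the D: (SD card) or Z: (root) drive letter, and re-escape the path the way MO2's ModOrganizer.ini expects. A minimal sketch with hypothetical names, assuming the /run/media/deck/<UUID> layout targeted by the diff:

import re

def linux_to_mo2_path(linux_path: str, on_sdcard: bool) -> str:
    # Illustrative only; the real logic lives in PathHandler.
    if on_sdcard:
        # /run/media/deck/<UUID>/Games/... -> /Games/...
        linux_path = re.sub(r'^/run/media/deck/[^/]+', '', linux_path)
        drive = 'D:'
    else:
        drive = 'Z:'
    # Single backslashes first, then doubled for the INI escaping MO2 uses.
    return drive + linux_path.replace('/', '\\').replace('\\', '\\\\')

print(linux_to_mo2_path('/run/media/deck/1234-ABCD/Games/Skyrim', True))
# -> D:\\Games\\Skyrim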

File diff suppressed because it is too large

View File

@@ -0,0 +1,51 @@
"""
Example usage of ProgressParser
This file demonstrates how to use the progress parser to extract
structured information from jackify-engine output.
R&D NOTE: This is experimental code for investigation purposes.
"""
from jackify.backend.handlers.progress_parser import ProgressStateManager
def example_usage():
"""Example of how to use the progress parser."""
# Create state manager
state_manager = ProgressStateManager()
# Simulate processing lines from jackify-engine output
sample_lines = [
"[00:00:00] === Installing files ===",
"[00:00:05] [12/14] Installing files (1.1GB/56.3GB)",
"[00:00:10] Installing: Enderal Remastered Armory.7z (42%)",
"[00:00:15] Extracting: Mandragora Sprouts.7z (96%)",
"[00:00:20] Downloading at 45.2MB/s",
"[00:00:25] Extracting at 267.3MB/s",
"[00:00:30] Progress: 85%",
]
print("Processing sample output lines...\n")
for line in sample_lines:
updated = state_manager.process_line(line)
if updated:
state = state_manager.get_state()
print(f"Line: {line}")
print(f" Phase: {state.phase.value} - {state.phase_name}")
print(f" Progress: {state.overall_percent:.1f}%")
print(f" Step: {state.phase_progress_text}")
print(f" Data: {state.data_progress_text}")
print(f" Active Files: {len(state.active_files)}")
for file_prog in state.active_files:
print(f" - {file_prog.filename}: {file_prog.percent:.1f}%")
print(f" Speeds: {state.speeds}")
print(f" Display: {state.display_text}")
print()
if __name__ == "__main__":
example_usage()

File diff suppressed because it is too large

View File

@@ -41,7 +41,7 @@ class ShortcutHandler:
self._last_shortcuts_backup = None # Track the last backup path
self._safe_shortcuts_backup = None # Track backup made just before restart
# Initialize ProtontricksHandler here, passing steamdeck status
self.protontricks_handler = ProtontricksHandler(steamdeck=self.steamdeck)
self.protontricks_handler = ProtontricksHandler(self.steamdeck)
def _enable_tab_completion(self):
"""Enable tab completion for file paths using the shared completer"""
@@ -198,12 +198,15 @@ class ShortcutHandler:
if steam_vdf_spec is None:
# Try to install steam-vdf using pip
print("Installing required dependency (steam-vdf)...")
subprocess.check_call([sys.executable, "-m", "pip", "install", "steam-vdf", "--user"])
# CRITICAL: Use safe Python executable to prevent AppImage recursive spawning
from jackify.backend.handlers.subprocess_utils import get_safe_python_executable
python_exe = get_safe_python_executable()
subprocess.check_call([python_exe, "-m", "pip", "install", "steam-vdf", "--user"])
time.sleep(1) # Give some time for the install to complete
# Now import it
import steam_vdf
import vdf as steam_vdf
with open(shortcuts_file, 'rb') as f:
shortcuts_data = steam_vdf.load(f)
@@ -952,7 +955,10 @@ class ShortcutHandler:
def get_appid_for_shortcut(self, shortcut_name: str, exe_path: Optional[str] = None) -> Optional[str]:
"""
Find the current AppID for a given shortcut name and (optionally) executable path using protontricks.
Find the current AppID for a given shortcut name and (optionally) executable path.
Primary method: Read directly from shortcuts.vdf (reliable, no external dependencies)
Fallback method: Use protontricks (if available)
Args:
shortcut_name (str): The name of the Steam shortcut.
@@ -962,15 +968,22 @@ class ShortcutHandler:
Optional[str]: The found AppID string, or None if not found or error occurs.
"""
self.logger.info(f"Attempting to find current AppID for shortcut: '{shortcut_name}' (exe_path: '{exe_path}')")
try:
from .protontricks_handler import ProtontricksHandler # Local import
pt_handler = ProtontricksHandler(steamdeck=self.steamdeck)
appid = self.get_appid_from_vdf(shortcut_name, exe_path)
if appid:
self.logger.info(f"Successfully found AppID {appid} from shortcuts.vdf")
return appid
self.logger.info("AppID not found in shortcuts.vdf, trying protontricks as fallback...")
from .protontricks_handler import ProtontricksHandler
pt_handler = ProtontricksHandler(self.steamdeck)
if not pt_handler.detect_protontricks():
self.logger.error("Protontricks not detected")
self.logger.warning("Protontricks not detected - cannot use as fallback")
return None
result = pt_handler.run_protontricks("-l")
if not result or result.returncode != 0:
self.logger.error(f"Protontricks failed to list applications: {result.stderr if result else 'No result'}")
self.logger.warning(f"Protontricks fallback failed: {result.stderr if result else 'No result'}")
return None
# Build a list of all shortcuts
found_shortcuts = []
@@ -988,8 +1001,8 @@ class ShortcutHandler:
shortcuts_data = VDFHandler.load(shortcuts_vdf_path, binary=True)
if shortcuts_data and 'shortcuts' in shortcuts_data:
for idx, shortcut in shortcuts_data['shortcuts'].items():
app_name = shortcut.get('AppName', '').strip()
exe = shortcut.get('Exe', '').strip('"').strip()
app_name = shortcut.get('AppName', shortcut.get('appname', '')).strip()
exe = shortcut.get('Exe', shortcut.get('exe', '')).strip('"').strip()
vdf_shortcuts.append((app_name, exe, idx))
except Exception as e:
self.logger.error(f"Error parsing shortcuts.vdf for exe path matching: {e}")
@@ -1019,8 +1032,64 @@ class ShortcutHandler:
self.logger.exception("Traceback:")
return None
def get_appid_from_vdf(self, shortcut_name: str, exe_path: Optional[str] = None) -> Optional[str]:
"""
Get AppID directly from shortcuts.vdf by reading the file and matching shortcut name/exe.
This is more reliable than using protontricks since it doesn't depend on external tools.
Args:
shortcut_name (str): The name of the Steam shortcut.
exe_path (Optional[str]): The path to the executable for additional validation.
Returns:
Optional[str]: The AppID as a string, or None if not found.
"""
self.logger.info(f"Looking up AppID from shortcuts.vdf for shortcut: '{shortcut_name}' (exe: '{exe_path}')")
if not self.shortcuts_path or not os.path.isfile(self.shortcuts_path):
self.logger.warning(f"Shortcuts.vdf not found at {self.shortcuts_path}")
return None
try:
shortcuts_data = VDFHandler.load(self.shortcuts_path, binary=True)
if not shortcuts_data or 'shortcuts' not in shortcuts_data:
self.logger.warning("No shortcuts found in shortcuts.vdf")
return None
shortcut_name_clean = shortcut_name.strip().lower()
for idx, shortcut in shortcuts_data['shortcuts'].items():
name = shortcut.get('AppName', shortcut.get('appname', '')).strip()
if name.lower() == shortcut_name_clean:
appid = shortcut.get('appid')
if appid:
if exe_path:
vdf_exe = shortcut.get('Exe', shortcut.get('exe', '')).strip('"').strip()
exe_path_norm = os.path.abspath(os.path.expanduser(exe_path)).lower()
vdf_exe_norm = os.path.abspath(os.path.expanduser(vdf_exe)).lower()
if vdf_exe_norm == exe_path_norm:
self.logger.info(f"Found AppID {appid} for shortcut '{name}' with matching exe '{vdf_exe}'")
return str(int(appid) & 0xFFFFFFFF)
else:
self.logger.debug(f"Found shortcut '{name}' but exe doesn't match: '{vdf_exe}' vs '{exe_path}'")
continue
else:
self.logger.info(f"Found AppID {appid} for shortcut '{name}' (no exe validation)")
return str(int(appid) & 0xFFFFFFFF)
self.logger.warning(f"No matching shortcut found in shortcuts.vdf for '{shortcut_name}'")
return None
except Exception as e:
self.logger.error(f"Error reading shortcuts.vdf: {e}")
self.logger.exception("Traceback:")
return None
# --- Discovery Methods Moved from ModlistHandler ---
def _scan_shortcuts_for_executable(self, executable_name: str) -> List[Dict[str, str]]:
"""
Scans the user's shortcuts.vdf file for entries pointing to a specific executable.
@@ -1036,7 +1105,7 @@ class ShortcutHandler:
matched_shortcuts = []
if not self.shortcuts_path or not os.path.isfile(self.shortcuts_path):
self.logger.error(f"shortcuts.vdf path not found or invalid: {self.shortcuts_path}")
self.logger.info(f"No shortcuts.vdf file found at {self.shortcuts_path} - this is normal for new Steam installations")
return []
# Directly process the single shortcuts.vdf file found during init
@@ -1054,9 +1123,9 @@ class ShortcutHandler:
self.logger.warning(f"Skipping invalid shortcut entry (not a dict) at index {shortcut_id} in {shortcuts_file}")
continue
app_name = shortcut.get('AppName')
exe_path = shortcut.get('Exe', '').strip('"')
start_dir = shortcut.get('StartDir', '').strip('"')
app_name = shortcut.get('AppName', shortcut.get('appname'))
exe_path = shortcut.get('Exe', shortcut.get('exe', '')).strip('"')
start_dir = shortcut.get('StartDir', shortcut.get('startdir', '')).strip('"')
# Check if the base name of the exe_path matches the target
if app_name and start_dir and os.path.basename(exe_path) == executable_name:
@@ -1159,7 +1228,7 @@ class ShortcutHandler:
# --- Use the single shortcuts.vdf path found during init ---
if not self.shortcuts_path or not os.path.isfile(self.shortcuts_path):
self.logger.error(f"shortcuts.vdf path not found or invalid: {self.shortcuts_path}")
self.logger.info(f"No shortcuts.vdf file found at {self.shortcuts_path} - this is normal for new Steam installations")
return []
vdf_path = self.shortcuts_path
@@ -1180,18 +1249,21 @@ class ShortcutHandler:
self.logger.warning(f"Skipping invalid shortcut entry at index {index} in {vdf_path}")
continue
exe_path = shortcut_details.get('Exe', '').strip('"') # Get Exe path, remove quotes
app_name = shortcut_details.get('AppName', 'Unknown Shortcut')
exe_path = shortcut_details.get('Exe', shortcut_details.get('exe', '')).strip('"') # Get Exe path, remove quotes
app_name = shortcut_details.get('AppName', shortcut_details.get('appname', 'Unknown Shortcut'))
# Check if the executable_name is present in the Exe path
if executable_name in os.path.basename(exe_path):
self.logger.info(f"Found matching shortcut '{app_name}' in {vdf_path}")
# Extract relevant details
# Extract relevant details with case-insensitive fallbacks
app_id = shortcut_details.get('appid', shortcut_details.get('AppID', shortcut_details.get('appId', None)))
start_dir = shortcut_details.get('StartDir', shortcut_details.get('startdir', '')).strip('"')
match = {
'AppName': app_name,
'Exe': exe_path, # Store unquoted path
'StartDir': shortcut_details.get('StartDir', '').strip('"') # Unquoted
# Add other useful fields if needed, e.g., 'ShortcutPath'
'StartDir': start_dir,
'appid': app_id # Include the AppID for conversion to unsigned
}
matching_shortcuts.append(match)
else:

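The key fallbacks and masking above matter because VDF field names vary in case between Steam clients, and binary shortcuts.vdf stores appid as a signed 32-bit integer while compatdata folders use the unsigned form. A small illustrative sketch (values are made up):

def unsigned_appid(raw_appid: int) -> str:
    return str(raw_appid & 0xFFFFFFFF)

shortcut = {'appname': 'Wabbajack', 'exe': '"/home/user/Wabbajack/Wabbajack.exe"', 'appid': -1325049571}
name = shortcut.get('AppName', shortcut.get('appname', '')).strip()
exe = shortcut.get('Exe', shortcut.get('exe', '')).strip('"').strip()
print(name, exe, unsigned_appid(shortcut['appid']))
# -> Wabbajack /home/user/Wabbajack/Wabbajack.exe 2969917725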
View File

@@ -3,17 +3,114 @@ import signal
import subprocess
import time
import resource
import sys
import shutil
import logging
def get_safe_python_executable():
"""
Get a safe Python executable for subprocess calls.
When running as AppImage, returns system Python instead of AppImage path
to prevent recursive AppImage spawning.
Returns:
str: Path to Python executable safe for subprocess calls
"""
# Check if we're running as AppImage
is_appimage = (
'APPIMAGE' in os.environ or
'APPDIR' in os.environ or
(sys.argv[0] and sys.argv[0].endswith('.AppImage'))
)
if is_appimage:
# Running as AppImage - use system Python to avoid recursive spawning
# Try to find system Python (same logic as AppRun)
for cmd in ['python3', 'python3.13', 'python3.12', 'python3.11', 'python3.10', 'python3.9', 'python3.8']:
python_path = shutil.which(cmd)
if python_path:
return python_path
# Fallback: if we can't find system Python, this is a problem
# But we'll still return sys.executable as last resort
return sys.executable
else:
# Not AppImage - sys.executable is safe
return sys.executable
def get_clean_subprocess_env(extra_env=None):
"""
Returns a copy of os.environ with PyInstaller and other problematic variables removed.
Returns a copy of os.environ with bundled-runtime variables and other problematic entries removed.
Optionally merges in extra_env dict.
Also ensures bundled tools (lz4, cabextract, winetricks) are in PATH when running as AppImage.
CRITICAL: Preserves system PATH to ensure system utilities (wget, curl, unzip, xz, gzip, sha256sum) are available.
"""
from pathlib import Path
env = os.environ.copy()
# Remove PyInstaller-specific variables
# Save APPDIR before removing it (we need it to find bundled tools)
appdir = env.get('APPDIR')
# Remove AppImage-specific variables that can confuse subprocess calls
# These variables cause subprocesses to be interpreted as new AppImage launches
for key in ['APPIMAGE', 'APPDIR', 'ARGV0', 'OWD']:
env.pop(key, None)
# Remove bundle-specific variables
for k in list(env):
if k.startswith('_MEIPASS'):
del env[k]
# Get current PATH - ensure we preserve system paths
current_path = env.get('PATH', '')
# Ensure common system directories are in PATH if not already present
# This is critical for tools like lz4 that might be in /usr/bin, /usr/local/bin, etc.
system_paths = ['/usr/bin', '/usr/local/bin', '/bin', '/sbin', '/usr/sbin']
path_parts = current_path.split(':') if current_path else []
for sys_path in system_paths:
if sys_path not in path_parts and os.path.isdir(sys_path):
path_parts.append(sys_path)
# Add bundled tools directory to PATH if running as AppImage
# This ensures cabextract and winetricks are available to subprocesses
# System utilities (wget, curl, unzip, xz, gzip, sha256sum) come from system PATH
# Note: appdir was saved before env cleanup above
# Note: lz4 was only needed for TTW installer and is no longer bundled
tools_dir = None
if appdir:
# Running as AppImage - use APPDIR
tools_dir = os.path.join(appdir, 'opt', 'jackify', 'tools')
logger = logging.getLogger(__name__)
if not os.path.isdir(tools_dir):
logger.debug(f"Tools directory not found: {tools_dir}")
tools_dir = None
else:
# Tools directory exists - add it to PATH for cabextract, winetricks, etc.
logger.debug(f"Found bundled tools directory at: {tools_dir}")
else:
logging.getLogger(__name__).debug("APPDIR not set - not running as AppImage, skipping bundled tools")
# Build final PATH: system PATH first, then bundled tools (cabextract, winetricks)
# System utilities (wget, curl, unzip, xz, gzip, sha256sum) are preferred from system
final_path_parts = []
# Add all other paths first (system utilities take precedence)
seen = set()
for path_part in path_parts:
if path_part and path_part not in seen:
final_path_parts.append(path_part)
seen.add(path_part)
# Then add bundled tools directory (for cabextract, winetricks, etc.)
if tools_dir and os.path.isdir(tools_dir) and tools_dir not in seen:
final_path_parts.append(tools_dir)
seen.add(tools_dir)
env['PATH'] = ':'.join(final_path_parts)
# Optionally restore LD_LIBRARY_PATH to system default if needed
# (You can add more logic here if you know your system's default)
if extra_env:
@@ -59,7 +156,11 @@ class ProcessManager:
"""
def __init__(self, cmd, env=None, cwd=None, text=False, bufsize=0):
self.cmd = cmd
self.env = env
# Default to cleaned environment if None to prevent AppImage variable inheritance
if env is None:
self.env = get_clean_subprocess_env()
else:
self.env = env
self.cwd = cwd
self.text = text
self.bufsize = bufsize
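Typical caller usage of the helpers above, for example when spawning a system tool from the AppImage build (the WINEDEBUG entry is just an illustrative extra variable):

import subprocess
from jackify.backend.handlers.subprocess_utils import (
    get_clean_subprocess_env,
    get_safe_python_executable,
)

env = get_clean_subprocess_env({'WINEDEBUG': '-all'})  # extra_env is merged last
python_exe = get_safe_python_executable()              # system Python when running as AppImage
subprocess.run([python_exe, '-m', 'pip', '--version'], env=env, check=False)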

View File

@@ -0,0 +1,734 @@
"""
TTW_Linux_Installer Handler
Handles downloading, installation, and execution of TTW_Linux_Installer for TTW installations.
Replaces hoolamike for TTW-specific functionality.
"""
import logging
import os
import subprocess
import tarfile
import zipfile
from pathlib import Path
from typing import Optional, Tuple
import requests
from .path_handler import PathHandler
from .filesystem_handler import FileSystemHandler
from .config_handler import ConfigHandler
from .logging_handler import LoggingHandler
from .subprocess_utils import get_clean_subprocess_env
logger = logging.getLogger(__name__)
# Define default TTW_Linux_Installer paths
from jackify.shared.paths import get_jackify_data_dir
JACKIFY_BASE_DIR = get_jackify_data_dir()
DEFAULT_TTW_INSTALLER_DIR = JACKIFY_BASE_DIR / "TTW_Linux_Installer"
TTW_INSTALLER_EXECUTABLE_NAME = "ttw_linux_gui" # Same executable, runs in CLI mode with args
# GitHub release info
TTW_INSTALLER_REPO = "SulfurNitride/TTW_Linux_Installer"
TTW_INSTALLER_RELEASE_URL = f"https://api.github.com/repos/{TTW_INSTALLER_REPO}/releases/latest"
class TTWInstallerHandler:
"""Handles TTW installation using TTW_Linux_Installer (replaces hoolamike for TTW)."""
def __init__(self, steamdeck: bool, verbose: bool, filesystem_handler: FileSystemHandler,
config_handler: ConfigHandler, menu_handler=None):
"""Initialize the handler."""
self.steamdeck = steamdeck
self.verbose = verbose
self.path_handler = PathHandler()
self.filesystem_handler = filesystem_handler
self.config_handler = config_handler
self.menu_handler = menu_handler
# Set up logging
logging_handler = LoggingHandler()
logging_handler.rotate_log_for_logger('ttw-install', 'TTW_Install_workflow.log')
self.logger = logging_handler.setup_logger('ttw-install', 'TTW_Install_workflow.log')
# Installation paths
self.ttw_installer_dir: Path = DEFAULT_TTW_INSTALLER_DIR
self.ttw_installer_executable_path: Optional[Path] = None
self.ttw_installer_installed: bool = False
# Load saved install path from config
saved_path_str = self.config_handler.get('ttw_installer_install_path')
if saved_path_str and Path(saved_path_str).is_dir():
self.ttw_installer_dir = Path(saved_path_str)
self.logger.info(f"Loaded TTW_Linux_Installer path from config: {self.ttw_installer_dir}")
# Check if already installed
self._check_installation()
def _ensure_dirs_exist(self):
"""Ensure base directories exist."""
self.ttw_installer_dir.mkdir(parents=True, exist_ok=True)
def _check_installation(self):
"""Check if TTW_Linux_Installer is installed at expected location."""
self._ensure_dirs_exist()
potential_exe_path = self.ttw_installer_dir / TTW_INSTALLER_EXECUTABLE_NAME
if potential_exe_path.is_file() and os.access(potential_exe_path, os.X_OK):
self.ttw_installer_executable_path = potential_exe_path
self.ttw_installer_installed = True
self.logger.info(f"Found TTW_Linux_Installer at: {self.ttw_installer_executable_path}")
else:
self.ttw_installer_installed = False
self.ttw_installer_executable_path = None
self.logger.info(f"TTW_Linux_Installer not found at {potential_exe_path}")
def install_ttw_installer(self, install_dir: Optional[Path] = None) -> Tuple[bool, str]:
"""Download and install TTW_Linux_Installer from GitHub releases.
Args:
install_dir: Optional directory to install to (defaults to ~/Jackify/TTW_Linux_Installer)
Returns:
(success: bool, message: str)
"""
try:
self._ensure_dirs_exist()
target_dir = Path(install_dir) if install_dir else self.ttw_installer_dir
target_dir.mkdir(parents=True, exist_ok=True)
# Fetch latest release info
self.logger.info(f"Fetching latest TTW_Linux_Installer release from {TTW_INSTALLER_RELEASE_URL}")
resp = requests.get(TTW_INSTALLER_RELEASE_URL, timeout=15, verify=True)
resp.raise_for_status()
data = resp.json()
release_tag = data.get("tag_name") or data.get("name")
# Find Linux asset - universal-mpi-installer pattern (can be .zip or .tar.gz)
linux_asset = None
asset_names = [asset.get("name", "") for asset in data.get("assets", [])]
self.logger.info(f"Available release assets: {asset_names}")
for asset in data.get("assets", []):
name = asset.get("name", "").lower()
# Look for universal-mpi-installer pattern
if "universal-mpi-installer" in name and name.endswith((".zip", ".tar.gz")):
linux_asset = asset
self.logger.info(f"Found Linux asset: {asset.get('name')}")
break
if not linux_asset:
# Log all available assets for debugging
all_assets = [asset.get("name", "") for asset in data.get("assets", [])]
self.logger.error(f"No suitable Linux asset found. Available assets: {all_assets}")
return False, f"No suitable Linux TTW_Linux_Installer asset found in latest release. Available assets: {', '.join(all_assets)}"
download_url = linux_asset.get("browser_download_url")
asset_name = linux_asset.get("name")
if not download_url or not asset_name:
return False, "Latest release is missing required asset metadata"
# Download to target directory
temp_path = target_dir / asset_name
self.logger.info(f"Downloading {asset_name} from {download_url}")
if not self.filesystem_handler.download_file(download_url, temp_path, overwrite=True, quiet=True):
return False, "Failed to download TTW_Linux_Installer asset"
# Extract archive (zip or tar.gz)
try:
self.logger.info(f"Extracting {asset_name} to {target_dir}")
if asset_name.lower().endswith('.tar.gz'):
with tarfile.open(temp_path, "r:gz") as tf:
tf.extractall(path=target_dir)
elif asset_name.lower().endswith('.zip'):
with zipfile.ZipFile(temp_path, "r") as zf:
zf.extractall(path=target_dir)
else:
return False, f"Unsupported archive format: {asset_name}"
finally:
try:
temp_path.unlink(missing_ok=True) # cleanup
except Exception:
pass
# Find executable (may be in subdirectory or root)
exe_path = target_dir / TTW_INSTALLER_EXECUTABLE_NAME
if not exe_path.is_file():
# Search for it
for p in target_dir.rglob(TTW_INSTALLER_EXECUTABLE_NAME):
if p.is_file():
exe_path = p
break
if not exe_path.is_file():
return False, "TTW_Linux_Installer executable not found after extraction"
# Set executable permissions
try:
os.chmod(exe_path, 0o755)
except Exception as e:
self.logger.warning(f"Failed to chmod +x on {exe_path}: {e}")
# Update state
self.ttw_installer_dir = target_dir
self.ttw_installer_executable_path = exe_path
self.ttw_installer_installed = True
self.config_handler.set('ttw_installer_install_path', str(target_dir))
if release_tag:
self.config_handler.set('ttw_installer_version', release_tag)
self.logger.info(f"TTW_Linux_Installer installed successfully at {exe_path}")
return True, f"TTW_Linux_Installer installed at {target_dir}"
except Exception as e:
self.logger.error(f"Error installing TTW_Linux_Installer: {e}", exc_info=True)
return False, f"Error installing TTW_Linux_Installer: {e}"
def get_installed_ttw_installer_version(self) -> Optional[str]:
"""Return the installed TTW_Linux_Installer version stored in Jackify config, if any."""
try:
v = self.config_handler.get('ttw_installer_version')
return str(v) if v else None
except Exception:
return None
def is_ttw_installer_update_available(self) -> Tuple[bool, Optional[str], Optional[str]]:
"""
Check GitHub for the latest TTW_Linux_Installer release and compare with installed version.
Returns (update_available, installed_version, latest_version).
"""
installed = self.get_installed_ttw_installer_version()
# If executable exists but no version is recorded, don't show as "out of date"
# This can happen if the executable was installed before version tracking was added
if not installed and self.ttw_installer_installed:
self.logger.info("TTW_Linux_Installer executable found but no version recorded in config")
# Don't treat as update available - just show as "Ready" (unknown version)
return (False, None, None)
try:
resp = requests.get(TTW_INSTALLER_RELEASE_URL, timeout=10, verify=True)
resp.raise_for_status()
latest = resp.json().get('tag_name') or resp.json().get('name')
if not latest:
return (False, installed, None)
if not installed:
# No version recorded and executable doesn't exist; treat as not installed
return (False, None, str(latest))
return (installed != str(latest), installed, str(latest))
except Exception as e:
self.logger.warning(f"Error checking for TTW_Linux_Installer updates: {e}")
return (False, installed, None)
def install_ttw_backend(self, ttw_mpi_path: Path, ttw_output_path: Path) -> Tuple[bool, str]:
"""Install TTW using TTW_Linux_Installer.
Args:
ttw_mpi_path: Path to TTW .mpi file
ttw_output_path: Target installation directory
Returns:
(success: bool, message: str)
"""
self.logger.info("Starting Tale of Two Wastelands installation via TTW_Linux_Installer")
# Validate parameters
if not ttw_mpi_path or not ttw_output_path:
return False, "Missing required parameters: ttw_mpi_path and ttw_output_path are required"
ttw_mpi_path = Path(ttw_mpi_path)
ttw_output_path = Path(ttw_output_path)
# Validate paths
if not ttw_mpi_path.exists():
return False, f"TTW .mpi file not found: {ttw_mpi_path}"
if not ttw_mpi_path.is_file():
return False, f"TTW .mpi path is not a file: {ttw_mpi_path}"
if ttw_mpi_path.suffix.lower() != '.mpi':
return False, f"TTW path does not have .mpi extension: {ttw_mpi_path}"
if not ttw_output_path.exists():
try:
ttw_output_path.mkdir(parents=True, exist_ok=True)
except Exception as e:
return False, f"Failed to create output directory: {e}"
# Check installation
if not self.ttw_installer_installed:
# Try to install automatically
self.logger.info("TTW_Linux_Installer not found, attempting to install...")
success, message = self.install_ttw_installer()
if not success:
return False, f"TTW_Linux_Installer not installed and auto-install failed: {message}"
if not self.ttw_installer_executable_path or not self.ttw_installer_executable_path.is_file():
return False, "TTW_Linux_Installer executable not found"
# Detect game paths
required_games = ['Fallout 3', 'Fallout New Vegas']
detected_games = self.path_handler.find_vanilla_game_paths()
missing_games = [game for game in required_games if game not in detected_games]
if missing_games:
return False, f"Missing required games: {', '.join(missing_games)}. TTW requires both Fallout 3 and Fallout New Vegas."
fallout3_path = detected_games.get('Fallout 3')
falloutnv_path = detected_games.get('Fallout New Vegas')
if not fallout3_path or not falloutnv_path:
return False, "Could not detect Fallout 3 or Fallout New Vegas installation paths"
# Construct command - run in CLI mode with arguments
cmd = [
str(self.ttw_installer_executable_path),
"--fo3", str(fallout3_path),
"--fnv", str(falloutnv_path),
"--mpi", str(ttw_mpi_path),
"--output", str(ttw_output_path),
"--start"
]
self.logger.info(f"Executing TTW_Linux_Installer: {' '.join(cmd)}")
try:
env = get_clean_subprocess_env()
# CRITICAL: cwd must be the directory containing the executable, not the extraction root
# This is because AppContext.BaseDirectory (used by TTW installer to find BundledBinaries)
# is the directory containing the executable, not the working directory
exe_dir = str(self.ttw_installer_executable_path.parent)
process = subprocess.Popen(
cmd,
cwd=exe_dir,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
text=True,
bufsize=1,
universal_newlines=True
)
# Stream output to logger
if process.stdout:
for line in process.stdout:
line = line.rstrip()
if line:
self.logger.info(f"TTW_Linux_Installer: {line}")
process.wait()
ret = process.returncode
if ret == 0:
self.logger.info("TTW installation completed successfully.")
return True, "TTW installation completed successfully!"
else:
self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
return False, f"TTW installation failed with exit code {ret}"
except Exception as e:
self.logger.error(f"Error executing TTW_Linux_Installer: {e}", exc_info=True)
return False, f"Error executing TTW_Linux_Installer: {e}"
def start_ttw_installation(self, ttw_mpi_path: Path, ttw_output_path: Path, output_file: Path):
"""Start TTW installation process (non-blocking).
Starts the TTW_Linux_Installer subprocess with output redirected to a file.
Returns immediately with process handle. Caller should poll process and read output file.
Args:
ttw_mpi_path: Path to TTW .mpi file
ttw_output_path: Target installation directory
output_file: Path to file where stdout/stderr will be written
Returns:
(process: subprocess.Popen, error_message: str) - process is None if failed
"""
self.logger.info("Starting TTW installation (non-blocking mode)")
# Validate parameters
if not ttw_mpi_path or not ttw_output_path:
return None, "Missing required parameters: ttw_mpi_path and ttw_output_path are required"
ttw_mpi_path = Path(ttw_mpi_path)
ttw_output_path = Path(ttw_output_path)
# Validate paths
if not ttw_mpi_path.exists():
return None, f"TTW .mpi file not found: {ttw_mpi_path}"
if not ttw_mpi_path.is_file():
return None, f"TTW .mpi path is not a file: {ttw_mpi_path}"
if ttw_mpi_path.suffix.lower() != '.mpi':
return None, f"TTW path does not have .mpi extension: {ttw_mpi_path}"
if not ttw_output_path.exists():
try:
ttw_output_path.mkdir(parents=True, exist_ok=True)
except Exception as e:
return None, f"Failed to create output directory: {e}"
# Check installation
if not self.ttw_installer_installed:
self.logger.info("TTW_Linux_Installer not found, attempting to install...")
success, message = self.install_ttw_installer()
if not success:
return None, f"TTW_Linux_Installer not installed and auto-install failed: {message}"
if not self.ttw_installer_executable_path or not self.ttw_installer_executable_path.is_file():
return None, "TTW_Linux_Installer executable not found"
# Detect game paths
required_games = ['Fallout 3', 'Fallout New Vegas']
detected_games = self.path_handler.find_vanilla_game_paths()
missing_games = [game for game in required_games if game not in detected_games]
if missing_games:
return None, f"Missing required games: {', '.join(missing_games)}. TTW requires both Fallout 3 and Fallout New Vegas."
fallout3_path = detected_games.get('Fallout 3')
falloutnv_path = detected_games.get('Fallout New Vegas')
if not fallout3_path or not falloutnv_path:
return None, "Could not detect Fallout 3 or Fallout New Vegas installation paths"
# Construct command
cmd = [
str(self.ttw_installer_executable_path),
"--fo3", str(fallout3_path),
"--fnv", str(falloutnv_path),
"--mpi", str(ttw_mpi_path),
"--output", str(ttw_output_path),
"--start"
]
self.logger.info(f"Executing TTW_Linux_Installer: {' '.join(cmd)}")
try:
env = get_clean_subprocess_env()
# Note: TTW_Linux_Installer bundles its own lz4 and will find it via AppContext.BaseDirectory
# We set cwd to the executable's directory so AppContext.BaseDirectory matches the working directory
# Open output file for writing
output_fh = open(output_file, 'w', encoding='utf-8', buffering=1)
# Start process with output redirected to file
# CRITICAL: cwd must be the directory containing the executable, not the extraction root
# This is because AppContext.BaseDirectory (used by TTW installer to find BundledBinaries)
# is the directory containing the executable, not the working directory
exe_dir = str(self.ttw_installer_executable_path.parent)
process = subprocess.Popen(
cmd,
cwd=exe_dir,
env=env,
stdout=output_fh,
stderr=subprocess.STDOUT,
bufsize=1
)
self.logger.info(f"TTW_Linux_Installer process started (PID: {process.pid}), output to {output_file}")
# Store file handle so it can be closed later
process._output_fh = output_fh
return process, None
except Exception as e:
self.logger.error(f"Error starting TTW_Linux_Installer: {e}", exc_info=True)
return None, f"Error starting TTW_Linux_Installer: {e}"
@staticmethod
def cleanup_ttw_process(process):
"""Clean up after TTW installation process.
Closes file handles and ensures process is terminated properly.
Args:
process: subprocess.Popen object from start_ttw_installation()
"""
if process:
# Close output file handle if attached
if hasattr(process, '_output_fh'):
try:
process._output_fh.close()
except Exception:
pass
# Terminate if still running
if process.poll() is None:
try:
process.terminate()
process.wait(timeout=5)
except Exception:
try:
process.kill()
except Exception:
pass
def install_ttw_backend_with_output_stream(self, ttw_mpi_path: Path, ttw_output_path: Path, output_callback=None):
"""Install TTW with streaming output for GUI (DEPRECATED - use start_ttw_installation instead).
Args:
ttw_mpi_path: Path to TTW .mpi file
ttw_output_path: Target installation directory
output_callback: Optional callback function(line: str) for real-time output
Returns:
(success: bool, message: str)
"""
self.logger.info("Starting Tale of Two Wastelands installation via TTW_Linux_Installer (with output stream)")
# Validate parameters (same as install_ttw_backend)
if not ttw_mpi_path or not ttw_output_path:
return False, "Missing required parameters: ttw_mpi_path and ttw_output_path are required"
ttw_mpi_path = Path(ttw_mpi_path)
ttw_output_path = Path(ttw_output_path)
# Validate paths
if not ttw_mpi_path.exists():
return False, f"TTW .mpi file not found: {ttw_mpi_path}"
if not ttw_mpi_path.is_file():
return False, f"TTW .mpi path is not a file: {ttw_mpi_path}"
if ttw_mpi_path.suffix.lower() != '.mpi':
return False, f"TTW path does not have .mpi extension: {ttw_mpi_path}"
if not ttw_output_path.exists():
try:
ttw_output_path.mkdir(parents=True, exist_ok=True)
except Exception as e:
return False, f"Failed to create output directory: {e}"
# Check installation
if not self.ttw_installer_installed:
if output_callback:
output_callback("TTW_Linux_Installer not found, installing...")
self.logger.info("TTW_Linux_Installer not found, attempting to install...")
success, message = self.install_ttw_installer()
if not success:
return False, f"TTW_Linux_Installer not installed and auto-install failed: {message}"
if not self.ttw_installer_executable_path or not self.ttw_installer_executable_path.is_file():
return False, "TTW_Linux_Installer executable not found"
# Detect game paths
required_games = ['Fallout 3', 'Fallout New Vegas']
detected_games = self.path_handler.find_vanilla_game_paths()
missing_games = [game for game in required_games if game not in detected_games]
if missing_games:
return False, f"Missing required games: {', '.join(missing_games)}. TTW requires both Fallout 3 and Fallout New Vegas."
fallout3_path = detected_games.get('Fallout 3')
falloutnv_path = detected_games.get('Fallout New Vegas')
if not fallout3_path or not falloutnv_path:
return False, "Could not detect Fallout 3 or Fallout New Vegas installation paths"
# Construct command
cmd = [
str(self.ttw_installer_executable_path),
"--fo3", str(fallout3_path),
"--fnv", str(falloutnv_path),
"--mpi", str(ttw_mpi_path),
"--output", str(ttw_output_path),
"--start"
]
self.logger.info(f"Executing TTW_Linux_Installer: {' '.join(cmd)}")
try:
env = get_clean_subprocess_env()
# CRITICAL: cwd must be the directory containing the executable, not the extraction root
# This is because AppContext.BaseDirectory (used by TTW installer to find BundledBinaries)
# is the directory containing the executable, not the working directory
exe_dir = str(self.ttw_installer_executable_path.parent)
process = subprocess.Popen(
cmd,
cwd=exe_dir,
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
text=True,
bufsize=1,
universal_newlines=True
)
# Stream output to both logger and callback
if process.stdout:
for line in process.stdout:
line = line.rstrip()
if line:
self.logger.info(f"TTW_Linux_Installer: {line}")
if output_callback:
output_callback(line)
process.wait()
ret = process.returncode
if ret == 0:
self.logger.info("TTW installation completed successfully.")
return True, "TTW installation completed successfully!"
else:
self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
return False, f"TTW installation failed with exit code {ret}"
except Exception as e:
self.logger.error(f"Error executing TTW_Linux_Installer: {e}", exc_info=True)
return False, f"Error executing TTW_Linux_Installer: {e}"
@staticmethod
def integrate_ttw_into_modlist(ttw_output_path: Path, modlist_install_dir: Path, ttw_version: str) -> bool:
"""Integrate TTW output into a modlist's MO2 structure
This method:
1. Copies TTW output to the modlist's mods folder
2. Updates modlist.txt for all profiles
3. Updates plugins.txt with TTW ESMs in correct order
Args:
ttw_output_path: Path to TTW output directory
modlist_install_dir: Path to modlist installation directory
ttw_version: TTW version string (e.g., "3.4")
Returns:
bool: True if integration successful, False otherwise
"""
logging_handler = LoggingHandler()
logging_handler.rotate_log_for_logger('ttw-install', 'TTW_Install_workflow.log')
logger = logging_handler.setup_logger('ttw-install', 'TTW_Install_workflow.log')
try:
import shutil
# Validate paths
if not ttw_output_path.exists():
logger.error(f"TTW output path does not exist: {ttw_output_path}")
return False
mods_dir = modlist_install_dir / "mods"
profiles_dir = modlist_install_dir / "profiles"
if not mods_dir.exists() or not profiles_dir.exists():
logger.error(f"Invalid modlist directory structure: {modlist_install_dir}")
return False
# Create mod folder name with version
mod_folder_name = f"[NoDelete] Tale of Two Wastelands {ttw_version}" if ttw_version else "[NoDelete] Tale of Two Wastelands"
target_mod_dir = mods_dir / mod_folder_name
# Copy TTW output to mods directory
logger.info(f"Copying TTW output to {target_mod_dir}")
if target_mod_dir.exists():
logger.info(f"Removing existing TTW mod at {target_mod_dir}")
shutil.rmtree(target_mod_dir)
shutil.copytree(ttw_output_path, target_mod_dir)
logger.info("TTW output copied successfully")
# TTW ESMs in correct load order
ttw_esms = [
"Fallout3.esm",
"Anchorage.esm",
"ThePitt.esm",
"BrokenSteel.esm",
"PointLookout.esm",
"Zeta.esm",
"TaleOfTwoWastelands.esm",
"YUPTTW.esm"
]
# Process each profile
for profile_dir in profiles_dir.iterdir():
if not profile_dir.is_dir():
continue
profile_name = profile_dir.name
logger.info(f"Processing profile: {profile_name}")
# Update modlist.txt
modlist_file = profile_dir / "modlist.txt"
if modlist_file.exists():
# Read existing modlist
with open(modlist_file, 'r', encoding='utf-8') as f:
lines = f.readlines()
# Find the TTW placeholder separator and insert BEFORE it
separator_found = False
ttw_mod_line = f"+{mod_folder_name}\n"
new_lines = []
for line in lines:
# Skip existing TTW mod entries (but keep separators and other TTW-related mods)
# Match patterns: "+[NoDelete] Tale of Two Wastelands", "+[NoDelete] TTW", etc.
stripped = line.strip()
if stripped.startswith('+') and '[nodelete]' in stripped.lower():
# Check if it's the main TTW mod (not other TTW-related mods like "TTW Quick Start")
if ('tale of two wastelands' in stripped.lower() and 'quick start' not in stripped.lower() and
'loading wheel' not in stripped.lower()) or stripped.lower().startswith('+[nodelete] ttw '):
logger.info(f"Removing existing TTW mod entry: {stripped}")
continue
# Insert TTW mod BEFORE the placeholder separator (MO2 order is bottom-up)
# Check BEFORE appending so TTW mod appears before separator in file
if "put tale of two wastelands mod here" in line.lower() and "_separator" in line.lower():
new_lines.append(ttw_mod_line)
separator_found = True
logger.info(f"Inserted TTW mod before separator: {line.strip()}")
new_lines.append(line)
# If no separator found, append at the end
if not separator_found:
new_lines.append(ttw_mod_line)
logger.warning(f"No TTW separator found in {profile_name}, appended to end")
# Write back
with open(modlist_file, 'w', encoding='utf-8') as f:
f.writelines(new_lines)
logger.info(f"Updated modlist.txt for {profile_name}")
else:
logger.warning(f"modlist.txt not found for profile {profile_name}")
# Update plugins.txt
plugins_file = profile_dir / "plugins.txt"
if plugins_file.exists():
# Read existing plugins
with open(plugins_file, 'r', encoding='utf-8') as f:
lines = f.readlines()
# Remove any existing TTW ESMs
ttw_esm_set = set(esm.lower() for esm in ttw_esms)
lines = [line for line in lines if line.strip().lower() not in ttw_esm_set]
# Find CaravanPack.esm and insert TTW ESMs after it
insert_index = None
for i, line in enumerate(lines):
if line.strip().lower() == "caravanpack.esm":
insert_index = i + 1
break
if insert_index is not None:
# Insert TTW ESMs in correct order
for esm in reversed(ttw_esms):
lines.insert(insert_index, f"{esm}\n")
else:
logger.warning(f"CaravanPack.esm not found in {profile_name}, appending TTW ESMs to end")
for esm in ttw_esms:
lines.append(f"{esm}\n")
# Write back
with open(plugins_file, 'w', encoding='utf-8') as f:
f.writelines(lines)
logger.info(f"Updated plugins.txt for {profile_name}")
else:
logger.warning(f"plugins.txt not found for profile {profile_name}")
logger.info("TTW integration completed successfully")
return True
except Exception as e:
logger.error(f"Error integrating TTW into modlist: {e}", exc_info=True)
return False
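A rough usage sketch for the handler above in blocking mode; the module path and the no-argument construction of FileSystemHandler/ConfigHandler are assumptions for illustration only:

from pathlib import Path
from jackify.backend.handlers.filesystem_handler import FileSystemHandler       # assumed import path
from jackify.backend.handlers.config_handler import ConfigHandler               # assumed import path
from jackify.backend.handlers.ttw_installer_handler import TTWInstallerHandler  # assumed module name

handler = TTWInstallerHandler(
    steamdeck=False,
    verbose=True,
    filesystem_handler=FileSystemHandler(),  # assumed default constructor
    config_handler=ConfigHandler(),          # assumed default constructor
)
ok, message = handler.install_ttw_backend(
    ttw_mpi_path=Path('~/Downloads/TTW_3.4.mpi').expanduser(),   # example path
    ttw_output_path=Path('~/Jackify/TTW_Output').expanduser(),   # example path
)
print(ok, message)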

View File

@@ -222,15 +222,21 @@ class ValidationHandler:
def validate_steam_shortcut(self, app_id: str) -> Tuple[bool, str]:
"""Validate a Steam shortcut."""
try:
# Check if shortcuts.vdf exists
shortcuts_path = Path.home() / '.steam' / 'steam' / 'userdata' / '75424832' / 'config' / 'shortcuts.vdf'
# Use native Steam service to get proper shortcuts.vdf path with multi-user support
from jackify.backend.services.native_steam_service import NativeSteamService
steam_service = NativeSteamService()
shortcuts_path = steam_service.get_shortcuts_vdf_path()
if not shortcuts_path:
return False, "Could not determine shortcuts.vdf path (no active Steam user found)"
if not shortcuts_path.exists():
return False, "shortcuts.vdf not found"
# Check if shortcuts.vdf is accessible
if not os.access(shortcuts_path, os.R_OK | os.W_OK):
return False, "shortcuts.vdf is not accessible"
# Parse shortcuts.vdf using VDFHandler
shortcuts_data = VDFHandler.load(str(shortcuts_path), binary=True)
@@ -294,7 +300,7 @@ class ValidationHandler:
def looks_like_modlist_dir(self, path: Path) -> bool:
"""Return True if the directory contains files/folders typical of a modlist install."""
expected = [
'ModOrganizer.exe', 'profiles', 'mods', 'downloads', '.wabbajack', '.jackify_modlist_marker', 'ModOrganizer.ini'
'ModOrganizer.exe', 'profiles', 'mods', '.wabbajack', '.jackify_modlist_marker', 'ModOrganizer.ini'
]
for item in expected:
if (path / item).exists():

View File

@@ -1196,7 +1196,8 @@ class InstallWabbajackHandler:
"""Displays the final success message and next steps."""
# Basic log file path (assuming standard location)
# TODO: Get log file path more reliably if needed
log_path = Path.home() / "Jackify" / "logs" / "jackify-cli.log"
from jackify.shared.paths import get_jackify_logs_dir
log_path = get_jackify_logs_dir() / "jackify-cli.log"
print("\n───────────────────────────────────────────────────────────────────")
print(f"{COLOR_INFO}Wabbajack Installation Completed Successfully!{COLOR_RESET}")

View File

@@ -0,0 +1,601 @@
"""
Wabbajack Installer Handler
Automated Wabbajack.exe installation and configuration via Proton.
Self-contained implementation inspired by Wabbajack-Proton-AuCu (MIT).
This handler provides:
- Automatic Wabbajack.exe download
- Steam shortcuts.vdf manipulation
- WebView2 installation
- Win7 registry configuration
- Optional Heroic GOG game detection
"""
import json
import logging
import os
import shutil
import subprocess
import tempfile
import urllib.request
import zlib
from pathlib import Path
from typing import Optional, List, Dict, Tuple
try:
import vdf
except ImportError:
vdf = None
class WabbajackInstallerHandler:
"""Handles automated Wabbajack installation via Proton"""
# Download URLs
WABBAJACK_URL = "https://github.com/wabbajack-tools/wabbajack/releases/latest/download/Wabbajack.exe"
WEBVIEW2_URL = "https://files.omnigaming.org/MicrosoftEdgeWebView2RuntimeInstallerX64-WabbajackProton.exe"
# Minimal Win7 registry settings for Wabbajack compatibility
WIN7_REGISTRY = """REGEDIT4
[HKEY_LOCAL_MACHINE\\Software\\Microsoft\\Windows NT\\CurrentVersion]
"ProductName"="Microsoft Windows 7"
"CSDVersion"="Service Pack 1"
"CurrentBuild"="7601"
"CurrentBuildNumber"="7601"
"CurrentVersion"="6.1"
[HKEY_LOCAL_MACHINE\\System\\CurrentControlSet\\Control\\Windows]
"CSDVersion"=dword:00000100
[HKEY_CURRENT_USER\\Software\\Wine\\AppDefaults\\Wabbajack.exe\\X11 Driver]
"Decorated"="N"
"""
def __init__(self):
self.logger = logging.getLogger(__name__)
def calculate_app_id(self, exe_path: str, app_name: str) -> int:
"""
Calculate Steam AppID using CRC32 algorithm.
Args:
exe_path: Path to executable (must be quoted)
app_name: Application name
Returns:
AppID (31-bit to fit signed 32-bit integer range for VDF binary format)
"""
input_str = f"{exe_path}{app_name}"
crc = zlib.crc32(input_str.encode()) & 0x7FFFFFFF # Use 31 bits for signed int
return crc
def find_steam_userdata_path(self) -> Optional[Path]:
"""
Find most recently used Steam userdata directory.
Returns:
Path to userdata/<userid> or None if not found
"""
home = Path.home()
steam_paths = [
home / ".steam/steam",
home / ".local/share/Steam",
home / ".var/app/com.valvesoftware.Steam/.local/share/Steam",
]
for steam_path in steam_paths:
userdata = steam_path / "userdata"
if not userdata.exists():
continue
# Find most recently modified numeric user directory
user_dirs = []
for entry in userdata.iterdir():
if entry.is_dir() and entry.name.isdigit():
user_dirs.append(entry)
if user_dirs:
# Sort by modification time (most recent first)
user_dirs.sort(key=lambda p: p.stat().st_mtime, reverse=True)
self.logger.info(f"Found Steam userdata: {user_dirs[0]}")
return user_dirs[0]
return None
def get_shortcuts_vdf_path(self) -> Optional[Path]:
"""Get path to shortcuts.vdf file"""
userdata = self.find_steam_userdata_path()
if userdata:
return userdata / "config/shortcuts.vdf"
return None
def add_to_steam_shortcuts(self, exe_path: Path) -> int:
"""
Add Wabbajack to Steam shortcuts.vdf and return calculated AppID.
Args:
exe_path: Path to Wabbajack.exe
Returns:
Calculated AppID
Raises:
RuntimeError: If vdf library not available or shortcuts.vdf not found
"""
if vdf is None:
raise RuntimeError("vdf library not installed. Install with: pip install vdf")
shortcuts_path = self.get_shortcuts_vdf_path()
if not shortcuts_path:
raise RuntimeError("Could not find Steam shortcuts.vdf path")
self.logger.info(f"Shortcuts.vdf path: {shortcuts_path}")
# Read existing shortcuts or create new
if shortcuts_path.exists():
with open(shortcuts_path, 'rb') as f:
shortcuts = vdf.binary_load(f)
else:
shortcuts = {'shortcuts': {}}
# Ensure parent directory exists
shortcuts_path.parent.mkdir(parents=True, exist_ok=True)
# Calculate AppID
exe_str = f'"{str(exe_path)}"'
app_id = self.calculate_app_id(exe_str, "Wabbajack")
self.logger.info(f"Calculated AppID: {app_id}")
# Create shortcut entry
idx = str(len(shortcuts.get('shortcuts', {})))
shortcuts.setdefault('shortcuts', {})[idx] = {
'appid': app_id,
'AppName': 'Wabbajack',
'Exe': exe_str,
'StartDir': f'"{str(exe_path.parent)}"',
'icon': str(exe_path),
'ShortcutPath': '',
'LaunchOptions': '',
'IsHidden': 0,
'AllowDesktopConfig': 1,
'AllowOverlay': 1,
'OpenVR': 0,
'Devkit': 0,
'DevkitGameID': '',
'DevkitOverrideAppID': 0,
'LastPlayTime': 0,
'FlatpakAppID': '',
'tags': {}
}
# Write back (binary format)
with open(shortcuts_path, 'wb') as f:
vdf.binary_dump(shortcuts, f)
self.logger.info(f"Added Wabbajack to Steam shortcuts with AppID {app_id}")
return app_id
def create_dotnet_cache(self, install_folder: Path):
"""
Create .NET bundle extract cache directory.
Wabbajack requires: <install_path>/<home_path>/.cache/dotnet_bundle_extract
Args:
install_folder: Wabbajack installation directory
"""
home = Path.home()
# Strip leading slash to make it relative
home_relative = str(home).lstrip('/')
cache_dir = install_folder / home_relative / '.cache/dotnet_bundle_extract'
cache_dir.mkdir(parents=True, exist_ok=True)
self.logger.info(f"Created dotnet cache: {cache_dir}")
def download_file(self, url: str, dest: Path, description: str = "file") -> None:
"""
Download file with progress logging.
Args:
url: Download URL
dest: Destination path
description: Description for logging
Raises:
RuntimeError: If download fails
"""
self.logger.info(f"Downloading {description} from {url}")
try:
# Ensure parent directory exists
dest.parent.mkdir(parents=True, exist_ok=True)
# Download with user agent
request = urllib.request.Request(
url,
headers={'User-Agent': 'Jackify-WabbajackInstaller'}
)
with urllib.request.urlopen(request) as response:
with open(dest, 'wb') as f:
shutil.copyfileobj(response, f)
self.logger.info(f"Downloaded {description} to {dest}")
except Exception as e:
raise RuntimeError(f"Failed to download {description}: {e}")
def download_wabbajack(self, install_folder: Path) -> Path:
"""
Download Wabbajack.exe to installation folder.
Args:
install_folder: Installation directory
Returns:
Path to downloaded Wabbajack.exe
"""
install_folder.mkdir(parents=True, exist_ok=True)
wabbajack_exe = install_folder / "Wabbajack.exe"
# Skip if already exists
if wabbajack_exe.exists():
self.logger.info(f"Wabbajack.exe already exists at {wabbajack_exe}")
return wabbajack_exe
self.download_file(self.WABBAJACK_URL, wabbajack_exe, "Wabbajack.exe")
return wabbajack_exe
def find_proton_experimental(self) -> Optional[Path]:
"""
Find Proton Experimental installation path.
Returns:
Path to Proton Experimental directory or None
"""
home = Path.home()
steam_paths = [
home / ".steam/steam",
home / ".local/share/Steam",
home / ".var/app/com.valvesoftware.Steam/.local/share/Steam",
]
for steam_path in steam_paths:
proton_path = steam_path / "steamapps/common/Proton - Experimental"
if proton_path.exists():
self.logger.info(f"Found Proton Experimental: {proton_path}")
return proton_path
return None
def get_compat_data_path(self, app_id: int) -> Optional[Path]:
"""Get compatdata path for AppID"""
home = Path.home()
steam_paths = [
home / ".steam/steam",
home / ".local/share/Steam",
home / ".var/app/com.valvesoftware.Steam/.local/share/Steam",
]
for steam_path in steam_paths:
compat_path = steam_path / f"steamapps/compatdata/{app_id}"
if compat_path.parent.exists():
# Parent exists, so this is a valid location even if the prefix doesn't exist yet
return compat_path
return None
def init_wine_prefix(self, app_id: int) -> Path:
"""
Initialize Wine prefix using Proton.
Args:
app_id: Steam AppID
Returns:
Path to created prefix
Raises:
RuntimeError: If prefix creation fails
"""
proton_path = self.find_proton_experimental()
if not proton_path:
raise RuntimeError("Proton Experimental not found. Please install it from Steam.")
compat_data = self.get_compat_data_path(app_id)
if not compat_data:
raise RuntimeError("Could not determine compatdata path")
prefix_path = compat_data / "pfx"
# Create compat data directory
compat_data.mkdir(parents=True, exist_ok=True)
# Run wineboot to initialize prefix
proton_bin = proton_path / "proton"
env = os.environ.copy()
env['STEAM_COMPAT_DATA_PATH'] = str(compat_data)
env['STEAM_COMPAT_CLIENT_INSTALL_PATH'] = str(compat_data.parent.parent.parent)
self.logger.info(f"Initializing Wine prefix for AppID {app_id}...")
result = subprocess.run(
[str(proton_bin), 'run', 'wineboot'],
env=env,
capture_output=True,
text=True,
timeout=120
)
if result.returncode != 0:
raise RuntimeError(f"Failed to initialize Wine prefix: {result.stderr}")
self.logger.info(f"Prefix created: {prefix_path}")
return prefix_path
def run_in_prefix(self, app_id: int, exe_path: Path, args: List[str] = None) -> None:
"""
Run executable in Wine prefix using Proton.
Args:
app_id: Steam AppID
exe_path: Path to executable
args: Optional command line arguments
Raises:
RuntimeError: If execution fails
"""
proton_path = self.find_proton_experimental()
if not proton_path:
raise RuntimeError("Proton Experimental not found")
compat_data = self.get_compat_data_path(app_id)
if not compat_data:
raise RuntimeError("Could not determine compatdata path")
proton_bin = proton_path / "proton"
cmd = [str(proton_bin), 'run', str(exe_path)]
if args:
cmd.extend(args)
env = os.environ.copy()
env['STEAM_COMPAT_DATA_PATH'] = str(compat_data)
env['STEAM_COMPAT_CLIENT_INSTALL_PATH'] = str(compat_data.parent.parent.parent)
self.logger.info(f"Running {exe_path.name} in prefix...")
result = subprocess.run(
cmd,
env=env,
capture_output=True,
text=True,
timeout=300
)
if result.returncode != 0:
error_msg = f"Failed to run {exe_path.name} (exit code {result.returncode})"
if result.stderr:
error_msg += f"\nStderr: {result.stderr}"
if result.stdout:
error_msg += f"\nStdout: {result.stdout}"
self.logger.error(error_msg)
raise RuntimeError(error_msg)
def apply_registry(self, app_id: int, reg_content: str) -> None:
"""
Apply registry content to Wine prefix.
Args:
app_id: Steam AppID
reg_content: Registry file content
Raises:
RuntimeError: If registry application fails
"""
proton_path = self.find_proton_experimental()
if not proton_path:
raise RuntimeError("Proton Experimental not found")
compat_data = self.get_compat_data_path(app_id)
if not compat_data:
raise RuntimeError("Could not determine compatdata path")
prefix_path = compat_data / "pfx"
if not prefix_path.exists():
raise RuntimeError(f"Prefix not found: {prefix_path}")
# Write registry content to temp file
with tempfile.NamedTemporaryFile(mode='w', suffix='.reg', delete=False) as f:
f.write(reg_content)
temp_reg = Path(f.name)
try:
# Use Proton's wine directly
wine_bin = proton_path / "files/bin/wine64"
self.logger.info("Applying registry settings...")
env = os.environ.copy()
env['WINEPREFIX'] = str(prefix_path)
result = subprocess.run(
[str(wine_bin), 'regedit', str(temp_reg)],
env=env,
capture_output=True,
text=True,
timeout=30
)
if result.returncode != 0:
raise RuntimeError(f"Failed to apply registry: {result.stderr}")
self.logger.info("Registry settings applied")
finally:
# Cleanup temp file
if temp_reg.exists():
temp_reg.unlink()
def install_webview2(self, app_id: int, install_folder: Path) -> None:
"""
Download and install WebView2 runtime.
Args:
app_id: Steam AppID
install_folder: Directory to download installer to
Raises:
RuntimeError: If installation fails
"""
webview_installer = install_folder / "webview2_installer.exe"
# Download installer
self.download_file(self.WEBVIEW2_URL, webview_installer, "WebView2 installer")
try:
# Run installer with silent flags
self.logger.info("Installing WebView2 (this may take a minute)...")
self.logger.info(f"WebView2 installer path: {webview_installer}")
self.logger.info(f"AppID: {app_id}")
try:
self.run_in_prefix(app_id, webview_installer, ["/silent", "/install"])
self.logger.info("WebView2 installed successfully")
except RuntimeError as e:
self.logger.error(f"WebView2 installation failed: {e}")
# Re-raise to let caller handle it
raise
finally:
# Cleanup installer
if webview_installer.exists():
try:
webview_installer.unlink()
self.logger.debug("Cleaned up WebView2 installer")
except Exception as e:
self.logger.warning(f"Failed to cleanup WebView2 installer: {e}")
def apply_win7_registry(self, app_id: int) -> None:
"""
Apply Windows 7 registry settings.
Args:
app_id: Steam AppID
Raises:
RuntimeError: If registry application fails
"""
self.apply_registry(app_id, self.WIN7_REGISTRY)
def detect_heroic_gog_games(self) -> List[Dict]:
"""
Detect GOG games installed via Heroic Games Launcher.
Returns:
List of dicts with keys: app_name, title, install_path, build_id
"""
heroic_paths = [
Path.home() / ".config/heroic",
Path.home() / ".var/app/com.heroicgameslauncher.hgl/config/heroic"
]
for heroic_path in heroic_paths:
if not heroic_path.exists():
continue
installed_json = heroic_path / "gog_store/installed.json"
if not installed_json.exists():
continue
try:
# Read installed games
with open(installed_json) as f:
data = json.load(f)
installed = data.get('installed', [])
# Read library for titles
library_json = heroic_path / "store_cache/gog_library.json"
titles = {}
if library_json.exists():
with open(library_json) as f:
lib = json.load(f)
titles = {g['app_name']: g['title'] for g in lib.get('games', [])}
# Build game list
games = []
for game in installed:
app_name = game.get('appName')
if not app_name:
continue
games.append({
'app_name': app_name,
'title': titles.get(app_name, f"GOG Game {app_name}"),
'install_path': game.get('install_path', ''),
'build_id': game.get('buildId', '')
})
if games:
self.logger.info(f"Found {len(games)} GOG games from Heroic")
for game in games:
self.logger.debug(f" - {game['title']} ({game['app_name']})")
return games
except Exception as e:
self.logger.warning(f"Failed to read Heroic config: {e}")
continue
return []
def generate_gog_registry(self, games: List[Dict]) -> str:
"""
Generate registry file content for GOG games.
Args:
games: List of GOG game dicts from detect_heroic_gog_games()
Returns:
Registry file content
"""
reg = "REGEDIT4\n\n"
reg += "[HKEY_LOCAL_MACHINE\\Software\\GOG.com]\n\n"
reg += "[HKEY_LOCAL_MACHINE\\Software\\GOG.com\\Games]\n\n"
reg += "[HKEY_LOCAL_MACHINE\\Software\\WOW6432Node\\GOG.com]\n\n"
reg += "[HKEY_LOCAL_MACHINE\\Software\\WOW6432Node\\GOG.com\\Games]\n\n"
for game in games:
# Convert Linux path to Wine Z: drive
linux_path = game['install_path']
wine_path = f"Z:{linux_path}".replace('/', '\\\\')
# Add to both 32-bit and 64-bit registry locations
for prefix in ['Software\\GOG.com\\Games', 'Software\\WOW6432Node\\GOG.com\\Games']:
reg += f"[HKEY_LOCAL_MACHINE\\{prefix}\\{game['app_name']}]\n"
reg += f'"path"="{wine_path}"\n'
reg += f'"gameID"="{game["app_name"]}"\n'
reg += f'"gameName"="{game["title"]}"\n'
reg += f'"buildId"="{game["build_id"]}"\n'
reg += f'"workingDir"="{wine_path}"\n\n'
return reg
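# Worked example of the path conversion above for a single hypothetical Heroic GOG
# install; the game path below is made up for illustration.
linux_path = "/home/deck/Games/Heroic/Cyberpunk 2077"
wine_path = f"Z:{linux_path}".replace('/', '\\\\')
print(wine_path)   # Z:\\home\\deck\\Games\\Heroic\\Cyberpunk 2077
# The doubled backslashes are deliberate: .reg string values escape '\' as '\\', so
# Wine's regedit reads the path back as Z:\home\deck\Games\Heroic\Cyberpunk 2077.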
def inject_gog_registry(self, app_id: int) -> int:
"""
Inject Heroic GOG games into Wine prefix registry.
Args:
app_id: Steam AppID
Returns:
Number of games injected
"""
games = self.detect_heroic_gog_games()
if not games:
self.logger.info("No GOG games found in Heroic")
return 0
reg_content = self.generate_gog_registry(games)
self.logger.info(f"Injecting {len(games)} GOG games into prefix...")
self.apply_registry(app_id, reg_content)
self.logger.info(f"Injected {len(games)} GOG games")
return len(games)
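# A minimal usage sketch tying the three methods above together; the handler instance
# is passed in because the class definition sits outside this hunk.
def register_heroic_gog_games(handler, app_id: int) -> int:
    """Mirror Heroic GOG installs into the prefix registry; returns the count injected."""
    count = handler.inject_gog_registry(app_id)   # detect -> generate .reg -> apply via regedit
    if count == 0:
        print("No Heroic GOG games detected; nothing to inject.")
    return count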

View File

@@ -132,7 +132,8 @@ class WabbajackParser:
'falloutnv': 'Fallout New Vegas',
'oblivion': 'Oblivion',
'starfield': 'Starfield',
'oblivion_remastered': 'Oblivion Remastered'
'oblivion_remastered': 'Oblivion Remastered',
'enderal': 'Enderal'
}
return [display_names.get(game, game) for game in self.supported_games]

View File

@@ -13,7 +13,7 @@ import shutil
import time
from pathlib import Path
import glob
from typing import Optional, Tuple
from typing import Optional, Tuple, List, Dict, Any
from .subprocess_utils import get_clean_subprocess_env
# Initialize logger
@@ -197,16 +197,58 @@ class WineUtils:
logger.error(f"Error editing binary working paths: {e}")
return False
@staticmethod
def _get_sd_card_mounts():
"""
Detect SD card mount points using df.
Returns list of actual mount paths from /run/media (e.g., /run/media/deck/MicroSD).
"""
import subprocess
import re
result = subprocess.run(['df', '-h'], capture_output=True, text=True, timeout=5)
sd_mounts = []
for line in result.stdout.split('\n'):
if '/run/media' in line:
parts = line.split()
if len(parts) >= 6:
mount_point = parts[-1] # Last column is the mount point
if mount_point.startswith('/run/media/'):
sd_mounts.append(mount_point)
# Sort by length (longest first) to match most specific paths first
sd_mounts.sort(key=len, reverse=True)
logger.debug(f"Detected SD card mounts from df: {sd_mounts}")
return sd_mounts
@staticmethod
def _strip_sdcard_path(path):
"""
Strip /run/media/deck/UUID from SD card paths
Internal helper method
Strip SD card mount prefix from path.
Handles both /run/media/mmcblk0p1 and /run/media/deck/UUID patterns.
Pattern: /run/media/deck/UUID/Games/... becomes /Games/...
Pattern: /run/media/mmcblk0p1/Games/... becomes /Games/...
"""
if path.startswith("/run/media/deck/"):
parts = path.split("/", 5)
if len(parts) >= 6:
return "/" + parts[5]
import re
# Pattern 1: /run/media/deck/UUID/... strip everything up to and including UUID
# This matches the bash: "${path#*/run/media/deck/*/*}"
deck_pattern = r'^/run/media/deck/[^/]+(/.*)?$'
match = re.match(deck_pattern, path)
if match:
stripped = match.group(1) if match.group(1) else "/"
logger.debug(f"Stripped SD card path (deck pattern): {path} -> {stripped}")
return stripped
# Pattern 2: /run/media/mmcblk0p1/... strip /run/media/mmcblk0p1
# This matches the bash: "${path#*mmcblk0p1}"
if path.startswith('/run/media/mmcblk0p1/'):
stripped = path.replace('/run/media/mmcblk0p1', '', 1)
logger.debug(f"Stripped SD card path (mmcblk pattern): {path} -> {stripped}")
return stripped
# No SD card pattern matched
return path
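# Worked examples of the two stripping patterns above, runnable standalone (a
# re-statement of the logic for illustration, not the project helper itself):
import re

def strip_sdcard_path(path: str) -> str:
    match = re.match(r'^/run/media/deck/[^/]+(/.*)?$', path)
    if match:
        return match.group(1) or "/"
    if path.startswith('/run/media/mmcblk0p1/'):
        return path.replace('/run/media/mmcblk0p1', '', 1)
    return path

print(strip_sdcard_path("/run/media/deck/STEAM_SD/Games/Skyrim"))   # -> /Games/Skyrim
print(strip_sdcard_path("/run/media/mmcblk0p1/Games/Skyrim"))       # -> /Games/Skyrim
print(strip_sdcard_path("/home/deck/Games/Skyrim"))                 # -> unchanged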
@staticmethod
@@ -230,47 +272,45 @@ class WineUtils:
@staticmethod
def chown_chmod_modlist_dir(modlist_dir):
"""
Change ownership and permissions of modlist directory
Returns True on success, False on failure
DEPRECATED: Use FileSystemHandler.verify_ownership_and_permissions() instead.
Verify and fix ownership/permissions for modlist directory.
Returns True if successful, False if sudo required.
"""
if WineUtils.all_owned_by_user(modlist_dir):
logger.info(f"All files in {modlist_dir} are already owned by the current user. Skipping sudo chown/chmod.")
return True
logger.warn("Changing Ownership and Permissions of modlist directory (may require sudo password)")
try:
user = subprocess.run("whoami", shell=True, capture_output=True, text=True).stdout.strip()
group = subprocess.run("id -gn", shell=True, capture_output=True, text=True).stdout.strip()
logger.debug(f"User is {user} and Group is {group}")
# Change ownership
result1 = subprocess.run(
f"sudo chown -R {user}:{group} \"{modlist_dir}\"",
shell=True,
capture_output=True,
text=True
)
# Change permissions
result2 = subprocess.run(
f"sudo chmod -R 755 \"{modlist_dir}\"",
shell=True,
capture_output=True,
text=True
)
if result1.returncode != 0 or result2.returncode != 0:
logger.error("Failed to change ownership/permissions")
logger.error(f"chown output: {result1.stderr}")
logger.error(f"chmod output: {result2.stderr}")
if not WineUtils.all_owned_by_user(modlist_dir):
# Files not owned by us - need sudo to fix
logger.error(f"Ownership issue detected: Some files in {modlist_dir} are not owned by the current user")
try:
user = subprocess.run("whoami", shell=True, capture_output=True, text=True).stdout.strip()
group = subprocess.run("id -gn", shell=True, capture_output=True, text=True).stdout.strip()
logger.error("To fix ownership issues, open a terminal and run:")
logger.error(f" sudo chown -R {user}:{group} \"{modlist_dir}\"")
logger.error(f" sudo chmod -R 755 \"{modlist_dir}\"")
logger.error("After running these commands, retry the operation.")
return False
except Exception as e:
logger.error(f"Error checking ownership: {e}")
return False
# Files are owned by us - try to fix permissions ourselves
logger.info(f"Files in {modlist_dir} are owned by current user, verifying permissions...")
try:
result = subprocess.run(
['chmod', '-R', '755', modlist_dir],
capture_output=True,
text=True,
check=False
)
if result.returncode == 0:
logger.info(f"Permissions set successfully for {modlist_dir}")
else:
logger.warning(f"chmod returned non-zero but continuing: {result.stderr}")
return True
except Exception as e:
logger.error(f"Error changing ownership and permissions: {e}")
return False
logger.warning(f"Error running chmod: {e}, continuing anyway")
return True
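# Standalone sketch of the ownership gate used above: confirm every entry under the
# modlist directory belongs to the current user before attempting a non-sudo chmod.
# This only illustrates the idea; the project's all_owned_by_user() helper is defined
# elsewhere in this file and may differ in detail.
import os

def all_owned_by_current_user(root: str) -> bool:
    uid = os.getuid()
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            try:
                if os.lstat(os.path.join(dirpath, name)).st_uid != uid:
                    return False
            except OSError:
                return False
    return True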
@staticmethod
def create_dxvk_file(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full):
@@ -510,10 +550,7 @@ class WineUtils:
if "mods" in binary_path:
# mods path type found
if modlist_sdcard:
path_middle = modlist_dir.split('mmcblk0p1', 1)[1] if 'mmcblk0p1' in modlist_dir else modlist_dir
# Strip /run/media/deck/UUID if present
if '/run/media/' in path_middle:
path_middle = '/' + path_middle.split('/run/media/', 1)[1].split('/', 2)[2]
path_middle = WineUtils._strip_sdcard_path(modlist_dir)
else:
path_middle = modlist_dir
@@ -523,10 +560,7 @@ class WineUtils:
elif any(x in binary_path for x in ["Stock Game", "Game Root", "STOCK GAME", "Stock Game Folder", "Stock Folder", "Skyrim Stock", "root/Skyrim Special Edition"]):
# Stock/Game Root found
if modlist_sdcard:
path_middle = modlist_dir.split('mmcblk0p1', 1)[1] if 'mmcblk0p1' in modlist_dir else modlist_dir
# Strip /run/media/deck/UUID if present
if '/run/media/' in path_middle:
path_middle = '/' + path_middle.split('/run/media/', 1)[1].split('/', 2)[2]
path_middle = WineUtils._strip_sdcard_path(modlist_dir)
else:
path_middle = modlist_dir
@@ -562,7 +596,7 @@ class WineUtils:
elif "steamapps" in binary_path:
# Steamapps found
if basegame_sdcard:
path_middle = steam_library.split('mmcblk0p1', 1)[1] if 'mmcblk0p1' in steam_library else steam_library
path_middle = WineUtils._strip_sdcard_path(steam_library)
drive_letter = "D:"
else:
path_middle = steam_library.split('steamapps', 1)[0] if 'steamapps' in steam_library else steam_library
@@ -609,12 +643,49 @@ class WineUtils:
"""
# Clean up the version string for directory matching
version_patterns = [proton_version, proton_version.replace(' ', '_'), proton_version.replace(' ', '')]
# Standard Steam library locations
steam_common_paths = [
Path.home() / ".steam/steam/steamapps/common",
Path.home() / ".local/share/Steam/steamapps/common",
Path.home() / ".steam/root/steamapps/common"
]
# Get actual Steam library paths from libraryfolders.vdf (smart detection)
steam_common_paths = []
compatibility_paths = []
try:
from .path_handler import PathHandler
# Get root Steam library paths (without /steamapps/common suffix)
root_steam_libs = PathHandler.get_all_steam_library_paths()
for lib_path in root_steam_libs:
lib = Path(lib_path)
if lib.exists():
# Valve Proton: {library}/steamapps/common
common_path = lib / "steamapps/common"
if common_path.exists():
steam_common_paths.append(common_path)
# GE-Proton: same Steam installation root + compatibilitytools.d
compatibility_paths.append(lib / "compatibilitytools.d")
except Exception as e:
logger.warning(f"Could not detect Steam libraries from libraryfolders.vdf: {e}")
# Fallback locations if dynamic detection fails
if not steam_common_paths:
steam_common_paths = [
Path.home() / ".steam/steam/steamapps/common",
Path.home() / ".local/share/Steam/steamapps/common",
Path.home() / ".steam/root/steamapps/common"
]
if not compatibility_paths:
compatibility_paths = [
Path.home() / ".steam/steam/compatibilitytools.d",
Path.home() / ".local/share/Steam/compatibilitytools.d"
]
# Add standard compatibility tool locations (covers edge cases like Flatpak)
compatibility_paths.extend([
Path.home() / ".steam/root/compatibilitytools.d",
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d",
# Flatpak GE-Proton extension paths
Path.home() / ".var/app/com.valvesoftware.Steam.CompatibilityTool.Proton-GE/.local/share/Steam/compatibilitytools.d",
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d/GE-Proton"
])
# Special handling for Proton 9: try all possible directory names
if proton_version.strip().startswith("Proton 9"):
proton9_candidates = ["Proton 9.0", "Proton 9.0 (Beta)"]
@@ -628,8 +699,9 @@ class WineUtils:
wine_bin = subdir / "files/bin/wine"
if wine_bin.is_file():
return str(wine_bin)
# General case: try version patterns
for base_path in steam_common_paths:
# General case: try version patterns in both steamapps and compatibilitytools.d
all_paths = steam_common_paths + compatibility_paths
for base_path in all_paths:
if not base_path.is_dir():
continue
for pattern in version_patterns:
@@ -643,7 +715,20 @@ class WineUtils:
wine_bin = subdir / "files/bin/wine"
if wine_bin.is_file():
return str(wine_bin)
# Fallback: Try 'Proton - Experimental' if present
# Fallback: Try user's configured Proton version
try:
from .config_handler import ConfigHandler
config = ConfigHandler()
fallback_path = config.get_proton_path()
if fallback_path != 'auto':
fallback_wine_bin = Path(fallback_path) / "files/bin/wine"
if fallback_wine_bin.is_file():
logger.warning(f"Requested Proton version '{proton_version}' not found. Falling back to user's configured version.")
return str(fallback_wine_bin)
except Exception:
pass
# Final fallback: Try 'Proton - Experimental' if present
for base_path in steam_common_paths:
wine_bin = base_path / "Proton - Experimental" / "files/bin/wine"
if wine_bin.is_file():
@@ -698,4 +783,307 @@ class WineUtils:
proton_path = str(Path(wine_bin).parent.parent)
logger.debug(f"Found Proton path: {proton_path}")
return compatdata_path, proton_path, wine_bin
return compatdata_path, proton_path, wine_bin
@staticmethod
def get_steam_library_paths() -> List[Path]:
"""
Get all Steam library paths from libraryfolders.vdf (handles Flatpak, custom locations, etc.).
Returns:
List of Path objects for Steam library directories
"""
steam_common_paths = []
try:
from .path_handler import PathHandler
# Use existing PathHandler that reads libraryfolders.vdf
library_paths = PathHandler.get_all_steam_library_paths()
logger.info(f"PathHandler found Steam libraries: {library_paths}")
# Convert to steamapps/common paths for Proton scanning
for lib_path in library_paths:
common_path = lib_path / "steamapps" / "common"
if common_path.exists():
steam_common_paths.append(common_path)
logger.debug(f"Added Steam library: {common_path}")
else:
logger.debug(f"Steam library path doesn't exist: {common_path}")
except Exception as e:
logger.error(f"PathHandler failed to read libraryfolders.vdf: {e}")
# Always add fallback paths in case PathHandler missed something
fallback_paths = [
Path.home() / ".steam/steam/steamapps/common",
Path.home() / ".local/share/Steam/steamapps/common",
Path.home() / ".steam/root/steamapps/common"
]
for fallback_path in fallback_paths:
if fallback_path.exists() and fallback_path not in steam_common_paths:
steam_common_paths.append(fallback_path)
logger.debug(f"Added fallback Steam library: {fallback_path}")
logger.info(f"Final Steam library paths for Proton scanning: {steam_common_paths}")
return steam_common_paths
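# Standalone sketch of the same detection idea without PathHandler: read Steam's
# libraryfolders.vdf and keep each library whose steamapps/common exists. The Steam
# root locations and the newer libraryfolders.vdf layout are assumptions about a
# typical install, not values taken from this changeset.
import vdf
from pathlib import Path
from typing import List

def steam_common_dirs() -> List[Path]:
    found = []
    for steam_root in (Path.home() / ".steam/steam", Path.home() / ".local/share/Steam"):
        library_file = steam_root / "steamapps/libraryfolders.vdf"
        if not library_file.exists():
            continue
        with library_file.open(encoding="utf-8") as fh:
            data = vdf.load(fh)
        for entry in data.get("libraryfolders", {}).values():
            if isinstance(entry, dict) and "path" in entry:
                common = Path(entry["path"]) / "steamapps/common"
                if common.exists() and common not in found:
                    found.append(common)
    return found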
@staticmethod
def get_compatibility_tool_paths() -> List[Path]:
"""
Get all compatibility tool paths for GE-Proton and other custom Proton versions.
Returns:
List of Path objects for compatibility tool directories
"""
compat_paths = [
Path.home() / ".steam/steam/compatibilitytools.d",
Path.home() / ".local/share/Steam/compatibilitytools.d",
Path.home() / ".steam/root/compatibilitytools.d",
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d",
# Flatpak GE-Proton extension paths
Path.home() / ".var/app/com.valvesoftware.Steam.CompatibilityTool.Proton-GE/.local/share/Steam/compatibilitytools.d",
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d/GE-Proton"
]
# Return only existing paths
return [path for path in compat_paths if path.exists()]
@staticmethod
def scan_ge_proton_versions() -> List[Dict[str, Any]]:
"""
Scan for available GE-Proton versions in compatibilitytools.d directories.
Returns:
List of dicts with version info, sorted by priority (newest first)
"""
logger.info("Scanning for available GE-Proton versions...")
found_versions = []
compat_paths = WineUtils.get_compatibility_tool_paths()
if not compat_paths:
logger.warning("No compatibility tool paths found")
return []
for compat_path in compat_paths:
logger.debug(f"Scanning compatibility tools: {compat_path}")
try:
# Look for GE-Proton directories
for proton_dir in compat_path.iterdir():
if not proton_dir.is_dir():
continue
dir_name = proton_dir.name
if not dir_name.startswith("GE-Proton"):
continue
# Check for wine binary
wine_bin = proton_dir / "files" / "bin" / "wine"
if not wine_bin.exists() or not wine_bin.is_file():
logger.debug(f"Skipping {dir_name} - no wine binary found")
continue
# Parse version from directory name (e.g., "GE-Proton10-16")
version_match = re.match(r'GE-Proton(\d+)-(\d+)', dir_name)
if version_match:
major_ver = int(version_match.group(1))
minor_ver = int(version_match.group(2))
# Calculate priority: GE-Proton gets highest priority
# Priority format: 200 (base) + major*10 + minor (e.g., 200 + 100 + 16 = 316)
priority = 200 + (major_ver * 10) + minor_ver
found_versions.append({
'name': dir_name,
'path': proton_dir,
'wine_bin': wine_bin,
'priority': priority,
'major_version': major_ver,
'minor_version': minor_ver,
'type': 'GE-Proton'
})
logger.debug(f"Found {dir_name} at {proton_dir} (priority: {priority})")
else:
logger.debug(f"Skipping {dir_name} - unknown GE-Proton version format")
except Exception as e:
logger.warning(f"Error scanning {compat_path}: {e}")
# Sort by priority (highest first, so newest GE-Proton versions come first)
found_versions.sort(key=lambda x: x['priority'], reverse=True)
logger.info(f"Found {len(found_versions)} GE-Proton version(s)")
return found_versions
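# Worked example of the directory-name parsing and priority formula used above,
# runnable standalone:
import re

for name in ("GE-Proton10-16", "GE-Proton10-1"):
    m = re.match(r'GE-Proton(\d+)-(\d+)', name)
    major, minor = int(m.group(1)), int(m.group(2))
    print(f"{name}: priority {200 + major * 10 + minor}")
# GE-Proton10-16: priority 316
# GE-Proton10-1:  priority 311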
@staticmethod
def scan_valve_proton_versions() -> List[Dict[str, Any]]:
"""
Scan for available Valve Proton versions with fallback priority.
Returns:
List of dicts with version info, sorted by priority (best first)
"""
logger.info("Scanning for available Valve Proton versions...")
found_versions = []
steam_libs = WineUtils.get_steam_library_paths()
if not steam_libs:
logger.warning("No Steam library paths found")
return []
# Priority order for Valve Proton versions
# Note: GE-Proton uses 200+ range, so Valve Proton gets 100+ range
preferred_versions = [
("Proton - Experimental", 150), # Higher priority than regular Valve Proton
("Proton 10.0", 140),
("Proton 9.0", 130),
("Proton 9.0 (Beta)", 125)
]
for steam_path in steam_libs:
logger.debug(f"Scanning Steam library: {steam_path}")
for version_name, priority in preferred_versions:
proton_path = steam_path / version_name
wine_bin = proton_path / "files" / "bin" / "wine"
if wine_bin.exists() and wine_bin.is_file():
found_versions.append({
'name': version_name,
'path': proton_path,
'wine_bin': wine_bin,
'priority': priority,
'type': 'Valve-Proton'
})
logger.debug(f"Found {version_name} at {proton_path}")
# Sort by priority (highest first)
found_versions.sort(key=lambda x: x['priority'], reverse=True)
# Remove duplicates while preserving order
unique_versions = []
seen_names = set()
for version in found_versions:
if version['name'] not in seen_names:
unique_versions.append(version)
seen_names.add(version['name'])
logger.info(f"Found {len(unique_versions)} unique Valve Proton version(s)")
return unique_versions
@staticmethod
def scan_all_proton_versions() -> List[Dict[str, Any]]:
"""
Scan for all available Proton versions (GE-Proton + Valve Proton) with unified priority.
Priority Chain (highest to lowest):
1. GE-Proton10-16+ (priority 316+)
2. GE-Proton10-* (priority 200+)
3. Proton - Experimental (priority 150)
4. Proton 10.0 (priority 140)
5. Proton 9.0 (priority 130)
6. Proton 9.0 (Beta) (priority 125)
Returns:
List of dicts with version info, sorted by priority (best first)
"""
logger.info("Scanning for all available Proton versions...")
all_versions = []
# Scan GE-Proton versions (highest priority)
ge_versions = WineUtils.scan_ge_proton_versions()
all_versions.extend(ge_versions)
# Scan Valve Proton versions
valve_versions = WineUtils.scan_valve_proton_versions()
all_versions.extend(valve_versions)
# Sort by priority (highest first)
all_versions.sort(key=lambda x: x['priority'], reverse=True)
# Remove duplicates while preserving order
unique_versions = []
seen_names = set()
for version in all_versions:
if version['name'] not in seen_names:
unique_versions.append(version)
seen_names.add(version['name'])
if unique_versions:
logger.debug(f"Found {len(unique_versions)} total Proton version(s)")
logger.debug(f"Best available: {unique_versions[0]['name']} ({unique_versions[0]['type']})")
else:
logger.warning("No Proton versions found")
return unique_versions
@staticmethod
def select_best_proton() -> Optional[Dict[str, Any]]:
"""
Select the best available Proton version (GE-Proton or Valve Proton) using unified precedence.
Returns:
Dict with version info for the best Proton, or None if none found
"""
available_versions = WineUtils.scan_all_proton_versions()
if not available_versions:
logger.warning("No compatible Proton versions found")
return None
# Return the highest priority version (first in sorted list)
best_version = available_versions[0]
logger.info(f"Selected best Proton version: {best_version['name']} ({best_version['type']})")
return best_version
@staticmethod
def select_best_valve_proton() -> Optional[Dict[str, Any]]:
"""
Select the best available Valve Proton version using fallback precedence.
Note: This method is kept for backward compatibility. Consider using select_best_proton() instead.
Returns:
Dict with version info for the best Proton, or None if none found
"""
available_versions = WineUtils.scan_valve_proton_versions()
if not available_versions:
logger.warning("No compatible Valve Proton versions found")
return None
# Return the highest priority version (first in sorted list)
best_version = available_versions[0]
logger.info(f"Selected Valve Proton version: {best_version['name']}")
return best_version
@staticmethod
def check_proton_requirements() -> Tuple[bool, str, Optional[Dict[str, Any]]]:
"""
Check if compatible Proton version is available for workflows.
Returns:
tuple: (requirements_met, status_message, proton_info)
- requirements_met: True if compatible Proton found
- status_message: Human-readable status for display to user
- proton_info: Dict with Proton details if found, None otherwise
"""
logger.info("Checking Proton requirements for workflow...")
# Scan for available Proton versions (includes GE-Proton + Valve Proton)
best_proton = WineUtils.select_best_proton()
if best_proton:
# Compatible Proton found
proton_type = best_proton.get('type', 'Unknown')
status_msg = f"✓ Using {best_proton['name']} ({proton_type}) for this workflow"
logger.info(f"Proton requirements satisfied: {best_proton['name']} ({proton_type})")
return True, status_msg, best_proton
else:
# No compatible Proton found
status_msg = "✗ No compatible Proton version found (GE-Proton 10+, Proton 9+, 10, or Experimental required)"
logger.warning("Proton requirements not met - no compatible version found")
return False, status_msg, None
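# A minimal usage sketch of the requirement check above; WineUtils is passed in rather
# than imported because this module's import path is not shown in the hunk.
def ensure_proton_or_abort(wine_utils_cls):
    ok, status, proton = wine_utils_cls.check_proton_requirements()
    print(status)    # e.g. "✓ Using GE-Proton10-16 (GE-Proton) for this workflow"
    if not ok:
        raise RuntimeError("Install GE-Proton 10+ or Proton 9/10/Experimental, then retry.")
    return proton    # dict with 'name', 'path', 'wine_bin', 'priority', 'type'

# ensure_proton_or_abort(WineUtils)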

File diff suppressed because it is too large

View File

@@ -68,7 +68,9 @@ class SystemInfo:
steam_root: Optional[Path] = None
steam_user_id: Optional[str] = None
proton_version: Optional[str] = None
is_flatpak_steam: bool = False
is_native_steam: bool = False
def to_dict(self) -> Dict[str, Any]:
"""Convert to dictionary."""
return {
@@ -76,4 +78,6 @@ class SystemInfo:
'steam_root': str(self.steam_root) if self.steam_root else None,
'steam_user_id': self.steam_user_id,
'proton_version': self.proton_version,
'is_flatpak_steam': self.is_flatpak_steam,
'is_native_steam': self.is_native_steam,
}

View File

@@ -23,6 +23,7 @@ class ModlistContext:
mo2_exe_path: Optional[Path] = None
skip_confirmation: bool = False
engine_installed: bool = False # True if installed via jackify-engine
enb_detected: bool = False # True if ENB was detected during configuration
def __post_init__(self):
"""Convert string paths to Path objects."""

View File

@@ -0,0 +1,216 @@
"""
Data models for modlist metadata from jackify-engine JSON output.
These models match the JSON schema documented in MODLIST_METADATA_IMPLEMENTATION.md
"""
from dataclasses import dataclass, field
from typing import List, Optional
from datetime import datetime
@dataclass
class ModlistImages:
"""Image URLs for modlist (small thumbnail and large banner)"""
small: str
large: str
@dataclass
class ModlistLinks:
"""External links associated with the modlist"""
image: Optional[str] = None
readme: Optional[str] = None
download: Optional[str] = None
discordURL: Optional[str] = None
websiteURL: Optional[str] = None
@dataclass
class ModlistSizes:
"""Size information for modlist downloads and installation"""
downloadSize: int
downloadSizeFormatted: str
installSize: int
installSizeFormatted: str
totalSize: int
totalSizeFormatted: str
numberOfArchives: int
numberOfInstalledFiles: int
@dataclass
class ModlistValidation:
"""Validation status from Wabbajack build server (optional)"""
failed: int = 0
passed: int = 0
updating: int = 0
mirrored: int = 0
modListIsMissing: bool = False
hasFailures: bool = False
@dataclass
class ModlistMetadata:
"""Complete modlist metadata from jackify-engine"""
# Basic information
title: str
description: str
author: str
maintainers: List[str]
namespacedName: str
repositoryName: str
machineURL: str
# Game information
game: str
gameHumanFriendly: str
# Status flags
official: bool
nsfw: bool
utilityList: bool
forceDown: bool
imageContainsTitle: bool
# Version information
version: Optional[str] = None
displayVersionOnlyInInstallerView: bool = False
# Dates
dateCreated: Optional[str] = None # ISO8601 format
dateUpdated: Optional[str] = None # ISO8601 format
# Categorization
tags: List[str] = field(default_factory=list)
# Nested objects
links: Optional[ModlistLinks] = None
sizes: Optional[ModlistSizes] = None
images: Optional[ModlistImages] = None
# Optional data (only if flags specified)
validation: Optional[ModlistValidation] = None
mods: List[str] = field(default_factory=list)
def is_available(self) -> bool:
"""Check if modlist is available for installation"""
if self.forceDown:
return False
if self.validation and self.validation.hasFailures:
return False
return True
def is_broken(self) -> bool:
"""Check if modlist has validation failures"""
return self.validation.hasFailures if self.validation else False
def get_date_updated_datetime(self) -> Optional[datetime]:
"""Parse dateUpdated string to datetime object"""
if not self.dateUpdated:
return None
try:
return datetime.fromisoformat(self.dateUpdated.replace('Z', '+00:00'))
except (ValueError, AttributeError):
return None
def get_date_created_datetime(self) -> Optional[datetime]:
"""Parse dateCreated string to datetime object"""
if not self.dateCreated:
return None
try:
return datetime.fromisoformat(self.dateCreated.replace('Z', '+00:00'))
except (ValueError, AttributeError):
return None
@dataclass
class ModlistMetadataResponse:
"""Root response object from jackify-engine list-modlists --json"""
metadataVersion: str
timestamp: str # ISO8601 format
count: int
modlists: List[ModlistMetadata]
def get_timestamp_datetime(self) -> Optional[datetime]:
"""Parse timestamp string to datetime object"""
try:
return datetime.fromisoformat(self.timestamp.replace('Z', '+00:00'))
except (ValueError, AttributeError):
return None
def filter_by_game(self, game: str) -> List[ModlistMetadata]:
"""Filter modlists by game name"""
return [m for m in self.modlists if m.game.lower() == game.lower()]
def filter_available_only(self) -> List[ModlistMetadata]:
"""Filter to only available (non-broken, non-forced-down) modlists"""
return [m for m in self.modlists if m.is_available()]
def filter_by_tag(self, tag: str) -> List[ModlistMetadata]:
"""Filter modlists by tag"""
return [m for m in self.modlists if tag.lower() in [t.lower() for t in m.tags]]
def filter_official_only(self) -> List[ModlistMetadata]:
"""Filter to only official modlists"""
return [m for m in self.modlists if m.official]
def search(self, query: str) -> List[ModlistMetadata]:
"""Search modlists by title, description, or author"""
query_lower = query.lower()
return [
m for m in self.modlists
if query_lower in m.title.lower()
or query_lower in m.description.lower()
or query_lower in m.author.lower()
]
def parse_modlist_metadata_from_dict(data: dict) -> ModlistMetadata:
"""Parse a modlist metadata dictionary into ModlistMetadata object"""
# Parse nested objects
images = ModlistImages(**data['images']) if 'images' in data and data['images'] else None
links = ModlistLinks(**data['links']) if 'links' in data and data['links'] else None
sizes = ModlistSizes(**data['sizes']) if 'sizes' in data and data['sizes'] else None
validation = ModlistValidation(**data['validation']) if 'validation' in data and data['validation'] else None
# Create ModlistMetadata with nested objects
metadata = ModlistMetadata(
title=data['title'],
description=data['description'],
author=data['author'],
maintainers=data.get('maintainers', []),
namespacedName=data['namespacedName'],
repositoryName=data['repositoryName'],
machineURL=data['machineURL'],
game=data['game'],
gameHumanFriendly=data['gameHumanFriendly'],
official=data['official'],
nsfw=data['nsfw'],
utilityList=data['utilityList'],
forceDown=data['forceDown'],
imageContainsTitle=data['imageContainsTitle'],
version=data.get('version'),
displayVersionOnlyInInstallerView=data.get('displayVersionOnlyInInstallerView', False),
dateCreated=data.get('dateCreated'),
dateUpdated=data.get('dateUpdated'),
tags=data.get('tags', []),
links=links,
sizes=sizes,
images=images,
validation=validation,
mods=data.get('mods', [])
)
return metadata
def parse_modlist_metadata_response(data: dict) -> ModlistMetadataResponse:
"""Parse the full JSON response from jackify-engine into ModlistMetadataResponse"""
modlists = [parse_modlist_metadata_from_dict(m) for m in data.get('modlists', [])]
return ModlistMetadataResponse(
metadataVersion=data['metadataVersion'],
timestamp=data['timestamp'],
count=data['count'],
modlists=modlists
)
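# Worked example of parsing a minimal engine response with the helpers above. The
# import path is the one used elsewhere in this changeset; the JSON values are made up
# for illustration.
from jackify.backend.models.modlist_metadata import parse_modlist_metadata_response

sample = {
    "metadataVersion": "1.0",
    "timestamp": "2026-01-15T12:00:00Z",
    "count": 1,
    "modlists": [{
        "title": "Example List", "description": "Demo entry", "author": "Someone",
        "namespacedName": "someone/example", "repositoryName": "someone",
        "machineURL": "example", "game": "skyrimspecialedition",
        "gameHumanFriendly": "Skyrim Special Edition",
        "official": False, "nsfw": False, "utilityList": False,
        "forceDown": False, "imageContainsTitle": False,
        "tags": ["Graphics"],
    }],
}
response = parse_modlist_metadata_response(sample)
print(response.count, response.modlists[0].is_available())              # 1 True
print([m.title for m in response.filter_by_game("skyrimspecialedition")])  # ['Example List']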

File diff suppressed because it is too large

View File

@@ -0,0 +1,455 @@
"""
Service for fetching and managing modlist metadata for the gallery view.
Handles jackify-engine integration, caching, and image management.
"""
import json
import subprocess
import time
import threading
from pathlib import Path
from typing import Optional, List, Dict
from datetime import datetime, timedelta
import urllib.request
from jackify.backend.models.modlist_metadata import (
ModlistMetadataResponse,
ModlistMetadata,
parse_modlist_metadata_response
)
from jackify.backend.core.modlist_operations import get_jackify_engine_path
from jackify.backend.handlers.config_handler import ConfigHandler
from jackify.shared.paths import get_jackify_data_dir
class ModlistGalleryService:
"""Service for fetching and caching modlist metadata from jackify-engine"""
# REMOVED: CACHE_VALIDITY_DAYS - metadata is now always fetched fresh from engine
# Images are still cached indefinitely (managed separately)
# CRITICAL: Thread lock to prevent concurrent engine calls that could cause recursive spawning
_engine_call_lock = threading.Lock()
def __init__(self):
"""Initialize the gallery service"""
self.config_handler = ConfigHandler()
# Cache directories in Jackify Data Directory
jackify_data_dir = get_jackify_data_dir()
self.CACHE_DIR = jackify_data_dir / "modlist-cache" / "metadata"
self.IMAGE_CACHE_DIR = jackify_data_dir / "modlist-cache" / "images"
self.METADATA_CACHE_FILE = self.CACHE_DIR / "modlist_metadata.json"
self._ensure_cache_dirs()
# Tag metadata caches (avoid refetching per render)
self._tag_mappings_cache: Optional[Dict[str, str]] = None
self._tag_mapping_lookup: Optional[Dict[str, str]] = None
self._allowed_tags_cache: Optional[set] = None
self._allowed_tags_lookup: Optional[Dict[str, str]] = None
def _ensure_cache_dirs(self):
"""Create cache directories if they don't exist"""
self.CACHE_DIR.mkdir(parents=True, exist_ok=True)
self.IMAGE_CACHE_DIR.mkdir(parents=True, exist_ok=True)
def fetch_modlist_metadata(
self,
include_validation: bool = True,
include_search_index: bool = False,
sort_by: str = "title",
force_refresh: bool = False
) -> Optional[ModlistMetadataResponse]:
"""
Fetch modlist metadata from jackify-engine.
NOTE: Metadata is ALWAYS fetched fresh from the engine to ensure up-to-date
version numbers and sizes for frequently-updated modlists. Only images are cached.
Args:
include_validation: Include validation status (slower)
include_search_index: Include mod search index (slower)
sort_by: Sort order (title, size, date)
force_refresh: Deprecated parameter (kept for API compatibility)
Returns:
ModlistMetadataResponse or None if fetch fails
"""
# Always fetch fresh data from jackify-engine
# The engine itself is fast (~1-2 seconds) and always gets latest metadata
try:
metadata = self._fetch_from_engine(
include_validation=include_validation,
include_search_index=include_search_index,
sort_by=sort_by
)
# Still save to cache as a fallback for offline scenarios
if metadata:
self._save_to_cache(metadata)
return metadata
except Exception as e:
print(f"Error fetching modlist metadata: {e}")
print("Falling back to cached metadata (may be outdated)")
# Fall back to cache if network/engine fails
return self._load_from_cache()
def _fetch_from_engine(
self,
include_validation: bool,
include_search_index: bool,
sort_by: str
) -> Optional[ModlistMetadataResponse]:
"""Call jackify-engine to fetch modlist metadata"""
# CRITICAL: Use thread lock to prevent concurrent engine calls
# Multiple simultaneous calls could cause recursive spawning issues
with self._engine_call_lock:
# CRITICAL: Get engine path BEFORE cleaning environment
# get_jackify_engine_path() may need APPDIR to locate the engine
engine_path = get_jackify_engine_path()
if not engine_path:
raise FileNotFoundError("jackify-engine not found")
# Build command
cmd = [str(engine_path), "list-modlists", "--json", "--sort-by", sort_by]
if include_validation:
cmd.append("--include-validation-status")
if include_search_index:
cmd.append("--include-search-index")
# Execute command
# CRITICAL: Use centralized clean environment to prevent AppImage recursive spawning
# This must happen AFTER engine path resolution
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
clean_env = get_clean_subprocess_env()
result = subprocess.run(
cmd,
capture_output=True,
text=True,
timeout=300, # 5 minute timeout for large data
env=clean_env
)
if result.returncode != 0:
raise RuntimeError(f"jackify-engine failed: {result.stderr}")
# Parse JSON response - skip progress messages and extract JSON
# jackify-engine prints progress to stdout before the JSON
stdout = result.stdout.strip()
# Find the start of JSON (first '{' on its own line)
lines = stdout.split('\n')
json_start = 0
for i, line in enumerate(lines):
if line.strip().startswith('{'):
json_start = i
break
json_text = '\n'.join(lines[json_start:])
data = json.loads(json_text)
return parse_modlist_metadata_response(data)
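# Standalone sketch of the stdout handling above: the engine may print progress lines
# before the JSON document, so parsing starts at the first line that begins with '{'.
# The sample output below is made up for illustration.
import json

stdout = "Resolving modlists...\nFetched 42 entries\n{\n  \"count\": 42,\n  \"modlists\": []\n}"
lines = stdout.strip().split('\n')
json_start = next((i for i, line in enumerate(lines) if line.strip().startswith('{')), 0)
data = json.loads('\n'.join(lines[json_start:]))
print(data["count"])   # 42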
def _load_from_cache(self) -> Optional[ModlistMetadataResponse]:
"""Load metadata from cache file"""
if not self.METADATA_CACHE_FILE.exists():
return None
try:
with open(self.METADATA_CACHE_FILE, 'r', encoding='utf-8') as f:
data = json.load(f)
return parse_modlist_metadata_response(data)
except Exception as e:
print(f"Error loading cache: {e}")
return None
def _save_to_cache(self, metadata: ModlistMetadataResponse):
"""Save metadata to cache file"""
try:
# Convert to dict for JSON serialization
data = {
'metadataVersion': metadata.metadataVersion,
'timestamp': metadata.timestamp,
'count': metadata.count,
'modlists': [self._metadata_to_dict(m) for m in metadata.modlists]
}
with open(self.METADATA_CACHE_FILE, 'w', encoding='utf-8') as f:
json.dump(data, f, indent=2)
except Exception as e:
print(f"Error saving cache: {e}")
def _metadata_to_dict(self, metadata: ModlistMetadata) -> dict:
"""Convert ModlistMetadata to dict for JSON serialization"""
result = {
'title': metadata.title,
'description': metadata.description,
'author': metadata.author,
'maintainers': metadata.maintainers,
'namespacedName': metadata.namespacedName,
'repositoryName': metadata.repositoryName,
'machineURL': metadata.machineURL,
'game': metadata.game,
'gameHumanFriendly': metadata.gameHumanFriendly,
'official': metadata.official,
'nsfw': metadata.nsfw,
'utilityList': metadata.utilityList,
'forceDown': metadata.forceDown,
'imageContainsTitle': metadata.imageContainsTitle,
'version': metadata.version,
'displayVersionOnlyInInstallerView': metadata.displayVersionOnlyInInstallerView,
'dateCreated': metadata.dateCreated,
'dateUpdated': metadata.dateUpdated,
'tags': metadata.tags,
'mods': metadata.mods
}
if metadata.images:
result['images'] = {
'small': metadata.images.small,
'large': metadata.images.large
}
if metadata.links:
result['links'] = {
'image': metadata.links.image,
'readme': metadata.links.readme,
'download': metadata.links.download,
'discordURL': metadata.links.discordURL,
'websiteURL': metadata.links.websiteURL
}
if metadata.sizes:
result['sizes'] = {
'downloadSize': metadata.sizes.downloadSize,
'downloadSizeFormatted': metadata.sizes.downloadSizeFormatted,
'installSize': metadata.sizes.installSize,
'installSizeFormatted': metadata.sizes.installSizeFormatted,
'totalSize': metadata.sizes.totalSize,
'totalSizeFormatted': metadata.sizes.totalSizeFormatted,
'numberOfArchives': metadata.sizes.numberOfArchives,
'numberOfInstalledFiles': metadata.sizes.numberOfInstalledFiles
}
if metadata.validation:
result['validation'] = {
'failed': metadata.validation.failed,
'passed': metadata.validation.passed,
'updating': metadata.validation.updating,
'mirrored': metadata.validation.mirrored,
'modListIsMissing': metadata.validation.modListIsMissing,
'hasFailures': metadata.validation.hasFailures
}
return result
def download_images(
self,
game_filter: Optional[str] = None,
size: str = "both",
overwrite: bool = False
) -> bool:
"""
Download modlist images to cache using jackify-engine.
Args:
game_filter: Filter by game name (None = all games)
size: Image size to download (small, large, both)
overwrite: Overwrite existing images
Returns:
True if successful, False otherwise
"""
# Build command (engine path will be resolved inside lock)
cmd = [
"placeholder", # Will be replaced with actual engine path
"download-modlist-images",
"--output", str(self.IMAGE_CACHE_DIR),
"--size", size
]
if game_filter:
cmd.extend(["--game", game_filter])
if overwrite:
cmd.append("--overwrite")
# Execute command
try:
# CRITICAL: Use thread lock to prevent concurrent engine calls
with self._engine_call_lock:
# CRITICAL: Get engine path BEFORE cleaning environment
# get_jackify_engine_path() may need APPDIR to locate the engine
engine_path = get_jackify_engine_path()
if not engine_path:
return False
# Update cmd with resolved engine path
cmd[0] = str(engine_path)
# CRITICAL: Use centralized clean environment to prevent AppImage recursive spawning
# This must happen AFTER engine path resolution
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
clean_env = get_clean_subprocess_env()
result = subprocess.run(
cmd,
capture_output=True,
text=True,
timeout=3600, # 1 hour timeout for downloads
env=clean_env
)
return result.returncode == 0
except Exception as e:
print(f"Error downloading images: {e}")
return False
def get_cached_image_path(self, metadata: ModlistMetadata, size: str = "large") -> Optional[Path]:
"""
Get path to cached image for a modlist (only if it exists).
Args:
metadata: Modlist metadata
size: Image size (small or large)
Returns:
Path to cached image or None if not cached
"""
filename = f"{metadata.machineURL}_{size}.webp"
image_path = self.IMAGE_CACHE_DIR / metadata.repositoryName / filename
if image_path.exists():
return image_path
return None
def get_image_cache_path(self, metadata: ModlistMetadata, size: str = "large") -> Path:
"""
Get path where image should be cached (always returns path, even if file doesn't exist).
Args:
metadata: Modlist metadata
size: Image size (small or large)
Returns:
Path where image should be cached
"""
filename = f"{metadata.machineURL}_{size}.webp"
return self.IMAGE_CACHE_DIR / metadata.repositoryName / filename
def get_image_url(self, metadata: ModlistMetadata, size: str = "large") -> Optional[str]:
"""
Get image URL for a modlist.
Args:
metadata: Modlist metadata
size: Image size (small or large)
Returns:
Image URL or None if images not available
"""
if not metadata.images:
return None
return metadata.images.large if size == "large" else metadata.images.small
def clear_cache(self):
"""Clear all cached metadata and images"""
if self.METADATA_CACHE_FILE.exists():
self.METADATA_CACHE_FILE.unlink()
# Clear image cache
if self.IMAGE_CACHE_DIR.exists():
import shutil
shutil.rmtree(self.IMAGE_CACHE_DIR)
self.IMAGE_CACHE_DIR.mkdir(parents=True, exist_ok=True)
def get_installed_modlists(self) -> List[str]:
"""
Get list of installed modlist machine URLs.
Returns:
List of machine URLs for installed modlists
"""
# TODO: Integrate with existing modlist database/config
# For now, return empty list - will be implemented when integrated with existing modlist tracking
return []
def is_modlist_installed(self, machine_url: str) -> bool:
"""Check if a modlist is installed"""
return machine_url in self.get_installed_modlists()
def load_tag_mappings(self) -> Dict[str, str]:
"""
Load tag mappings from Wabbajack GitHub repository.
Maps variant tag names to canonical tag names.
Returns:
Dictionary mapping variant tags to canonical tags
"""
url = "https://raw.githubusercontent.com/wabbajack-tools/mod-lists/master/tag_mappings.json"
try:
with urllib.request.urlopen(url, timeout=10) as response:
data = json.loads(response.read().decode('utf-8'))
return data
except Exception as e:
print(f"Warning: Could not load tag mappings: {e}")
return {}
def load_allowed_tags(self) -> set:
"""
Load allowed tags from Wabbajack GitHub repository.
Returns:
Set of allowed tag names (preserving original case)
"""
url = "https://raw.githubusercontent.com/wabbajack-tools/mod-lists/master/allowed_tags.json"
try:
with urllib.request.urlopen(url, timeout=10) as response:
data = json.loads(response.read().decode('utf-8'))
return set(data) # Return as set preserving original case
except Exception as e:
print(f"Warning: Could not load allowed tags: {e}")
return set()
def _ensure_tag_metadata(self):
"""Ensure tag mappings/allowed tags (and lookups) are cached."""
if self._tag_mappings_cache is None:
self._tag_mappings_cache = self.load_tag_mappings()
if self._tag_mapping_lookup is None:
self._tag_mapping_lookup = {k.lower(): v for k, v in self._tag_mappings_cache.items()}
if self._allowed_tags_cache is None:
self._allowed_tags_cache = self.load_allowed_tags()
if self._allowed_tags_lookup is None:
self._allowed_tags_lookup = {tag.lower(): tag for tag in self._allowed_tags_cache}
def normalize_tag_value(self, tag: str) -> str:
"""
Normalize a tag to its canonical display form using Wabbajack mappings.
Returns the normalized tag (original casing preserved when possible).
"""
if not tag:
return ""
self._ensure_tag_metadata()
tag_key = tag.strip().lower()
if not tag_key:
return ""
canonical = self._tag_mapping_lookup.get(tag_key, tag.strip())
# Prefer allowed tag casing if available
return self._allowed_tags_lookup.get(canonical.lower(), canonical)
def normalize_tags_for_display(self, tags: Optional[List[str]]) -> List[str]:
"""Normalize a list of tags for UI display (deduped, canonical casing)."""
if not tags:
return []
self._ensure_tag_metadata()
normalized = []
seen = set()
for tag in tags:
normalized_tag = self.normalize_tag_value(tag)
key = normalized_tag.lower()
if key and key not in seen:
normalized.append(normalized_tag)
seen.add(key)
return normalized
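# A minimal usage sketch of the service above for a gallery view; the service instance
# is passed in because this new module's import path is not shown in the diff view.
def build_gallery_entries(service):
    response = service.fetch_modlist_metadata(include_validation=True)
    if response is None:
        return []   # engine unavailable and no cached metadata on disk
    entries = []
    for m in response.filter_available_only():
        entries.append({
            "title": m.title,
            "game": m.gameHumanFriendly,
            "tags": service.normalize_tags_for_display(m.tags),
            "image": service.get_cached_image_path(m, size="small"),  # None until images are cached
        })
    return entries

# build_gallery_entries(ModlistGalleryService())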

View File

@@ -34,8 +34,10 @@ class ModlistService:
"""Lazy initialization of modlist handler."""
if self._modlist_handler is None:
from ..handlers.modlist_handler import ModlistHandler
# Initialize with proper dependencies
self._modlist_handler = ModlistHandler()
from ..services.platform_detection_service import PlatformDetectionService
# Initialize with proper dependencies and centralized Steam Deck detection
platform_service = PlatformDetectionService.get_instance()
self._modlist_handler = ModlistHandler(steamdeck=platform_service.is_steamdeck)
return self._modlist_handler
def _get_wabbajack_handler(self):
@@ -273,8 +275,17 @@ class ModlistService:
actual_download_path = Path(download_dir_context)
download_dir_str = str(actual_download_path)
api_key = context['nexus_api_key']
# CRITICAL: Re-check authentication right before launching engine
# This ensures we use current auth state, not stale cached values from context
# (e.g., if user revoked OAuth after context was created)
from ..services.nexus_auth_service import NexusAuthService
auth_service = NexusAuthService()
current_api_key, current_oauth_info = auth_service.get_auth_for_engine()
# Use current auth state, fallback to context values only if current check failed
api_key = current_api_key or context.get('nexus_api_key')
oauth_info = current_oauth_info or context.get('nexus_oauth_info')
# Path to the engine binary (copied from working code)
engine_path = get_jackify_engine_path()
engine_dir = os.path.dirname(engine_path)
@@ -283,8 +294,9 @@ class ModlistService:
output_callback(f"Jackify Install Engine not found or not executable at: {engine_path}")
return False
# Build command (copied from working code)
cmd = [engine_path, 'install']
# Build command (copied from working code)
cmd = [engine_path, 'install', '--show-file-progress']
modlist_value = context.get('modlist_value')
if modlist_value and modlist_value.endswith('.wabbajack') and os.path.isfile(modlist_value):
cmd += ['-w', modlist_value]
@@ -293,30 +305,36 @@ class ModlistService:
elif context.get('machineid'):
cmd += ['-m', context['machineid']]
cmd += ['-o', install_dir_str, '-d', download_dir_str]
# Check for debug mode and add --debug flag
from ..handlers.config_handler import ConfigHandler
config_handler = ConfigHandler()
debug_mode = config_handler.get('debug_mode', False)
if debug_mode:
cmd.append('--debug')
logger.debug("DEBUG: Added --debug flag to jackify-engine command")
# NOTE: API key is passed via environment variable only, not as command line argument
# Store original environment values (copied from working code)
original_env_values = {
'NEXUS_API_KEY': os.environ.get('NEXUS_API_KEY'),
'NEXUS_OAUTH_INFO': os.environ.get('NEXUS_OAUTH_INFO'),
'DOTNET_SYSTEM_GLOBALIZATION_INVARIANT': os.environ.get('DOTNET_SYSTEM_GLOBALIZATION_INVARIANT')
}
try:
# Environment setup (copied from working code)
if api_key:
# Environment setup - prefer NEXUS_OAUTH_INFO (supports auto-refresh) over NEXUS_API_KEY
if oauth_info:
os.environ['NEXUS_OAUTH_INFO'] = oauth_info
# CRITICAL: Set client_id so engine can refresh tokens with correct client_id
# Engine's RefreshToken method reads this to use our "jackify" client_id instead of hardcoded "wabbajack"
from jackify.backend.services.nexus_oauth_service import NexusOAuthService
os.environ['NEXUS_OAUTH_CLIENT_ID'] = NexusOAuthService.CLIENT_ID
# Also set NEXUS_API_KEY for backward compatibility
if api_key:
os.environ['NEXUS_API_KEY'] = api_key
elif api_key:
os.environ['NEXUS_API_KEY'] = api_key
elif 'NEXUS_API_KEY' in os.environ:
del os.environ['NEXUS_API_KEY']
else:
# No auth available, clear any inherited values
if 'NEXUS_API_KEY' in os.environ:
del os.environ['NEXUS_API_KEY']
if 'NEXUS_OAUTH_INFO' in os.environ:
del os.environ['NEXUS_OAUTH_INFO']
os.environ['DOTNET_SYSTEM_GLOBALIZATION_INVARIANT'] = "1"
pretty_cmd = ' '.join([f'"{arg}"' if ' ' in arg else arg for arg in cmd])
@@ -332,8 +350,10 @@ class ModlistService:
else:
output_callback(f"File descriptor limit warning: {message}")
# Subprocess call (copied from working code)
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)
# Subprocess call with cleaned environment to prevent AppImage variable inheritance
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
clean_env = get_clean_subprocess_env()
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=clean_env, cwd=engine_dir)
# Output processing (copied from working code)
buffer = b''
@@ -541,18 +561,42 @@ class ModlistService:
success = modlist_menu.run_modlist_configuration_phase(config_context)
debug_callback(f"Configuration phase result: {success}")
# Restore stdout before calling completion callback
# Restore stdout before ENB detection and completion callback
if original_stdout:
sys.stdout = original_stdout
original_stdout = None
# Configure ENB for Linux compatibility (non-blocking)
# Do this BEFORE completion callback so we can pass detection status
enb_detected = False
try:
from ..handlers.enb_handler import ENBHandler
enb_handler = ENBHandler()
enb_success, enb_message, enb_detected = enb_handler.configure_enb_for_linux(context.install_dir)
if enb_message:
if enb_success:
logger.info(enb_message)
if progress_callback:
progress_callback(enb_message)
else:
logger.warning(enb_message)
# Non-blocking: continue workflow even if ENB config fails
except Exception as e:
logger.warning(f"ENB configuration skipped due to error: {e}")
# Continue workflow - ENB config is optional
# Store ENB detection status in context for GUI to use
context.enb_detected = enb_detected
if completion_callback:
if success:
debug_callback("Configuration completed successfully, calling completion callback")
completion_callback(True, "Configuration completed successfully!", context.name)
# Pass ENB detection status through callback
completion_callback(True, "Configuration completed successfully!", context.name, enb_detected)
else:
debug_callback("Configuration failed, calling completion callback with failure")
completion_callback(False, "Configuration failed", context.name)
completion_callback(False, "Configuration failed", context.name, False)
return success
@@ -584,7 +628,7 @@ class ModlistService:
except Exception as e:
logger.error(f"Failed to configure modlist {context.name}: {e}")
if completion_callback:
completion_callback(False, f"Configuration failed: {e}", context.name)
completion_callback(False, f"Configuration failed: {e}", context.name, False)
# Clean up GUI log handler on exception
if gui_log_handler:
@@ -637,8 +681,13 @@ class ModlistService:
'mo2_exe_path': str(context.install_dir / 'ModOrganizer.exe'),
'resolution': getattr(context, 'resolution', None),
'skip_confirmation': True, # Service layer should be non-interactive
'manual_steps_completed': False
'manual_steps_completed': False,
'appid': getattr(context, 'app_id', None) # Fix: Include appid like other configuration paths
}
# DEBUG: Log what resolution we're passing
logger.info(f"DEBUG: config_context resolution = {config_context['resolution']}")
logger.info(f"DEBUG: context.resolution = {getattr(context, 'resolution', 'NOT_SET')}")
# Run the complete configuration phase
success = modlist_menu.run_modlist_configuration_phase(config_context)
@@ -646,11 +695,11 @@ class ModlistService:
if success:
logger.info("Modlist configuration completed successfully")
if completion_callback:
completion_callback(True, "Configuration completed successfully", context.name)
completion_callback(True, "Configuration completed successfully", context.name, False)
else:
logger.warning("Modlist configuration had issues")
if completion_callback:
completion_callback(False, "Configuration failed", context.name)
completion_callback(False, "Configuration failed", context.name, False)
return success

View File

@@ -0,0 +1,191 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Native Steam Operations Service
This service provides direct Steam operations using VDF parsing and path discovery.
Replaces protontricks dependencies with native Steam functionality.
"""
import os
import logging
import vdf
from pathlib import Path
from typing import Dict, Optional, List
import subprocess
import shutil
logger = logging.getLogger(__name__)
class NativeSteamOperationsService:
"""
Service providing native Steam operations for shortcut discovery and prefix management.
Replaces protontricks functionality with:
- Direct VDF parsing for shortcut discovery
- Native compatdata path construction
- Direct Steam library detection
"""
def __init__(self, steamdeck: bool = False):
self.steamdeck = steamdeck
self.logger = logger
def list_non_steam_shortcuts(self) -> Dict[str, str]:
"""
List non-Steam shortcuts via direct VDF parsing.
Returns:
Dict mapping shortcut name to AppID string
"""
logger.info("Listing non-Steam shortcuts via native VDF parsing...")
shortcuts = {}
try:
# Find all possible shortcuts.vdf locations
shortcuts_paths = self._find_shortcuts_vdf_paths()
for shortcuts_path in shortcuts_paths:
logger.debug(f"Checking shortcuts.vdf at: {shortcuts_path}")
if not shortcuts_path.exists():
continue
try:
with open(shortcuts_path, 'rb') as f:
data = vdf.binary_load(f)
shortcuts_data = data.get('shortcuts', {})
for shortcut_key, shortcut_data in shortcuts_data.items():
if isinstance(shortcut_data, dict):
app_name = shortcut_data.get('AppName', '').strip()
app_id = shortcut_data.get('appid', '')
if app_name and app_id:
# Convert to positive AppID string (compatible format)
positive_appid = str(abs(int(app_id)))
shortcuts[app_name] = positive_appid
logger.debug(f"Found non-Steam shortcut: '{app_name}' with AppID {positive_appid}")
except Exception as e:
logger.warning(f"Error reading shortcuts.vdf at {shortcuts_path}: {e}")
continue
if not shortcuts:
logger.warning("No non-Steam shortcuts found in any shortcuts.vdf")
except Exception as e:
logger.error(f"Error listing non-Steam shortcuts: {e}")
return shortcuts
def set_steam_permissions(self, modlist_dir: str, steamdeck: bool = False) -> bool:
"""
Handle Steam access permissions for native operations.
Since we're using direct file access, no special permissions needed.
Args:
modlist_dir: Modlist directory path (for future use)
steamdeck: Steam Deck flag (for future use)
Returns:
Always True (no permissions needed for native operations)
"""
logger.debug("Using native Steam operations, no permission setting needed")
return True
def get_wine_prefix_path(self, appid: str) -> Optional[str]:
"""
Get WINEPREFIX path via direct compatdata discovery.
Args:
appid: Steam AppID string
Returns:
WINEPREFIX path string or None if not found
"""
logger.debug(f"Getting WINEPREFIX for AppID {appid} using native path discovery")
try:
# Find all possible compatdata locations
compatdata_paths = self._find_compatdata_paths()
for compatdata_base in compatdata_paths:
prefix_path = compatdata_base / appid / "pfx"
logger.debug(f"Checking prefix path: {prefix_path}")
if prefix_path.exists():
logger.debug(f"Found WINEPREFIX: {prefix_path}")
return str(prefix_path)
logger.error(f"WINEPREFIX not found for AppID {appid} in any compatdata location")
return None
except Exception as e:
logger.error(f"Error getting WINEPREFIX for AppID {appid}: {e}")
return None
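# A minimal usage sketch of the two lookups above; the shortcut name is hypothetical.
def find_modlist_prefix(service: "NativeSteamOperationsService", shortcut_name: str):
    shortcuts = service.list_non_steam_shortcuts()   # e.g. {'My Modlist': '1234567890'}
    appid = shortcuts.get(shortcut_name)
    if not appid:
        return None
    return service.get_wine_prefix_path(appid)       # .../steamapps/compatdata/<appid>/pfx or None

# find_modlist_prefix(NativeSteamOperationsService(steamdeck=False), "My Modlist")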
def _find_shortcuts_vdf_paths(self) -> List[Path]:
"""Find all possible shortcuts.vdf file locations"""
paths = []
# Standard Steam locations
steam_locations = [
Path.home() / ".steam/steam",
Path.home() / ".local/share/Steam",
# Flatpak Steam - direct data directory
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam",
# Flatpak Steam - symlinked home paths
Path.home() / ".var/app/com.valvesoftware.Steam/home/.steam/steam",
Path.home() / ".var/app/com.valvesoftware.Steam/home/.local/share/Steam"
]
for steam_root in steam_locations:
if not steam_root.exists():
continue
# Find userdata directories
userdata_path = steam_root / "userdata"
if userdata_path.exists():
for user_dir in userdata_path.iterdir():
if user_dir.is_dir() and user_dir.name.isdigit():
shortcuts_path = user_dir / "config" / "shortcuts.vdf"
paths.append(shortcuts_path)
return paths
def _find_compatdata_paths(self) -> List[Path]:
"""Find all possible compatdata directory locations"""
paths = []
# Standard compatdata locations
standard_locations = [
Path.home() / ".steam/steam/steamapps/compatdata",
Path.home() / ".local/share/Steam/steamapps/compatdata",
# Flatpak Steam - direct data directory
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/steamapps/compatdata",
# Flatpak Steam - symlinked home paths
Path.home() / ".var/app/com.valvesoftware.Steam/home/.steam/steam/steamapps/compatdata",
Path.home() / ".var/app/com.valvesoftware.Steam/home/.local/share/Steam/steamapps/compatdata"
]
for path in standard_locations:
if path.exists():
paths.append(path)
# Also check additional Steam libraries via libraryfolders.vdf
try:
from jackify.backend.handlers.path_handler import PathHandler
all_steam_libs = PathHandler.get_all_steam_library_paths()
for lib_path in all_steam_libs:
compatdata_path = lib_path / "steamapps" / "compatdata"
if compatdata_path.exists():
paths.append(compatdata_path)
except Exception as e:
logger.debug(f"Could not get additional Steam library paths: {e}")
return paths

View File

@@ -15,6 +15,8 @@ import vdf
from pathlib import Path
from typing import Optional, Tuple, Dict, Any, List
from ..handlers.vdf_handler import VDFHandler
logger = logging.getLogger(__name__)
class NativeSteamService:
@@ -28,37 +30,153 @@ class NativeSteamService:
"""
def __init__(self):
self.steam_path = Path.home() / ".steam" / "steam"
self.userdata_path = self.steam_path / "userdata"
self.steam_paths = [
Path.home() / ".steam" / "steam",
Path.home() / ".local" / "share" / "Steam",
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / "data" / "Steam",
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / ".local" / "share" / "Steam",
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / "home" / ".local" / "share" / "Steam"
]
self.steam_path = None
self.userdata_path = None
self.user_id = None
self.user_config_path = None
def find_steam_user(self) -> bool:
"""Find the active Steam user directory"""
"""
Find the active Steam user directory using Steam's own configuration files.
No more guessing - uses loginusers.vdf to get the most recent user and converts SteamID64 to SteamID3.
"""
try:
if not self.userdata_path.exists():
logger.error("Steam userdata directory not found")
# Step 1: Find Steam installation using Steam's own file structure
if not self._find_steam_installation():
logger.error("No Steam installation found")
return False
# Find the first user directory (usually there's only one)
user_dirs = [d for d in self.userdata_path.iterdir() if d.is_dir() and d.name.isdigit()]
if not user_dirs:
logger.error("No Steam user directories found")
# Step 2: Parse loginusers.vdf to get the most recent user (SteamID64)
steamid64 = self._get_most_recent_user_from_loginusers()
if not steamid64:
logger.warning("Could not determine most recent Steam user from loginusers.vdf, trying fallback method")
# Fallback: Look for existing user directories in userdata
steamid3 = self._find_user_from_userdata_directory()
if steamid3:
logger.info(f"Found Steam user using userdata directory fallback: SteamID3={steamid3}")
# Skip the conversion step since we already have SteamID3
self.user_id = str(steamid3)
self.user_config_path = self.userdata_path / str(steamid3) / "config"
logger.info(f"Steam user set up via fallback: {self.user_id}")
logger.info(f"User config path: {self.user_config_path}")
return True
else:
logger.error("Could not determine Steam user using any method")
return False
# Step 3: Convert SteamID64 to SteamID3 (userdata directory format)
steamid3 = self._convert_steamid64_to_steamid3(steamid64)
logger.info(f"Most recent Steam user: SteamID64={steamid64}, SteamID3={steamid3}")
# Step 4: Verify the userdata directory exists
user_dir = self.userdata_path / str(steamid3)
if not user_dir.exists():
logger.error(f"Userdata directory does not exist: {user_dir}")
return False
# Use the first user directory
user_dir = user_dirs[0]
self.user_id = user_dir.name
self.user_config_path = user_dir / "config"
logger.info(f"Found Steam user: {self.user_id}")
config_dir = user_dir / "config"
if not config_dir.exists():
logger.error(f"User config directory does not exist: {config_dir}")
return False
# Step 5: Set up the service state
self.user_id = str(steamid3)
self.user_config_path = config_dir
logger.info(f"VERIFIED Steam user: {self.user_id}")
logger.info(f"User config path: {self.user_config_path}")
logger.info(f"Shortcuts.vdf will be at: {self.user_config_path / 'shortcuts.vdf'}")
return True
except Exception as e:
logger.error(f"Error finding Steam user: {e}")
logger.error(f"Error finding Steam user: {e}", exc_info=True)
return False
def _find_steam_installation(self) -> bool:
"""Find Steam installation by checking for config/loginusers.vdf"""
for steam_path in self.steam_paths:
loginusers_path = steam_path / "config" / "loginusers.vdf"
userdata_path = steam_path / "userdata"
if loginusers_path.exists() and userdata_path.exists():
self.steam_path = steam_path
self.userdata_path = userdata_path
logger.info(f"Found Steam installation at: {steam_path}")
return True
return False
def _get_most_recent_user_from_loginusers(self) -> Optional[str]:
"""
Parse loginusers.vdf to get the SteamID64 of the most recent user.
Uses Steam's own MostRecent flag and Timestamp.
"""
try:
loginusers_path = self.steam_path / "config" / "loginusers.vdf"
# Load VDF data
vdf_data = VDFHandler.load(str(loginusers_path), binary=False)
if not vdf_data:
logger.error("Failed to parse loginusers.vdf")
return None
users_section = vdf_data.get("users", {})
if not users_section:
logger.error("No users section found in loginusers.vdf")
return None
most_recent_user = None
most_recent_timestamp = 0
# Find user with MostRecent=1 or highest timestamp
for steamid64, user_data in users_section.items():
if isinstance(user_data, dict):
# Check for MostRecent flag first
if user_data.get("MostRecent") == "1":
logger.info(f"Found user marked as MostRecent: {steamid64}")
return steamid64
# Also track highest timestamp as fallback
timestamp = int(user_data.get("Timestamp", "0"))
if timestamp > most_recent_timestamp:
most_recent_timestamp = timestamp
most_recent_user = steamid64
# Return user with highest timestamp if no MostRecent flag found
if most_recent_user:
logger.info(f"Found most recent user by timestamp: {most_recent_user}")
return most_recent_user
logger.error("No valid users found in loginusers.vdf")
return None
except Exception as e:
logger.error(f"Error parsing loginusers.vdf: {e}")
return None
def _convert_steamid64_to_steamid3(self, steamid64: str) -> int:
"""
Convert SteamID64 to SteamID3 (used in userdata directory names).
Formula: SteamID3 = SteamID64 - 76561197960265728
"""
try:
steamid64_int = int(steamid64)
steamid3 = steamid64_int - 76561197960265728
logger.debug(f"Converted SteamID64 {steamid64} to SteamID3 {steamid3}")
return steamid3
except ValueError as e:
logger.error(f"Invalid SteamID64 format: {steamid64}")
raise
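A quick worked example of the offset formula above (the SteamID64 value is made up for illustration):

# SteamID3 = SteamID64 - 76561197960265728
steamid64 = 76561198000000001                   # hypothetical account
steamid3 = steamid64 - 76561197960265728
assert steamid3 == 39734273                     # userdata folder would be .../userdata/39734273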
def get_shortcuts_vdf_path(self) -> Optional[Path]:
"""Get the path to shortcuts.vdf"""
if not self.user_config_path:
@@ -241,16 +359,22 @@ class NativeSteamService:
def set_proton_version(self, app_id: int, proton_version: str = "proton_experimental") -> bool:
"""
Set the Proton version for a specific app using ONLY config.vdf like steam-conductor does.
Args:
app_id: The unsigned AppID
proton_version: The Proton version to set
Returns:
True if successful
"""
# Ensure Steam user detection is completed first
if not self.steam_path:
if not self.find_steam_user():
logger.error("Cannot set Proton version: Steam user detection failed")
return False
logger.info(f"Setting Proton version '{proton_version}' for AppID {app_id} using STL-compatible format")
try:
# Step 1: Write to the main config.vdf for CompatToolMapping
config_path = self.steam_path / "config" / "config.vdf"
@@ -272,8 +396,27 @@ class NativeSteamService:
# Find the CompatToolMapping section
compat_start = config_text.find('"CompatToolMapping"')
if compat_start == -1:
logger.error("CompatToolMapping section not found in config.vdf")
return False
logger.warning("CompatToolMapping section not found in config.vdf, creating it")
# Find the Steam section to add CompatToolMapping to
steam_section = config_text.find('"Steam"')
if steam_section == -1:
logger.error("Steam section not found in config.vdf")
return False
# Find the opening brace for Steam section
steam_brace = config_text.find('{', steam_section)
if steam_brace == -1:
logger.error("Steam section opening brace not found")
return False
# Insert CompatToolMapping section right after Steam opening brace
insert_pos = steam_brace + 1
compat_section = '\n\t\t"CompatToolMapping"\n\t\t{\n\t\t}\n'
config_text = config_text[:insert_pos] + compat_section + config_text[insert_pos:]
# Update compat_start position after insertion
compat_start = config_text.find('"CompatToolMapping"')
logger.info("Created CompatToolMapping section in config.vdf")
# Find the closing brace for CompatToolMapping
# Look for the opening brace after CompatToolMapping
@@ -327,17 +470,47 @@ class NativeSteamService:
logger.error(f"Error setting Proton version: {e}")
return False
def create_shortcut_with_proton(self, app_name: str, exe_path: str, start_dir: str = None,
launch_options: str = "%command%", tags: List[str] = None,
proton_version: str = "proton_experimental") -> Tuple[bool, Optional[int]]:
proton_version: str = None) -> Tuple[bool, Optional[int]]:
"""
Complete workflow: Create shortcut and set Proton version.
This is the main method that replaces STL entirely.
Returns:
(success, app_id) - Success status and the AppID
"""
# Use Game Proton from settings for shortcut creation (not Install Proton)
if proton_version is None:
try:
from jackify.backend.handlers.config_handler import ConfigHandler
config_handler = ConfigHandler()
game_proton_path = config_handler.get_game_proton_path()
if game_proton_path and game_proton_path != 'auto':
# User has selected Game Proton - use it
proton_version = os.path.basename(game_proton_path)
# Convert to Steam format
if not proton_version.startswith('GE-Proton'):
proton_version = proton_version.lower().replace(' - ', '_').replace(' ', '_').replace('-', '_')
if not proton_version.startswith('proton'):
proton_version = f"proton_{proton_version}"
logger.info(f"Using Game Proton from settings: {proton_version}")
else:
# Fallback to auto-detect if Game Proton not set
from jackify.backend.handlers.wine_utils import WineUtils
best_proton = WineUtils.select_best_proton()
if best_proton:
proton_version = best_proton['name']
logger.info(f"Auto-detected Game Proton: {proton_version}")
else:
proton_version = "proton_experimental"
logger.warning("Failed to auto-detect Game Proton, falling back to experimental")
except Exception as e:
logger.warning(f"Failed to get Game Proton, falling back to experimental: {e}")
proton_version = "proton_experimental"
logger.info(f"Creating shortcut with Proton: '{app_name}' -> '{proton_version}'")
# Step 1: Create the shortcut
@@ -396,4 +569,57 @@ class NativeSteamService:
except Exception as e:
logger.error(f"Error removing shortcut: {e}")
return False
def create_steam_library_symlinks(self, app_id: int) -> bool:
"""
Create symlink to libraryfolders.vdf in Wine prefix for game detection.
This allows Wabbajack running in the prefix to detect Steam games.
Based on Wabbajack-Proton-AuCu implementation.
Args:
app_id: Steam AppID (unsigned)
Returns:
True if successful
"""
# Ensure Steam user detection is completed first
if not self.steam_path:
if not self.find_steam_user():
logger.error("Cannot create symlinks: Steam user detection failed")
return False
# Find libraryfolders.vdf
libraryfolders_vdf = self.steam_path / "config" / "libraryfolders.vdf"
if not libraryfolders_vdf.exists():
logger.error(f"libraryfolders.vdf not found at: {libraryfolders_vdf}")
return False
# Get compatdata path for this AppID
compat_data = self.steam_path / f"steamapps/compatdata/{app_id}"
if not compat_data.exists():
logger.error(f"Compatdata directory not found: {compat_data}")
return False
# Target directory in Wine prefix
prefix_config_dir = compat_data / "pfx/drive_c/Program Files (x86)/Steam/config"
prefix_config_dir.mkdir(parents=True, exist_ok=True)
# Symlink target
symlink_target = prefix_config_dir / "libraryfolders.vdf"
try:
# Remove existing symlink/file if it exists
if symlink_target.exists() or symlink_target.is_symlink():
symlink_target.unlink()
# Create symlink
symlink_target.symlink_to(libraryfolders_vdf)
logger.info(f"Created symlink: {symlink_target} -> {libraryfolders_vdf}")
return True
except Exception as e:
logger.error(f"Error creating symlink: {e}")
return False
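Putting the methods above together, a hedged sketch of the intended workflow; the real calling code lives in Jackify's frontends and is not part of this diff, and the modlist name and executable path below are placeholders.

service = NativeSteamService()

if service.find_steam_user():                      # locate Steam + active user via loginusers.vdf
    ok, app_id = service.create_shortcut_with_proton(
        app_name="My Modlist",                     # placeholder shortcut name
        exe_path="/path/to/ModOrganizer.exe",      # placeholder executable
        launch_options="%command%",
    )
    if ok and app_id is not None:
        # expose Steam's library list inside the new prefix so Wabbajack can detect games
        service.create_steam_library_symlinks(app_id)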

View File

@@ -0,0 +1,307 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Nexus Authentication Service
Unified service for Nexus authentication using OAuth or API key fallback
"""
import logging
from typing import Optional, Tuple
from .nexus_oauth_service import NexusOAuthService
from ..handlers.oauth_token_handler import OAuthTokenHandler
from .api_key_service import APIKeyService
logger = logging.getLogger(__name__)
class NexusAuthService:
"""
Unified authentication service for Nexus Mods
Handles OAuth 2.0 (preferred) with API key fallback (legacy)
"""
def __init__(self):
"""Initialize authentication service"""
self.oauth_service = NexusOAuthService()
self.token_handler = OAuthTokenHandler()
self.api_key_service = APIKeyService()
logger.debug("NexusAuthService initialized")
def get_auth_token(self) -> Optional[str]:
"""
Get authentication token, preferring OAuth over API key
Returns:
Access token or API key, or None if no authentication available
"""
# Try OAuth first
oauth_token = self._get_oauth_token()
if oauth_token:
logger.debug("Using OAuth token for authentication")
return oauth_token
# Fall back to API key
api_key = self.api_key_service.get_saved_api_key()
if api_key:
logger.debug("Using API key for authentication (OAuth not available)")
return api_key
logger.warning("No authentication available (neither OAuth nor API key)")
return None
def _get_oauth_token(self) -> Optional[str]:
"""
Get OAuth access token, refreshing if needed
Returns:
Valid access token or None
"""
# Check if we have a stored token
if not self.token_handler.has_token():
logger.debug("No OAuth token stored")
return None
# Check if token is expired (15 minute buffer for long installs)
if self.token_handler.is_token_expired(buffer_minutes=15):
logger.info("OAuth token expiring soon, attempting refresh")
# Try to refresh
refresh_token = self.token_handler.get_refresh_token()
if refresh_token:
new_token_data = self.oauth_service.refresh_token(refresh_token)
if new_token_data:
# Save refreshed token
self.token_handler.save_token({'oauth': new_token_data})
logger.info("OAuth token refreshed successfully")
return new_token_data.get('access_token')
else:
logger.warning("Token refresh failed, OAuth token invalid")
# Delete invalid token
self.token_handler.delete_token()
return None
else:
logger.warning("No refresh token available")
return None
# Token is valid, return it
return self.token_handler.get_access_token()
def is_authenticated(self) -> bool:
"""
Check if user is authenticated via OAuth or API key
Returns:
True if authenticated
"""
return self.get_auth_token() is not None
def get_auth_method(self) -> Optional[str]:
"""
Get current authentication method
Returns:
'oauth', 'api_key', or None
"""
# Check OAuth first
oauth_token = self._get_oauth_token()
if oauth_token:
return 'oauth'
# Check API key
api_key = self.api_key_service.get_saved_api_key()
if api_key:
return 'api_key'
return None
def get_auth_status(self) -> Tuple[bool, str, Optional[str]]:
"""
Get detailed authentication status
Returns:
Tuple of (authenticated, method, username)
- authenticated: True if authenticated
- method: 'oauth', 'oauth_expired', 'api_key', or 'none'
- username: Username if available (OAuth only), or None
"""
# Check if OAuth token exists
if self.token_handler.has_token():
# Check if refresh token is likely expired (hasn't been refreshed in 30+ days)
token_info = self.token_handler.get_token_info()
if token_info.get('refresh_token_likely_expired'):
logger.warning("Refresh token likely expired (30+ days old), user should re-authorize")
return False, 'oauth_expired', None
# Try OAuth
oauth_token = self._get_oauth_token()
if oauth_token:
# Try to get username from userinfo
user_info = self.oauth_service.get_user_info(oauth_token)
username = user_info.get('name') if user_info else None
return True, 'oauth', username
elif self.token_handler.has_token():
# Had token but couldn't get valid access token (refresh failed)
logger.warning("OAuth token refresh failed, token may be invalid")
return False, 'oauth_expired', None
# Try API key
api_key = self.api_key_service.get_saved_api_key()
if api_key:
return True, 'api_key', None
return False, 'none', None
def authorize_oauth(self, show_browser_message_callback=None) -> bool:
"""
Perform OAuth authorization flow
Args:
show_browser_message_callback: Optional callback for browser messages
Returns:
True if authorization successful
"""
logger.info("Starting OAuth authorization")
token_data = self.oauth_service.authorize(show_browser_message_callback)
if token_data:
# Save token
success = self.token_handler.save_token({'oauth': token_data})
if success:
logger.info("OAuth authorization completed successfully")
return True
else:
logger.error("Failed to save OAuth token")
return False
else:
logger.error("OAuth authorization failed")
return False
def revoke_oauth(self) -> bool:
"""
Revoke OAuth authorization by deleting stored token
Returns:
True if revoked successfully
"""
logger.info("Revoking OAuth authorization")
return self.token_handler.delete_token()
def save_api_key(self, api_key: str) -> bool:
"""
Save API key (legacy fallback)
Args:
api_key: Nexus API key
Returns:
True if saved successfully
"""
return self.api_key_service.save_api_key(api_key)
def validate_api_key(self, api_key: Optional[str] = None) -> Tuple[bool, Optional[str]]:
"""
Validate API key against Nexus API
Args:
api_key: Optional API key to validate (uses stored if not provided)
Returns:
Tuple of (valid, username_or_error)
"""
return self.api_key_service.validate_api_key(api_key)
def ensure_valid_auth(self) -> Optional[str]:
"""
Ensure we have valid authentication, refreshing if needed
This should be called before any Nexus operation
Returns:
Valid auth token (OAuth access token or API key), or None
"""
auth_token = self.get_auth_token()
if not auth_token:
logger.warning("No authentication available for Nexus operation")
return auth_token
def get_auth_for_engine(self) -> Tuple[Optional[str], Optional[str]]:
"""
Get authentication for jackify-engine with auto-refresh support
Returns both NEXUS_API_KEY (for backward compat) and NEXUS_OAUTH_INFO (for auto-refresh).
When NEXUS_OAUTH_INFO is provided, the engine can automatically refresh expired tokens
during long installations.
Returns:
Tuple of (nexus_api_key, nexus_oauth_info_json)
- nexus_api_key: Access token or API key (for backward compat)
- nexus_oauth_info_json: Full OAuth state JSON (for auto-refresh) or None
"""
import json
import time
# Check if using OAuth and ensure token is fresh
if self.token_handler.has_token():
# Refresh token if expired (15 minute buffer for long installs)
access_token = self._get_oauth_token()
if not access_token:
logger.warning("OAuth token refresh failed, cannot provide auth to engine")
return (None, None)
# Load the refreshed token data
token_data = self.token_handler.load_token()
if token_data:
oauth_data = token_data.get('oauth', {})
# Build NexusOAuthState JSON matching upstream Wabbajack format
# This allows engine to auto-refresh tokens during long installations
nexus_oauth_state = {
"oauth": {
"access_token": oauth_data.get('access_token'),
"token_type": oauth_data.get('token_type', 'Bearer'),
"expires_in": oauth_data.get('expires_in', 3600),
"refresh_token": oauth_data.get('refresh_token'),
"scope": oauth_data.get('scope', 'public openid profile'),
"created_at": oauth_data.get('created_at', int(time.time())),
"_received_at": token_data.get('_saved_at', int(time.time())) * 10000000 + 116444736000000000 # Convert Unix to Windows FILETIME
},
"api_key": ""
}
nexus_oauth_json = json.dumps(nexus_oauth_state)
access_token = oauth_data.get('access_token')
logger.info("Providing OAuth state to engine for auto-refresh capability")
return (access_token, nexus_oauth_json)
# Fall back to API key (no auto-refresh support)
api_key = self.api_key_service.get_saved_api_key()
if api_key:
logger.info("Using API key for engine (no auto-refresh)")
return (api_key, None)
logger.warning("No authentication available for engine")
return (None, None)
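A minimal sketch of how the pair returned here could be wired into the engine's environment; the variable names NEXUS_API_KEY and NEXUS_OAUTH_INFO come from the docstring above, but the actual launch code is not shown in this diff.

import os

auth = NexusAuthService()
api_key, oauth_info_json = auth.get_auth_for_engine()

engine_env = os.environ.copy()
if api_key:
    engine_env["NEXUS_API_KEY"] = api_key             # backward-compatible token or API key
if oauth_info_json:
    engine_env["NEXUS_OAUTH_INFO"] = oauth_info_json  # enables auto-refresh during long installs
# engine_env would then be passed to the jackify-engine subprocess (launch code not in this diff)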
def clear_all_auth(self) -> bool:
"""
Clear all authentication (both OAuth and API key)
Useful for testing or switching accounts
Returns:
True if any auth was cleared
"""
oauth_cleared = self.token_handler.delete_token()
api_key_cleared = self.api_key_service.clear_api_key()
if oauth_cleared or api_key_cleared:
logger.info("Cleared all Nexus authentication")
return True
else:
logger.debug("No authentication to clear")
return False

View File

@@ -0,0 +1,773 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Nexus OAuth Service
Handles OAuth 2.0 authentication flow with Nexus Mods using PKCE
"""
import os
import base64
import hashlib
import secrets
import webbrowser
import urllib.parse
from http.server import HTTPServer, BaseHTTPRequestHandler
import requests
import json
import threading
import ssl
import tempfile
import logging
import time
import subprocess
from typing import Optional, Tuple, Dict
logger = logging.getLogger(__name__)
class NexusOAuthService:
"""
Handles OAuth 2.0 authentication with Nexus Mods
Uses PKCE flow with system browser and localhost callback
"""
# OAuth Configuration
CLIENT_ID = "jackify"
AUTH_URL = "https://users.nexusmods.com/oauth/authorize"
TOKEN_URL = "https://users.nexusmods.com/oauth/token"
USERINFO_URL = "https://users.nexusmods.com/oauth/userinfo"
SCOPES = "public openid profile"
# Redirect configuration (custom protocol scheme - no SSL cert needed!)
# Requires jackify:// protocol handler to be registered with OS
REDIRECT_URI = "jackify://oauth/callback"
# Callback timeout (5 minutes)
CALLBACK_TIMEOUT = 300
def __init__(self):
"""Initialize OAuth service"""
self._auth_code = None
self._auth_state = None
self._auth_error = None
self._server_done = threading.Event()
# Ensure jackify:// protocol is registered on first use
self._ensure_protocol_registered()
def _generate_pkce_params(self) -> Tuple[str, str, str]:
"""
Generate PKCE code verifier, challenge, and state
Returns:
Tuple of (code_verifier, code_challenge, state)
"""
# Generate code verifier (43-128 characters, base64url encoded)
code_verifier = base64.urlsafe_b64encode(
os.urandom(32)
).decode('utf-8').rstrip('=')
# Generate code challenge (SHA256 hash of verifier, base64url encoded)
code_challenge = base64.urlsafe_b64encode(
hashlib.sha256(code_verifier.encode('utf-8')).digest()
).decode('utf-8').rstrip('=')
# Generate state for CSRF protection
state = secrets.token_urlsafe(32)
return code_verifier, code_challenge, state
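For reference, the PKCE relationship generated here can be checked in isolation: per RFC 7636 (S256), the challenge is the base64url-encoded SHA-256 of the verifier with padding stripped.

import base64
import hashlib
import os

verifier = base64.urlsafe_b64encode(os.urandom(32)).decode("utf-8").rstrip("=")
challenge = base64.urlsafe_b64encode(
    hashlib.sha256(verifier.encode("utf-8")).digest()
).decode("utf-8").rstrip("=")

# The server recomputes this hash at token exchange, so the verifier itself
# never passes through the browser redirect.
assert len(verifier) == 43 and "=" not in challenge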
def _ensure_protocol_registered(self) -> bool:
"""
Ensure jackify:// protocol is registered with the OS
Returns:
True if registration successful or already registered
"""
import subprocess
import sys
from pathlib import Path
if not sys.platform.startswith('linux'):
logger.debug("Protocol registration only needed on Linux")
return True
try:
# Ensure desktop file exists and has correct Exec path
desktop_file = Path.home() / ".local" / "share" / "applications" / "com.jackify.app.desktop"
# Get environment for AppImage detection
env = os.environ
# Determine executable path (DEV mode vs AppImage)
# Check multiple indicators for AppImage execution
is_appimage = (
'APPIMAGE' in env or # AppImage environment variable
'APPDIR' in env or # AppImage directory variable
(sys.argv[0] and sys.argv[0].endswith('.AppImage')) # Executable name
)
if is_appimage:
# Running from AppImage - use the AppImage path directly
# CRITICAL: Never use -m flag in AppImage mode - it causes __main__.py windows
if 'APPIMAGE' in env:
# APPIMAGE env var gives us the exact path to the AppImage
exec_path = env['APPIMAGE']
logger.info(f"Using APPIMAGE env var: {exec_path}")
elif sys.argv[0] and Path(sys.argv[0]).exists():
# Use sys.argv[0] if it's a valid path
exec_path = str(Path(sys.argv[0]).resolve())
logger.info(f"Using resolved sys.argv[0]: {exec_path}")
else:
# Fallback to sys.argv[0] as-is
exec_path = sys.argv[0]
logger.warning(f"Using sys.argv[0] as fallback: {exec_path}")
else:
# Running from source (DEV mode)
# Need to ensure we run from the correct directory
src_dir = Path(__file__).parent.parent.parent.parent # Go up to src/
# Use bash -c with proper quoting for paths with spaces
exec_path = f'bash -c \'cd "{src_dir}" && "{sys.executable}" -m jackify.frontends.gui "$@"\' --'
logger.info(f"DEV mode exec path: {exec_path}")
logger.info(f"Source directory: {src_dir}")
# Check if desktop file needs creation or update
needs_update = False
if not desktop_file.exists():
needs_update = True
logger.info("Creating desktop file for protocol handler")
else:
# Check if Exec path matches current mode
current_content = desktop_file.read_text()
# Check for both quoted (AppImage) and unquoted (DEV mode with bash -c) formats
if is_appimage:
expected_exec = f'Exec="{exec_path}" %u'
else:
expected_exec = f"Exec={exec_path} %u"
if expected_exec not in current_content:
needs_update = True
logger.info(f"Updating desktop file with new Exec path: {exec_path}")
# Explicitly detect and fix malformed entries (unquoted paths with spaces)
# Check if any Exec line exists without quotes but contains spaces
if is_appimage and ' ' in exec_path:
import re
# Look for Exec=<path with spaces> without quotes
if re.search(r'Exec=[^"]\S*\s+\S*\.AppImage', current_content):
needs_update = True
logger.info("Fixing malformed desktop file (unquoted path with spaces)")
if needs_update:
desktop_file.parent.mkdir(parents=True, exist_ok=True)
# Build desktop file content with proper working directory
if is_appimage:
# AppImage - quote path to handle spaces
desktop_content = f"""[Desktop Entry]
Type=Application
Name=Jackify
Comment=Wabbajack modlist manager for Linux
Exec="{exec_path}" %u
Icon=com.jackify.app
Terminal=false
Categories=Game;Utility;
MimeType=x-scheme-handler/jackify;
"""
else:
# DEV mode - exec_path already contains bash -c with proper quoting
src_dir = Path(__file__).parent.parent.parent.parent # Go up to src/
desktop_content = f"""[Desktop Entry]
Type=Application
Name=Jackify
Comment=Wabbajack modlist manager for Linux
Exec={exec_path} %u
Icon=com.jackify.app
Terminal=false
Categories=Game;Utility;
MimeType=x-scheme-handler/jackify;
Path={src_dir}
"""
desktop_file.write_text(desktop_content)
logger.info(f"Desktop file written: {desktop_file}")
logger.info(f"Exec path: {exec_path}")
logger.info(f"AppImage mode: {is_appimage}")
# Always ensure full registration (don't trust xdg-settings alone)
# PopOS/Ubuntu need mimeapps.list even if xdg-settings says registered
logger.info("Registering jackify:// protocol handler")
# Update MIME cache (required for Firefox dialog)
apps_dir = Path.home() / ".local" / "share" / "applications"
subprocess.run(
['update-desktop-database', str(apps_dir)],
capture_output=True,
timeout=10
)
# Set as default handler using xdg-mime (Firefox compatibility)
subprocess.run(
['xdg-mime', 'default', 'com.jackify.app.desktop', 'x-scheme-handler/jackify'],
capture_output=True,
timeout=10
)
# Also use xdg-settings as backup (some systems need both)
subprocess.run(
['xdg-settings', 'set', 'default-url-scheme-handler', 'jackify', 'com.jackify.app.desktop'],
capture_output=True,
timeout=10
)
# Manually ensure entry in mimeapps.list (PopOS/Ubuntu require this for GIO)
mimeapps_path = Path.home() / ".config" / "mimeapps.list"
try:
# Read existing content
if mimeapps_path.exists():
content = mimeapps_path.read_text()
else:
mimeapps_path.parent.mkdir(parents=True, exist_ok=True)
content = "[Default Applications]\n"
# Add jackify handler if not present
if 'x-scheme-handler/jackify=' not in content:
if '[Default Applications]' not in content:
content = "[Default Applications]\n" + content
# Insert after [Default Applications] line
lines = content.split('\n')
for i, line in enumerate(lines):
if line.strip() == '[Default Applications]':
lines.insert(i + 1, 'x-scheme-handler/jackify=com.jackify.app.desktop')
break
content = '\n'.join(lines)
mimeapps_path.write_text(content)
logger.info("Added jackify handler to mimeapps.list")
except Exception as e:
logger.warning(f"Failed to update mimeapps.list: {e}")
logger.info("jackify:// protocol registered successfully")
return True
except Exception as e:
logger.warning(f"Failed to register jackify:// protocol: {e}")
return False
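When debugging registration failures, querying the default handler is usually enough to confirm whether the steps above took effect; a small sketch (the expected value assumes the com.jackify.app.desktop file written by this method):

import subprocess

result = subprocess.run(
    ["xdg-mime", "query", "default", "x-scheme-handler/jackify"],
    capture_output=True, text=True, timeout=10,
)
# Expected on a correctly registered system: "com.jackify.app.desktop"
print(result.stdout.strip() or "no handler registered")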
def _generate_self_signed_cert(self) -> Tuple[Optional[str], Optional[str]]:
"""
Generate self-signed certificate for HTTPS localhost
Returns:
Tuple of (cert_file_path, key_file_path) or (None, None) on failure
"""
try:
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization
import datetime
import ipaddress
logger.info("Generating self-signed certificate for OAuth callback")
# Generate private key
private_key = rsa.generate_private_key(
public_exponent=65537,
key_size=2048,
)
# Create certificate
subject = issuer = x509.Name([
x509.NameAttribute(NameOID.COUNTRY_NAME, "US"),
x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Jackify"),
x509.NameAttribute(NameOID.COMMON_NAME, self.REDIRECT_HOST),
])
cert = x509.CertificateBuilder().subject_name(
subject
).issuer_name(
issuer
).public_key(
private_key.public_key()
).serial_number(
x509.random_serial_number()
).not_valid_before(
datetime.datetime.now(datetime.UTC)
).not_valid_after(
datetime.datetime.now(datetime.UTC) + datetime.timedelta(days=365)
).add_extension(
x509.SubjectAlternativeName([
x509.IPAddress(ipaddress.IPv4Address(self.REDIRECT_HOST)),
]),
critical=False,
).sign(private_key, hashes.SHA256())
# Save to temp files
temp_dir = tempfile.mkdtemp()
cert_file = os.path.join(temp_dir, "oauth_cert.pem")
key_file = os.path.join(temp_dir, "oauth_key.pem")
with open(cert_file, "wb") as f:
f.write(cert.public_bytes(serialization.Encoding.PEM))
with open(key_file, "wb") as f:
f.write(private_key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=serialization.NoEncryption()
))
return cert_file, key_file
except ImportError:
logger.error("cryptography package not installed - required for OAuth")
return None, None
except Exception as e:
logger.error(f"Failed to generate SSL certificate: {e}")
return None, None
def _build_authorization_url(self, code_challenge: str, state: str) -> str:
"""
Build OAuth authorization URL
Args:
code_challenge: PKCE code challenge
state: CSRF protection state
Returns:
Authorization URL
"""
params = {
'response_type': 'code',
'client_id': self.CLIENT_ID,
'redirect_uri': self.REDIRECT_URI,
'scope': self.SCOPES,
'code_challenge': code_challenge,
'code_challenge_method': 'S256',
'state': state
}
return f"{self.AUTH_URL}?{urllib.parse.urlencode(params)}"
def _create_callback_handler(self):
"""Create HTTP request handler class for OAuth callback"""
service = self
class OAuthCallbackHandler(BaseHTTPRequestHandler):
"""HTTP request handler for OAuth callback"""
def log_message(self, format, *args):
"""Log OAuth callback requests"""
logger.debug(f"OAuth callback: {format % args}")
def do_GET(self):
"""Handle GET request from OAuth redirect"""
logger.info(f"OAuth callback received: {self.path}")
# Parse query parameters
parsed = urllib.parse.urlparse(self.path)
params = urllib.parse.parse_qs(parsed.query)
# Ignore favicon and other non-OAuth requests
if parsed.path == '/favicon.ico':
self.send_response(404)
self.end_headers()
return
if 'code' in params:
service._auth_code = params['code'][0]
service._auth_state = params.get('state', [None])[0]
logger.info(f"OAuth authorization code received: {service._auth_code[:10]}...")
# Send success response
self.send_response(200)
self.send_header('Content-type', 'text/html')
self.end_headers()
html = """
<html>
<head><title>Authorization Successful</title></head>
<body style="font-family: Arial, sans-serif; text-align: center; padding: 50px;">
<h1>Authorization Successful!</h1>
<p>You can close this window and return to Jackify.</p>
<script>setTimeout(function() { window.close(); }, 3000);</script>
</body>
</html>
"""
self.wfile.write(html.encode())
elif 'error' in params:
service._auth_error = params['error'][0]
error_desc = params.get('error_description', ['Unknown error'])[0]
# Send error response
self.send_response(200)
self.send_header('Content-type', 'text/html')
self.end_headers()
html = f"""
<html>
<head><title>Authorization Failed</title></head>
<body style="font-family: Arial, sans-serif; text-align: center; padding: 50px;">
<h1>Authorization Failed</h1>
<p>Error: {service._auth_error}</p>
<p>{error_desc}</p>
<p>You can close this window and try again in Jackify.</p>
</body>
</html>
"""
self.wfile.write(html.encode())
else:
# Unexpected callback format
logger.warning(f"OAuth callback with no code or error: {params}")
self.send_response(400)
self.send_header('Content-type', 'text/html')
self.end_headers()
html = """
<html>
<head><title>Invalid Request</title></head>
<body style="font-family: Arial, sans-serif; text-align: center; padding: 50px;">
<h1>Invalid OAuth Callback</h1>
<p>You can close this window.</p>
</body>
</html>
"""
self.wfile.write(html.encode())
# Signal server to shut down
service._server_done.set()
logger.debug("OAuth callback handler signaled server to shut down")
return OAuthCallbackHandler
def _wait_for_callback(self) -> bool:
"""
Wait for OAuth callback via jackify:// protocol handler
Returns:
True if callback received, False on timeout
"""
from pathlib import Path
import time
callback_file = Path.home() / ".config" / "jackify" / "oauth_callback.tmp"
# Delete any old callback file
if callback_file.exists():
callback_file.unlink()
logger.info("Waiting for OAuth callback via jackify:// protocol")
# Poll for callback file with periodic user feedback
start_time = time.time()
last_reminder = 0
while (time.time() - start_time) < self.CALLBACK_TIMEOUT:
if callback_file.exists():
try:
# Read callback data
lines = callback_file.read_text().strip().split('\n')
if len(lines) >= 2:
self._auth_code = lines[0]
self._auth_state = lines[1]
logger.info(f"OAuth callback received: code={self._auth_code[:10]}...")
# Clean up
callback_file.unlink()
return True
except Exception as e:
logger.error(f"Failed to read callback file: {e}")
return False
# Show periodic reminder about protocol handler
elapsed = time.time() - start_time
if elapsed - last_reminder > 30: # Every 30 seconds
logger.info(f"Still waiting for OAuth callback... ({int(elapsed)}s elapsed)")
if elapsed > 60:
logger.warning(
"If you see a blank browser tab or popup blocker, "
"check for browser notifications asking to 'Open Jackify'"
)
last_reminder = elapsed
time.sleep(0.5) # Poll every 500ms
logger.error(f"OAuth callback timeout after {self.CALLBACK_TIMEOUT} seconds")
logger.error(
"Protocol handler may not be working. Check:\n"
" 1. Browser asked 'Open Jackify?' and you clicked Allow\n"
" 2. No popup blocker notifications\n"
" 3. Desktop file exists: ~/.local/share/applications/com.jackify.app.desktop"
)
return False
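The writing side of this handshake (the short-lived process started by the jackify:// handler) is not part of this file; judging from the reader above, it writes the authorization code on the first line and the state on the second. A hedged sketch of what that writer presumably looks like:

import urllib.parse
from pathlib import Path

def write_oauth_callback(callback_url: str) -> None:
    """Persist code/state from a jackify://oauth/callback?... URL for the waiting process."""
    query = urllib.parse.urlparse(callback_url).query
    params = urllib.parse.parse_qs(query)
    callback_file = Path.home() / ".config" / "jackify" / "oauth_callback.tmp"
    callback_file.parent.mkdir(parents=True, exist_ok=True)
    # First line: authorization code, second line: state (matches _wait_for_callback)
    callback_file.write_text(f"{params['code'][0]}\n{params['state'][0]}\n")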
def _send_desktop_notification(self, title: str, message: str):
"""
Send desktop notification if available
Args:
title: Notification title
message: Notification message
"""
try:
# Try notify-send (Linux)
subprocess.run(
['notify-send', title, message],
check=False,
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
timeout=2
)
except (FileNotFoundError, subprocess.TimeoutExpired):
pass
def _exchange_code_for_token(
self,
auth_code: str,
code_verifier: str
) -> Optional[Dict]:
"""
Exchange authorization code for access token
Args:
auth_code: Authorization code from callback
code_verifier: PKCE code verifier
Returns:
Token response dict or None on failure
"""
data = {
'grant_type': 'authorization_code',
'client_id': self.CLIENT_ID,
'redirect_uri': self.REDIRECT_URI,
'code': auth_code,
'code_verifier': code_verifier
}
try:
response = requests.post(self.TOKEN_URL, data=data, timeout=10)
if response.status_code == 200:
token_data = response.json()
logger.info("Successfully exchanged authorization code for token")
return token_data
else:
logger.error(f"Token exchange failed: {response.status_code} - {response.text}")
return None
except requests.RequestException as e:
logger.error(f"Token exchange request failed: {e}")
return None
def refresh_token(self, refresh_token: str) -> Optional[Dict]:
"""
Refresh an access token using refresh token
Args:
refresh_token: Refresh token from previous authentication
Returns:
New token response dict or None on failure
"""
data = {
'grant_type': 'refresh_token',
'client_id': self.CLIENT_ID,
'refresh_token': refresh_token
}
try:
response = requests.post(self.TOKEN_URL, data=data, timeout=10)
if response.status_code == 200:
token_data = response.json()
logger.info("Successfully refreshed access token")
return token_data
else:
logger.error(f"Token refresh failed: {response.status_code} - {response.text}")
return None
except requests.RequestException as e:
logger.error(f"Token refresh request failed: {e}")
return None
def get_user_info(self, access_token: str) -> Optional[Dict]:
"""
Get user information using access token
Args:
access_token: OAuth access token
Returns:
User info dict or None on failure
"""
headers = {
'Authorization': f'Bearer {access_token}'
}
try:
response = requests.get(self.USERINFO_URL, headers=headers, timeout=10)
if response.status_code == 200:
user_info = response.json()
logger.info(f"Retrieved user info for: {user_info.get('name', 'unknown')}")
return user_info
else:
logger.error(f"User info request failed: {response.status_code}")
return None
except requests.RequestException as e:
logger.error(f"User info request failed: {e}")
return None
def authorize(self, show_browser_message_callback=None) -> Optional[Dict]:
"""
Perform full OAuth authorization flow
Args:
show_browser_message_callback: Optional callback to display message about browser opening
Returns:
Token response dict or None on failure
"""
logger.info("Starting Nexus OAuth authorization flow")
# Reset state
self._auth_code = None
self._auth_state = None
self._auth_error = None
self._server_done.clear()
# Generate PKCE parameters
code_verifier, code_challenge, state = self._generate_pkce_params()
logger.debug(f"Generated PKCE parameters (state: {state[:10]}...)")
# Build authorization URL
auth_url = self._build_authorization_url(code_challenge, state)
# Open browser
logger.info("Opening browser for authorisation")
try:
# When running from AppImage, we need to clean the environment to avoid
# library conflicts with system tools (xdg-open, kde-open, etc.)
import os
import subprocess
env = os.environ.copy()
# Remove AppImage-specific environment variables that can cause conflicts
# These variables inject AppImage's bundled libraries into child processes
appimage_vars = [
'LD_LIBRARY_PATH',
'PYTHONPATH',
'PYTHONHOME',
'QT_PLUGIN_PATH',
'QML2_IMPORT_PATH',
]
# Check if we're running from AppImage
if 'APPIMAGE' in env or 'APPDIR' in env:
logger.debug("Running from AppImage - cleaning environment for browser launch")
for var in appimage_vars:
if var in env:
del env[var]
logger.debug(f"Removed {var} from browser environment")
# Use Popen instead of run to avoid waiting for browser to close
# xdg-open may not return until the browser closes, which could be never
try:
process = subprocess.Popen(
['xdg-open', auth_url],
env=env,
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
start_new_session=True # Detach from parent process
)
# Give it a moment to fail if it's going to fail
import time
time.sleep(0.5)
# Check if process is still running or has exited successfully
poll_result = process.poll()
if poll_result is None:
# Process still running - browser is opening/open
logger.info("Browser opened successfully via xdg-open (process running)")
browser_opened = True
elif poll_result == 0:
# Process exited successfully
logger.info("Browser opened successfully via xdg-open (exit code 0)")
browser_opened = True
else:
# Process exited with error
logger.warning(f"xdg-open exited with code {poll_result}, trying webbrowser module")
if webbrowser.open(auth_url):
logger.info("Browser opened successfully via webbrowser module")
browser_opened = True
else:
logger.warning("webbrowser.open returned False")
browser_opened = False
except FileNotFoundError:
# xdg-open not found - try webbrowser module
logger.warning("xdg-open not found, trying webbrowser module")
if webbrowser.open(auth_url):
logger.info("Browser opened successfully via webbrowser module")
browser_opened = True
else:
logger.warning("webbrowser.open returned False")
browser_opened = False
except Exception as e:
logger.error(f"Error opening browser: {e}")
browser_opened = False
# Send desktop notification
self._send_desktop_notification(
"Jackify - Nexus Authorisation",
"Please check your browser to authorise Jackify"
)
# Show message via callback if provided (AFTER browser opens)
if show_browser_message_callback:
if browser_opened:
show_browser_message_callback(
"Browser opened for Nexus authorisation.\n\n"
"After clicking 'Authorize', your browser may ask to\n"
"open Jackify or show a popup blocker notification.\n\n"
"Please click 'Open' or 'Allow' to complete authorization."
)
else:
show_browser_message_callback(
f"Could not open browser automatically.\n\n"
f"Please open this URL manually:\n{auth_url}"
)
# Wait for callback via jackify:// protocol
if not self._wait_for_callback():
return None
# Check for errors
if self._auth_error:
logger.error(f"Authorization failed: {self._auth_error}")
return None
if not self._auth_code:
logger.error("No authorization code received")
return None
# Verify state matches
if self._auth_state != state:
logger.error("State mismatch - possible CSRF attack")
return None
logger.info("Authorization code received, exchanging for token")
# Exchange code for token
token_data = self._exchange_code_for_token(self._auth_code, code_verifier)
if token_data:
logger.info("OAuth authorization flow completed successfully")
else:
logger.error("Failed to exchange authorization code for token")
return token_data

View File

@@ -0,0 +1,67 @@
#!/usr/bin/env python3
"""
Platform Detection Service
Centralizes platform detection logic (Steam Deck, etc.) to be performed once at application startup
and shared across all components.
"""
import os
import logging
logger = logging.getLogger(__name__)
class PlatformDetectionService:
"""
Service for detecting platform-specific information once at startup
"""
_instance = None
_is_steamdeck = None
def __new__(cls):
"""Singleton pattern to ensure only one instance"""
if cls._instance is None:
cls._instance = super().__new__(cls)
return cls._instance
def __init__(self):
"""Initialize platform detection if not already done"""
if self._is_steamdeck is None:
self._detect_platform()
def _detect_platform(self):
"""Perform platform detection once"""
logger.debug("Performing platform detection...")
# Steam Deck detection
self._is_steamdeck = False
try:
if os.path.exists('/etc/os-release'):
with open('/etc/os-release', 'r') as f:
content = f.read().lower()
if 'steamdeck' in content or 'steamos' in content:
self._is_steamdeck = True
logger.info("Steam Deck/SteamOS platform detected")
else:
logger.debug("Non-Steam Deck Linux platform detected")
else:
logger.debug("No /etc/os-release found - assuming non-Steam Deck platform")
except Exception as e:
logger.warning(f"Error detecting Steam Deck platform: {e}")
self._is_steamdeck = False
logger.debug(f"Platform detection complete: is_steamdeck={self._is_steamdeck}")
@property
def is_steamdeck(self) -> bool:
"""Get Steam Deck detection result"""
if self._is_steamdeck is None:
self._detect_platform()
return self._is_steamdeck
@classmethod
def get_instance(cls):
"""Get the singleton instance"""
return cls()
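Because __new__ returns the cached instance and _detect_platform only runs while _is_steamdeck is still None, the /etc/os-release read happens once per process; a short usage sketch:

platform = PlatformDetectionService()              # first construction triggers detection
same = PlatformDetectionService.get_instance()

assert platform is same                            # singleton: both names share one instance
if platform.is_steamdeck:
    print("Running on Steam Deck / SteamOS")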

View File

@@ -6,8 +6,11 @@ Centralized service for detecting and managing protontricks installation across
"""
import logging
import os
import shutil
import subprocess
import sys
import importlib.util
from typing import Optional, Tuple
from ..handlers.protontricks_handler import ProtontricksHandler
from ..handlers.config_handler import ConfigHandler
@@ -39,16 +42,16 @@ class ProtontricksDetectionService:
def _get_protontricks_handler(self) -> ProtontricksHandler:
"""Get or create ProtontricksHandler instance"""
if self._protontricks_handler is None:
self._protontricks_handler = ProtontricksHandler(steamdeck=self.steamdeck)
self._protontricks_handler = ProtontricksHandler(self.steamdeck)
return self._protontricks_handler
def detect_protontricks(self, use_cache: bool = True) -> Tuple[bool, str, str]:
"""
Detect if protontricks is installed and get installation details
Detect if system protontricks is installed and get installation details
Args:
use_cache (bool): Whether to use cached detection result
Returns:
Tuple[bool, str, str]: (is_installed, installation_type, details_message)
- is_installed: True if protontricks is available
@@ -82,7 +85,7 @@ class ProtontricksDetectionService:
details_message = "Protontricks is installed (unknown type)"
else:
installation_type = 'none'
details_message = "Protontricks not found - required for Jackify functionality"
details_message = "Protontricks not found - install via flatpak or package manager"
# Cache the result
self._last_detection_result = (is_installed, installation_type, details_message)
@@ -93,57 +96,22 @@ class ProtontricksDetectionService:
def _detect_without_prompts(self, handler: ProtontricksHandler) -> bool:
"""
Detect protontricks without user prompts or installation attempts
Detect system protontricks (flatpak or native) without user prompts.
Args:
handler (ProtontricksHandler): Handler instance to use
Returns:
bool: True if protontricks is found
bool: True if system protontricks is found
"""
# Use the handler's silent detection method
return handler.detect_protontricks()
def is_bundled_mode(self) -> bool:
"""
DEPRECATED: Bundled protontricks no longer supported.
Always returns False for backwards compatibility.
"""
import shutil
# Check if protontricks exists as a command
protontricks_path_which = shutil.which("protontricks")
if protontricks_path_which:
# Check if it's a flatpak wrapper
try:
with open(protontricks_path_which, 'r') as f:
content = f.read()
if "flatpak run" in content:
logger.debug(f"Detected Protontricks is a Flatpak wrapper at {protontricks_path_which}")
handler.which_protontricks = 'flatpak'
# Continue to check flatpak list just to be sure
else:
logger.info(f"Native Protontricks found at {protontricks_path_which}")
handler.which_protontricks = 'native'
handler.protontricks_path = protontricks_path_which
return True
except Exception as e:
logger.error(f"Error reading protontricks executable: {e}")
# Check if flatpak protontricks is installed
try:
env = handler._get_clean_subprocess_env()
result = subprocess.run(
["flatpak", "list"],
capture_output=True,
text=True,
check=True,
env=env
)
if "com.github.Matoking.protontricks" in result.stdout:
logger.info("Flatpak Protontricks is installed")
handler.which_protontricks = 'flatpak'
return True
except FileNotFoundError:
logger.warning("'flatpak' command not found. Cannot check for Flatpak Protontricks.")
except subprocess.CalledProcessError as e:
logger.warning(f"Error checking flatpak list: {e}")
except Exception as e:
logger.error(f"Unexpected error checking flatpak: {e}")
return False
def install_flatpak_protontricks(self) -> Tuple[bool, str]:
@@ -164,14 +132,31 @@ class ProtontricksDetectionService:
logger.error(error_msg)
return False, error_msg
# Install command
install_cmd = ["flatpak", "install", "-u", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]
# Install command - use --user flag for user-level installation (works on Steam Deck)
# This avoids requiring system-wide installation permissions
install_cmd = ["flatpak", "install", "--user", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]
# Use clean environment
env = handler._get_clean_subprocess_env()
# Run installation
process = subprocess.run(install_cmd, check=True, text=True, env=env, capture_output=True)
# Log the command for debugging
logger.debug(f"Running flatpak install command: {' '.join(install_cmd)}")
# Run installation with timeout (5 minutes should be plenty)
process = subprocess.run(
install_cmd,
check=True,
text=True,
env=env,
capture_output=True,
timeout=300 # 5 minute timeout
)
# Log stdout/stderr for debugging (even on success, might contain useful info)
if process.stdout:
logger.debug(f"Flatpak install stdout: {process.stdout}")
if process.stderr:
logger.debug(f"Flatpak install stderr: {process.stderr}")
# Clear cache to force re-detection
self._cached_detection_valid = False
@@ -184,13 +169,41 @@ class ProtontricksDetectionService:
error_msg = "Flatpak command not found. Please install Flatpak first."
logger.error(error_msg)
return False, error_msg
except subprocess.CalledProcessError as e:
error_msg = f"Flatpak installation failed: {e}"
except subprocess.TimeoutExpired:
error_msg = "Flatpak installation timed out after 5 minutes. Please check your network connection and try again."
logger.error(error_msg)
return False, error_msg
except subprocess.CalledProcessError as e:
# Include stderr in error message for better debugging
stderr_msg = e.stderr.strip() if e.stderr else "No error details available"
stdout_msg = e.stdout.strip() if e.stdout else ""
# Try to extract meaningful error from stderr
if stderr_msg:
# Common errors: permission denied, network issues, etc.
if "permission" in stderr_msg.lower() or "denied" in stderr_msg.lower():
error_msg = f"Permission denied. Try running: flatpak install --user flathub com.github.Matoking.protontricks\n\nDetails: {stderr_msg}"
elif "network" in stderr_msg.lower() or "connection" in stderr_msg.lower():
error_msg = f"Network error during installation. Check your internet connection.\n\nDetails: {stderr_msg}"
elif "already installed" in stderr_msg.lower():
# This might actually be success - clear cache and re-detect
logger.info("Protontricks appears to already be installed (according to flatpak output)")
self._cached_detection_valid = False
return True, "Protontricks is already installed."
else:
error_msg = f"Flatpak installation failed:\n\n{stderr_msg}"
if stdout_msg:
error_msg += f"\n\nOutput: {stdout_msg}"
else:
error_msg = f"Flatpak installation failed with return code {e.returncode}."
if stdout_msg:
error_msg += f"\n\nOutput: {stdout_msg}"
logger.error(f"Flatpak installation error: {error_msg}")
return False, error_msg
except Exception as e:
error_msg = f"Unexpected error during Flatpak installation: {e}"
logger.error(error_msg)
logger.error(error_msg, exc_info=True)
return False, error_msg
def get_installation_guidance(self) -> str:

View File

@@ -5,46 +5,87 @@ import signal
import psutil
import logging
import sys
import shutil
from typing import Callable, Optional
logger = logging.getLogger(__name__)
STRATEGY_JACKIFY = "jackify"
STRATEGY_NAK_SIMPLE = "nak_simple"
def _get_restart_strategy() -> str:
"""Read restart strategy from config with safe fallback."""
try:
from jackify.backend.handlers.config_handler import ConfigHandler
strategy = ConfigHandler().get("steam_restart_strategy", STRATEGY_JACKIFY)
if strategy not in (STRATEGY_JACKIFY, STRATEGY_NAK_SIMPLE):
return STRATEGY_JACKIFY
return strategy
except Exception as exc: # pragma: no cover - defensive logging only
logger.debug(f"Steam restart: Unable to read strategy from config: {exc}")
return STRATEGY_JACKIFY
def _strategy_label(strategy: str) -> str:
if strategy == STRATEGY_NAK_SIMPLE:
return "NaK simple restart"
return "Jackify hardened restart"
def _get_clean_subprocess_env():
"""
Create a clean environment for subprocess calls by removing PyInstaller-specific
environment variables that can interfere with Steam execution.
Create a clean environment for subprocess calls by stripping bundle-specific
environment variables (e.g., frozen AppImage remnants) that can interfere with Steam.
CRITICAL: Preserves all display/session variables that Steam needs for GUI:
- DISPLAY, WAYLAND_DISPLAY, XDG_SESSION_TYPE, DBUS_SESSION_BUS_ADDRESS,
XDG_RUNTIME_DIR, XAUTHORITY, etc.
Returns:
dict: Cleaned environment dictionary
dict: Cleaned environment dictionary with GUI variables preserved
"""
env = os.environ.copy()
pyinstaller_vars_removed = []
bundle_vars_removed = []
# Remove PyInstaller-specific environment variables
# CRITICAL: Preserve display/session variables that Steam GUI needs
# These MUST be kept for Steam to open its GUI window
gui_vars_to_preserve = [
'DISPLAY', 'WAYLAND_DISPLAY', 'XDG_SESSION_TYPE', 'DBUS_SESSION_BUS_ADDRESS',
'XDG_RUNTIME_DIR', 'XAUTHORITY', 'XDG_CURRENT_DESKTOP', 'XDG_SESSION_DESKTOP',
'QT_QPA_PLATFORM', 'GDK_BACKEND', 'XDG_DATA_DIRS', 'XDG_CONFIG_DIRS'
]
preserved_gui_vars = {}
for var in gui_vars_to_preserve:
if var in env:
preserved_gui_vars[var] = env[var]
logger.debug(f"Steam restart: Preserving GUI variable {var}={env[var][:50] if len(str(env[var])) > 50 else env[var]}")
# Remove bundle-specific environment variables
if env.pop('_MEIPASS', None):
pyinstaller_vars_removed.append('_MEIPASS')
bundle_vars_removed.append('_MEIPASS')
if env.pop('_MEIPASS2', None):
pyinstaller_vars_removed.append('_MEIPASS2')
bundle_vars_removed.append('_MEIPASS2')
# Clean library path variables that PyInstaller modifies (Linux/Unix)
# Clean library path variables that frozen bundles modify (Linux/Unix)
if 'LD_LIBRARY_PATH_ORIG' in env:
# Restore original LD_LIBRARY_PATH if it was backed up by PyInstaller
# Restore original LD_LIBRARY_PATH if it was backed up by the bundler
env['LD_LIBRARY_PATH'] = env['LD_LIBRARY_PATH_ORIG']
pyinstaller_vars_removed.append('LD_LIBRARY_PATH (restored from _ORIG)')
bundle_vars_removed.append('LD_LIBRARY_PATH (restored from _ORIG)')
else:
# Remove PyInstaller-modified LD_LIBRARY_PATH
# Remove modified LD_LIBRARY_PATH entries
if env.pop('LD_LIBRARY_PATH', None):
pyinstaller_vars_removed.append('LD_LIBRARY_PATH (removed)')
bundle_vars_removed.append('LD_LIBRARY_PATH (removed)')
# Clean PATH of PyInstaller-specific entries
# Clean PATH of bundle-specific entries
if 'PATH' in env and hasattr(sys, '_MEIPASS'):
path_entries = env['PATH'].split(os.pathsep)
original_count = len(path_entries)
# Remove any PATH entries that point to PyInstaller temp directory
# Remove any PATH entries that point to the bundle's temp directory
cleaned_path = [p for p in path_entries if not p.startswith(sys._MEIPASS)]
env['PATH'] = os.pathsep.join(cleaned_path)
if len(cleaned_path) < original_count:
pyinstaller_vars_removed.append(f'PATH (removed {original_count - len(cleaned_path)} PyInstaller entries)')
bundle_vars_removed.append(f'PATH (removed {original_count - len(cleaned_path)} bundle entries)')
# Clean macOS library path (if present)
if 'DYLD_LIBRARY_PATH' in env and hasattr(sys, '_MEIPASS'):
@@ -52,16 +93,26 @@ def _get_clean_subprocess_env():
cleaned_dyld = [p for p in dyld_entries if not p.startswith(sys._MEIPASS)]
if cleaned_dyld:
env['DYLD_LIBRARY_PATH'] = os.pathsep.join(cleaned_dyld)
pyinstaller_vars_removed.append('DYLD_LIBRARY_PATH (cleaned)')
bundle_vars_removed.append('DYLD_LIBRARY_PATH (cleaned)')
else:
env.pop('DYLD_LIBRARY_PATH', None)
pyinstaller_vars_removed.append('DYLD_LIBRARY_PATH (removed)')
bundle_vars_removed.append('DYLD_LIBRARY_PATH (removed)')
# Ensure GUI variables are still present (they should be, but double-check)
for var, value in preserved_gui_vars.items():
if var not in env:
env[var] = value
logger.warning(f"Steam restart: Restored GUI variable {var} that was accidentally removed")
# Log what was cleaned for debugging
if pyinstaller_vars_removed:
logger.debug(f"Steam restart: Cleaned PyInstaller environment variables: {', '.join(pyinstaller_vars_removed)}")
if bundle_vars_removed:
logger.debug(f"Steam restart: Cleaned bundled environment variables: {', '.join(bundle_vars_removed)}")
else:
logger.debug("Steam restart: No PyInstaller environment variables detected (likely DEV mode)")
logger.debug("Steam restart: No bundled environment variables detected (likely DEV mode)")
# Log preserved GUI variables for debugging
if preserved_gui_vars:
logger.debug(f"Steam restart: Preserved {len(preserved_gui_vars)} GUI environment variables")
return env
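A short sketch of how the cleaned environment might be used when relaunching Steam; the '-silent' flag and Popen keyword arguments mirror the start_methods list later in this file, and the call itself is only illustrative.

import subprocess

env = _get_clean_subprocess_env()
# GUI/session variables (DISPLAY, WAYLAND_DISPLAY, DBUS_SESSION_BUS_ADDRESS, ...) survive the
# cleanup, while frozen-bundle paths such as a modified LD_LIBRARY_PATH are stripped or restored.
subprocess.Popen(
    ["steam", "-silent"],
    env=env,
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    stdin=subprocess.DEVNULL,
    start_new_session=True,
)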
@@ -86,6 +137,31 @@ def is_steam_deck() -> bool:
logger.debug(f"Error detecting Steam Deck: {e}")
return False
def is_flatpak_steam() -> bool:
"""Detect if Steam is installed as a Flatpak."""
try:
# First check if flatpak command exists
if not shutil.which('flatpak'):
return False
# Verify the app is actually installed (not just directory exists)
result = subprocess.run(['flatpak', 'list', '--app'],
stdout=subprocess.PIPE,
stderr=subprocess.DEVNULL, # Suppress stderr to avoid error messages
text=True,
timeout=5)
if result.returncode == 0:
# Check for exact match - "com.valvesoftware.Steam" as a whole word
# This prevents matching "com.valvesoftware.SteamLink" or similar
for line in result.stdout.splitlines():
parts = line.split()
if parts and parts[0] == 'com.valvesoftware.Steam':
return True
return False
except Exception as e:
logger.debug(f"Error detecting Flatpak Steam: {e}")
return False
def get_steam_processes() -> list:
"""Return a list of psutil.Process objects for running Steam processes."""
steam_procs = []
@@ -118,20 +194,118 @@ def wait_for_steam_exit(timeout: int = 60, check_interval: float = 0.5) -> bool:
time.sleep(check_interval)
return False
def start_steam() -> bool:
"""Attempt to start Steam using the exact methods from existing working logic."""
env = _get_clean_subprocess_env()
def _start_steam_nak_style(is_steamdeck_flag=False, is_flatpak_flag=False, env_override=None) -> bool:
"""
Start Steam using a simplified NaK-style restart (single command, no env cleanup).
CRITICAL: Do NOT use start_new_session - Steam needs to inherit the session
to connect to display/tray. Ensure all GUI environment variables are preserved.
"""
env = env_override if env_override is not None else os.environ.copy()
# Log critical GUI variables for debugging
gui_vars = ['DISPLAY', 'WAYLAND_DISPLAY', 'XDG_SESSION_TYPE', 'DBUS_SESSION_BUS_ADDRESS', 'XDG_RUNTIME_DIR']
for var in gui_vars:
if var in env:
logger.debug(f"NaK-style restart: {var}={env[var][:50] if len(str(env[var])) > 50 else env[var]}")
else:
logger.warning(f"NaK-style restart: {var} is NOT SET - Steam GUI may fail!")
try:
# Try systemd user service (Steam Deck)
if is_steam_deck():
if is_steamdeck_flag:
logger.info("NaK-style restart: Steam Deck detected, restarting via systemctl.")
subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
elif is_flatpak_flag:
logger.info("NaK-style restart: Flatpak Steam detected, running flatpak command.")
subprocess.Popen(["flatpak", "run", "com.valvesoftware.Steam"],
env=env, stderr=subprocess.DEVNULL)
else:
logger.info("NaK-style restart: launching Steam directly (inheriting session for GUI).")
# NaK uses simple "steam" command without -foreground flag
# Do NOT use start_new_session - Steam needs session access for GUI
# Use shell=True to ensure proper environment inheritance
# This helps with GUI display access on some systems
subprocess.Popen("steam", shell=True, env=env)
time.sleep(5)
# Use steamwebhelper for detection (actual Steam process, not steam-powerbuttond)
check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
if check_result.returncode == 0:
logger.info("NaK-style restart detected running Steam process.")
return True
logger.warning("NaK-style restart did not detect Steam process after launch.")
return False
except FileNotFoundError as exc:
logger.error(f"NaK-style restart command not found: {exc}")
return False
except Exception as exc:
logger.error(f"NaK-style restart encountered an error: {exc}")
return False
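Editor's note: STRATEGY_JACKIFY, STRATEGY_NAK_SIMPLE, _get_restart_strategy() and _strategy_label() are referenced below but defined outside the hunks shown here. A hedged sketch of one possible shape (the environment variable name is an assumption, not the project's actual mechanism):
import os
STRATEGY_JACKIFY = "jackify"
STRATEGY_NAK_SIMPLE = "nak_simple"
def _get_restart_strategy() -> str:
    # Assumed selector: fall back to the classic Jackify behaviour unless overridden.
    value = os.environ.get("JACKIFY_STEAM_RESTART_STRATEGY", STRATEGY_JACKIFY).strip().lower()
    return STRATEGY_NAK_SIMPLE if value == STRATEGY_NAK_SIMPLE else STRATEGY_JACKIFY
def _strategy_label(strategy: str) -> str:
    # Human-readable label used in progress messages.
    return "NaK-style (simple)" if strategy == STRATEGY_NAK_SIMPLE else "Jackify (cleaned environment)"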
def start_steam(is_steamdeck_flag=None, is_flatpak_flag=None, env_override=None, strategy: str = STRATEGY_JACKIFY) -> bool:
"""
Attempt to start Steam using the exact methods from existing working logic.
Args:
is_steamdeck_flag: Optional pre-detected Steam Deck status
is_flatpak_flag: Optional pre-detected Flatpak Steam status
env_override: Optional environment dictionary for subprocess calls
strategy: Restart strategy identifier
"""
if strategy == STRATEGY_NAK_SIMPLE:
return _start_steam_nak_style(
is_steamdeck_flag=is_steamdeck_flag,
is_flatpak_flag=is_flatpak_flag,
env_override=env_override or os.environ.copy(),
)
env = env_override if env_override is not None else _get_clean_subprocess_env()
# Use provided flags or detect
_is_steam_deck = is_steamdeck_flag if is_steamdeck_flag is not None else is_steam_deck()
_is_flatpak = is_flatpak_flag if is_flatpak_flag is not None else is_flatpak_steam()
logger.info(
"Starting Steam (strategy=%s, steam_deck=%s, flatpak=%s)",
strategy,
_is_steam_deck,
_is_flatpak,
)
try:
# Try systemd user service (Steam Deck) - HIGHEST PRIORITY
if _is_steam_deck:
logger.debug("Using systemctl restart for Steam Deck.")
subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
return True
# Use startup methods with only -silent flag (no -minimized or -no-browser)
# Check if Flatpak Steam (only if not Steam Deck)
if _is_flatpak:
logger.info("Flatpak Steam detected - trying flatpak run command first")
try:
# Try without flags first (most reliable for Ubuntu/PopOS)
logger.debug("Executing: flatpak run com.valvesoftware.Steam")
subprocess.Popen(["flatpak", "run", "com.valvesoftware.Steam"],
env=env, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
time.sleep(7) # Give Flatpak more time to start
# For Flatpak Steam, check for the flatpak process, not steamwebhelper
check_result = subprocess.run(['pgrep', '-f', 'com.valvesoftware.Steam'], capture_output=True, timeout=10, env=env)
if check_result.returncode == 0:
logger.info("Flatpak Steam started successfully")
return True
else:
logger.warning("Flatpak Steam not detected after launch - will NOT fall back to prevent conflicts")
return False # Flatpak Steam must use flatpak command, don't fall back
except Exception as e:
logger.error(f"Flatpak Steam start failed: {e}")
return False # Flatpak Steam must use flatpak command, don't fall back
# Use startup methods with -foreground flag to ensure GUI opens
start_methods = [
{"name": "Popen", "cmd": ["steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "env": env}},
{"name": "setsid", "cmd": ["setsid", "steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "env": env}},
{"name": "nohup", "cmd": ["nohup", "steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "preexec_fn": os.setpgrp, "env": env}}
{"name": "Popen", "cmd": ["steam", "-foreground"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "env": env}},
{"name": "setsid", "cmd": ["setsid", "steam", "-foreground"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "env": env}},
{"name": "nohup", "cmd": ["nohup", "steam", "-foreground"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "preexec_fn": os.setpgrp, "env": env}}
]
for method in start_methods:
@@ -142,7 +316,8 @@ def start_steam() -> bool:
if process is not None:
logger.info(f"Initiated Steam start with {method_name}.")
time.sleep(5) # Wait 5 seconds as in existing logic
check_result = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
# Use steamwebhelper for detection (actual Steam process, not steam-powerbuttond)
check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
if check_result.returncode == 0:
logger.info(f"Steam process detected after using {method_name}. Proceeding to wait phase.")
return True
@@ -160,106 +335,149 @@ def start_steam() -> bool:
logger.error(f"Error starting Steam: {e}")
return False
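Editor's note: a short usage sketch for the new start_steam() signature, assuming it is called from inside this module with flags detected up front (mirroring the call made from robust_steam_restart() below):
_is_deck = is_steam_deck()
_is_flatpak = is_flatpak_steam()
if not start_steam(is_steamdeck_flag=_is_deck, is_flatpak_flag=_is_flatpak, strategy=STRATEGY_JACKIFY):
    logger.warning("Steam start was issued but no steamwebhelper process was detected yet")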
def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = None, timeout: int = 60) -> bool:
def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = None, timeout: int = 60, system_info=None) -> bool:
"""
Robustly restart Steam across all distros. Returns True on success, False on failure.
Optionally accepts a progress_callback(message: str) for UI feedback.
Uses aggressive pkill approach for maximum reliability.
Args:
progress_callback: Optional callback for progress updates
timeout: Timeout in seconds for restart operation
system_info: Optional SystemInfo object with pre-detected Steam installation types
"""
env = _get_clean_subprocess_env()
shutdown_env = _get_clean_subprocess_env()
strategy = _get_restart_strategy()
start_env = shutdown_env if strategy == STRATEGY_JACKIFY else os.environ.copy()
# Use cached detection from system_info if available, otherwise detect
_is_steam_deck = system_info.is_steamdeck if system_info else is_steam_deck()
_is_flatpak = system_info.is_flatpak_steam if system_info else is_flatpak_steam()
def report(msg):
logger.info(msg)
if progress_callback:
progress_callback(msg)
report("Shutting down Steam...")
# Steam Deck: Use systemctl for shutdown (special handling)
if is_steam_deck():
report(f"Steam restart strategy: {_strategy_label(strategy)}")
# Steam Deck: Use systemctl for shutdown (special handling) - HIGHEST PRIORITY
if _is_steam_deck:
try:
report("Steam Deck detected - using systemctl shutdown...")
subprocess.run(['systemctl', '--user', 'stop', 'app-steam@autostart.service'],
timeout=15, check=False, capture_output=True, env=env)
subprocess.run(['systemctl', '--user', 'stop', 'app-steam@autostart.service'],
timeout=15, check=False, capture_output=True, env=shutdown_env)
time.sleep(2)
except Exception as e:
logger.debug(f"systemctl stop failed on Steam Deck: {e}")
# Flatpak Steam: Use flatpak kill command (only if not Steam Deck)
elif _is_flatpak:
try:
report("Flatpak Steam detected - stopping via flatpak...")
subprocess.run(['flatpak', 'kill', 'com.valvesoftware.Steam'],
timeout=15, check=False, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=shutdown_env)
time.sleep(2)
except Exception as e:
logger.debug(f"flatpak kill failed: {e}")
# All systems: Use pkill approach (proven 15/16 test success rate)
try:
# Skip unreliable steam -shutdown, go straight to pkill
pkill_result = subprocess.run(['pkill', 'steam'], timeout=15, check=False, capture_output=True, env=env)
pkill_result = subprocess.run(['pkill', 'steam'], timeout=15, check=False, capture_output=True, env=shutdown_env)
logger.debug(f"pkill steam result: {pkill_result.returncode}")
time.sleep(2)
# Check if Steam is still running
check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=shutdown_env)
if check_result.returncode == 0:
# Force kill if still running
report("Steam still running - force terminating...")
force_result = subprocess.run(['pkill', '-9', 'steam'], timeout=15, check=False, capture_output=True, env=env)
force_result = subprocess.run(['pkill', '-9', 'steam'], timeout=15, check=False, capture_output=True, env=shutdown_env)
logger.debug(f"pkill -9 steam result: {force_result.returncode}")
time.sleep(2)
# Final check
final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=shutdown_env)
if final_check.returncode != 0:
logger.info("Steam processes successfully force terminated.")
else:
report("Failed to terminate Steam processes.")
return False
# Steam might still be running, but proceed anyway - wait phase will verify
logger.warning("Steam processes may still be running after termination attempts. Proceeding to start phase...")
report("Steam shutdown incomplete, but proceeding...")
else:
logger.info("Steam processes successfully terminated.")
except Exception as e:
logger.error(f"Error during Steam shutdown: {e}")
report("Failed to shut down Steam.")
return False
# Don't fail completely on shutdown errors - proceed to start phase
logger.warning(f"Error during Steam shutdown: {e}. Proceeding to start phase anyway...")
report("Steam shutdown had issues, but proceeding...")
report("Steam closed successfully.")
# Start Steam using platform-specific logic
report("Starting Steam...")
# Steam Deck: Use systemctl restart (keep existing working approach)
if is_steam_deck():
if _is_steam_deck:
try:
subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=start_env)
logger.info("Steam Deck: Initiated systemctl restart")
except Exception as e:
logger.error(f"Steam Deck systemctl restart failed: {e}")
report("Failed to restart Steam on Steam Deck.")
return False
else:
# All other distros: Use proven steam -silent method
if not start_steam():
report("Failed to start Steam.")
return False
# All other distros: Use start_steam() which now uses -foreground to ensure GUI opens
steam_started = start_steam(
is_steamdeck_flag=_is_steam_deck,
is_flatpak_flag=_is_flatpak,
env_override=start_env,
strategy=strategy,
)
# Even if start_steam() returns False, Steam might still be starting
# Give it a chance by proceeding to wait phase
if not steam_started:
logger.warning("start_steam() returned False, but proceeding to wait phase in case Steam is starting anyway")
report("Steam start command issued, waiting for process...")
# Wait for Steam to fully initialize using existing logic
# Wait for Steam to fully initialize
# CRITICAL: Use steamwebhelper (actual Steam process), not "steam" (matches steam-powerbuttond, etc.)
report("Waiting for Steam to fully start")
logger.info("Waiting up to 2 minutes for Steam to fully initialize...")
max_startup_wait = 120
logger.info("Waiting up to 3 minutes (180 seconds) for Steam to fully initialize...")
max_startup_wait = 180 # Increased from 120 to 180 seconds (3 minutes) for slower systems
elapsed_wait = 0
initial_wait_done = False
last_status_log = 0 # Track when we last logged status
while elapsed_wait < max_startup_wait:
try:
result = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
# Log status every 30 seconds so user knows we're still waiting
if elapsed_wait - last_status_log >= 30:
remaining = max_startup_wait - elapsed_wait
logger.info(f"Still waiting for Steam... ({elapsed_wait}s elapsed, {remaining}s remaining)")
if progress_callback:
progress_callback(f"Waiting for Steam... ({elapsed_wait}s / {max_startup_wait}s)")
last_status_log = elapsed_wait
# Use steamwebhelper for detection (matches shutdown logic)
result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=start_env)
if result.returncode == 0:
if not initial_wait_done:
logger.info("Steam process detected. Waiting additional time for full initialization...")
logger.info(f"Steam process detected at {elapsed_wait}s. Waiting additional time for full initialization...")
initial_wait_done = True
time.sleep(5)
elapsed_wait += 5
if initial_wait_done and elapsed_wait >= 15:
final_check = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
# Require at least 20 seconds of stable detection (increased from 15)
if initial_wait_done and elapsed_wait >= 20:
final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=start_env)
if final_check.returncode == 0:
report("Steam started successfully.")
logger.info("Steam confirmed running after wait.")
logger.info(f"Steam confirmed running after {elapsed_wait}s wait.")
return True
else:
logger.warning("Steam process disappeared during final initialization wait.")
break
logger.warning("Steam process disappeared during final initialization wait, continuing to wait...")
# Don't break - continue waiting in case Steam is still starting
initial_wait_done = False # Reset to allow re-detection
else:
logger.debug(f"Steam process not yet detected. Waiting... ({elapsed_wait + 5}s)")
time.sleep(5)
@@ -269,6 +487,7 @@ def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = No
time.sleep(5)
elapsed_wait += 5
report("Steam did not start within timeout.")
logger.error("Steam failed to start/initialize within the allowed time.")
# Only reach here if we've waited the full duration
report(f"Steam did not start within {max_startup_wait}s timeout.")
logger.error(f"Steam failed to start/initialize within the allowed time ({elapsed_wait}s elapsed).")
return False
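Editor's note: a minimal caller sketch for robust_steam_restart(), assuming only textual progress is needed (for example from a CLI); system_info is omitted here, so detection falls back to is_steam_deck() / is_flatpak_steam():
def _print_progress(message: str) -> None:
    print(f"[steam-restart] {message}")
if not robust_steam_restart(progress_callback=_print_progress, timeout=60):
    print("Steam restart failed - see the Jackify log for details.")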

View File

@@ -103,15 +103,33 @@ class UpdateService:
# Determine if this is a delta update
is_delta = '.delta' in download_url or 'delta' in download_url.lower()
return UpdateInfo(
version=latest_version,
tag_name=release_data['tag_name'],
release_date=release_data['published_at'],
changelog=release_data.get('body', ''),
download_url=download_url,
file_size=file_size,
is_delta_update=is_delta
)
# Safety checks to prevent segfault
try:
# Sanitize string fields
safe_version = str(latest_version) if latest_version else ""
safe_tag = str(release_data.get('tag_name', ''))
safe_date = str(release_data.get('published_at', ''))
safe_changelog = str(release_data.get('body', ''))[:1000] # Limit size
safe_url = str(download_url)
logger.debug(f"Creating UpdateInfo for version {safe_version}")
update_info = UpdateInfo(
version=safe_version,
tag_name=safe_tag,
release_date=safe_date,
changelog=safe_changelog,
download_url=safe_url,
file_size=file_size,
is_delta_update=is_delta
)
logger.debug(f"UpdateInfo created successfully")
return update_info
except Exception as e:
logger.error(f"Failed to create UpdateInfo: {e}")
return None
else:
logger.warning(f"No AppImage found in release {latest_version}")
@@ -173,9 +191,14 @@ class UpdateService:
def check_worker():
try:
update_info = self.check_for_updates()
logger.debug(f"check_worker: Received update_info: {update_info}")
logger.debug(f"check_worker: About to call callback...")
callback(update_info)
logger.debug(f"check_worker: Callback completed")
except Exception as e:
logger.error(f"Error in background update check: {e}")
import traceback
logger.error(f"Traceback: {traceback.format_exc()}")
callback(None)
thread = threading.Thread(target=check_worker, daemon=True)
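Editor's note: the pattern above (blocking check on a daemon thread, callback receives the result or None on failure) in a self-contained sketch:
import threading
def run_check_in_background(check, callback):
    """Run a blocking check() off the UI thread and hand its result to callback."""
    def worker():
        try:
            callback(check())
        except Exception:
            callback(None)  # mirror the diff: failures surface as None
    threading.Thread(target=worker, daemon=True).start()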
@@ -248,9 +271,9 @@ class UpdateService:
total_size = int(response.headers.get('content-length', 0))
downloaded_size = 0
# Create update directory in user's home directory
home_dir = Path.home()
update_dir = home_dir / "Jackify" / "updates"
# Create update directory in user's data directory
from jackify.shared.paths import get_jackify_data_dir
update_dir = get_jackify_data_dir() / "updates"
update_dir.mkdir(parents=True, exist_ok=True)
temp_file = update_dir / f"Jackify-{update_info.version}.AppImage"
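Editor's note: get_jackify_data_dir() comes from jackify.shared.paths and is not part of this diff. Purely as an assumption, an XDG-style implementation could look like:
import os
from pathlib import Path
def get_jackify_data_dir() -> Path:
    # Assumed layout: honour XDG_DATA_HOME, otherwise ~/.local/share/jackify.
    base = os.environ.get("XDG_DATA_HOME", str(Path.home() / ".local" / "share"))
    return Path(base) / "jackify"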
@@ -322,51 +345,78 @@ class UpdateService:
Path to helper script, or None if creation failed
"""
try:
# Create update directory in user's home directory
home_dir = Path.home()
update_dir = home_dir / "Jackify" / "updates"
# Create update directory in user's data directory
from jackify.shared.paths import get_jackify_data_dir
update_dir = get_jackify_data_dir() / "updates"
update_dir.mkdir(parents=True, exist_ok=True)
helper_script = update_dir / "update_helper.sh"
script_content = f'''#!/bin/bash
# Jackify Update Helper Script
# This script replaces the current AppImage with the new version
# This script safely replaces the current AppImage with the new version
CURRENT_APPIMAGE="{current_appimage}"
NEW_APPIMAGE="{new_appimage}"
TEMP_NAME="$CURRENT_APPIMAGE.updating"
echo "Jackify Update Helper"
echo "Waiting for Jackify to exit..."
# Wait for Jackify to exit (give it a few seconds)
sleep 3
# Wait longer for Jackify to fully exit and unmount
sleep 5
echo "Replacing AppImage..."
echo "Validating new AppImage..."
# Backup current version (optional)
# Validate new AppImage exists and is executable
if [ ! -f "$NEW_APPIMAGE" ]; then
echo "ERROR: New AppImage not found: $NEW_APPIMAGE"
exit 1
fi
# Test that new AppImage can execute --version
if ! timeout 10 "$NEW_APPIMAGE" --version >/dev/null 2>&1; then
echo "ERROR: New AppImage failed validation test"
exit 1
fi
echo "New AppImage validated successfully"
echo "Performing safe replacement..."
# Backup current version
if [ -f "$CURRENT_APPIMAGE" ]; then
cp "$CURRENT_APPIMAGE" "$CURRENT_APPIMAGE.backup"
fi
# Replace with new version
if cp "$NEW_APPIMAGE" "$CURRENT_APPIMAGE"; then
chmod +x "$CURRENT_APPIMAGE"
echo "Update completed successfully!"
# Safe replacement: copy to temp name first, then atomic move
if cp "$NEW_APPIMAGE" "$TEMP_NAME"; then
chmod +x "$TEMP_NAME"
# Clean up temporary file
rm -f "$NEW_APPIMAGE"
# Restart Jackify
echo "Restarting Jackify..."
exec "$CURRENT_APPIMAGE"
else
echo "Update failed - could not replace AppImage"
# Restore backup if replacement failed
if [ -f "$CURRENT_APPIMAGE.backup" ]; then
mv "$CURRENT_APPIMAGE.backup" "$CURRENT_APPIMAGE"
echo "Restored original AppImage"
# Atomic move to replace
if mv "$TEMP_NAME" "$CURRENT_APPIMAGE"; then
echo "Update completed successfully!"
# Clean up
rm -f "$NEW_APPIMAGE"
rm -f "$CURRENT_APPIMAGE.backup"
# Restart Jackify
echo "Restarting Jackify..."
sleep 1
exec "$CURRENT_APPIMAGE"
else
echo "ERROR: Failed to move updated AppImage"
rm -f "$TEMP_NAME"
# Restore backup
if [ -f "$CURRENT_APPIMAGE.backup" ]; then
mv "$CURRENT_APPIMAGE.backup" "$CURRENT_APPIMAGE"
echo "Restored original AppImage"
fi
exit 1
fi
else
echo "ERROR: Failed to copy new AppImage"
exit 1
fi
# Clean up this script
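Editor's note: the helper script above has to outlive the Jackify process it replaces. One way the hand-off could be done from Python (an illustration of the detachment, not the service's actual launch code):
import subprocess
def launch_update_helper(helper_script: str) -> None:
    # Detach so the script keeps running after Jackify exits; discard its output.
    subprocess.Popen(
        ["/bin/bash", str(helper_script)],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
        start_new_session=True,
    )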

View File

@@ -0,0 +1,3 @@
"""Helper utilities for backend services."""

View File

@@ -0,0 +1,46 @@
"""
Utilities for detecting Nexus Premium requirement messages in engine output.
"""
from __future__ import annotations
_KEYWORD_PHRASES = (
"buy nexus premium",
"requires nexus premium",
"requires a nexus premium",
"nexus premium is required",
"nexus premium required",
"nexus mods premium is required",
"manual download", # Evaluated with additional context
)
def is_non_premium_indicator(line: str) -> tuple[bool, str | None]:
"""
Return True if the engine output line indicates a Nexus non-premium scenario.
Args:
line: Raw line emitted from the jackify-engine process.
Returns:
Tuple of (is_premium_error: bool, matched_pattern: str | None)
"""
if not line:
return False, None
normalized = line.strip().lower()
if not normalized:
return False, None
# Direct phrase detection
for phrase in _KEYWORD_PHRASES[:6]:
if phrase in normalized:
return True, phrase
# Manual download + Nexus URL implies premium requirement in current workflows.
if "manual download" in normalized and ("nexusmods.com" in normalized or "nexus mods" in normalized):
return True, "manual download + nexusmods.com"
return False, None
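Editor's note: a quick usage sketch for is_non_premium_indicator(); the sample lines are invented for illustration:
sample_lines = [
    "ERROR: this download requires Nexus Premium",
    "Manual download needed: https://www.nexusmods.com/somegame/mods/0000",
    "Extracting archive 3 of 200",
]
for line in sample_lines:
    matched, pattern = is_non_premium_indicator(line)
    if matched:
        print(f"Non-premium scenario detected ({pattern}): {line}")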

BIN (Normal file → Executable file; binary contents not shown):
jackify/engine/Microsoft.CSharp.dll
jackify/engine/System.Collections.Concurrent.dll
jackify/engine/System.Collections.Immutable.dll
jackify/engine/System.Collections.NonGeneric.dll
jackify/engine/System.Collections.Specialized.dll
jackify/engine/System.Collections.dll
jackify/engine/System.ComponentModel.EventBasedAsync.dll
jackify/engine/System.ComponentModel.Primitives.dll
jackify/engine/System.ComponentModel.TypeConverter.dll
jackify/engine/System.ComponentModel.dll
jackify/engine/System.Console.dll
jackify/engine/System.Data.Common.dll
jackify/engine/System.Diagnostics.FileVersionInfo.dll
jackify/engine/System.Diagnostics.Process.dll
jackify/engine/System.Diagnostics.StackTrace.dll
jackify/engine/System.Diagnostics.TraceSource.dll
jackify/engine/System.Drawing.Primitives.dll
jackify/engine/System.Drawing.dll
jackify/engine/System.Formats.Asn1.dll
jackify/engine/System.IO.Compression.Brotli.dll
jackify/engine/System.IO.Compression.ZipFile.dll
jackify/engine/System.IO.Compression.dll
jackify/engine/System.IO.FileSystem.DriveInfo.dll
jackify/engine/System.IO.FileSystem.Watcher.dll
jackify/engine/System.IO.MemoryMappedFiles.dll
jackify/engine/System.IO.Pipes.dll
Some files were not shown because too many files have changed in this diff.