Mirror of https://github.com/Omni-guides/Jackify.git (synced 2026-01-17 19:47:00 +01:00)

Compare commits (17 commits)
| SHA1 |
|---|
| 0d84d2f2fe |
| 2511c9334c |
| 5869a896a8 |
| 99fb369d5e |
| a813236e51 |
| a7ed4b2a1e |
| 523681a254 |
| abfca5268f |
| 4de5c7f55d |
| 9c52c0434b |
| e3dc62fdac |
| ce969eba1b |
| fe14e4ecfb |
| 9680814bbb |
| 91ac08afb2 |
| 06bd94d119 |
| 52806f4116 |
CHANGELOG.md (316 changed lines)
@@ -1,5 +1,304 @@
# Jackify Changelog

## v0.2.0.9 - Critical Configuration Fixes

**Release Date:** 2025-12-31

### Bug Fixes

- Fixed AppID conversion bug causing Configure Existing failures
- Fixed missing MessageService import crash in Configure Existing
- Fixed RecursionError in config_handler.py logger
- Fixed winetricks automatic fallback to protontricks (was silently failing)

### Improvements

- Added detailed progress indicators for configuration workflows
- Progress bar now completes at 100% instead of stopping at 95%
- Removed debug logging noise from file progress widget
- Enhanced Premium detection diagnostics for Issue #111
- Flatpak protontricks now auto-granted cache access for faster subsequent installs

---
## v0.2.0.8 - Bug Fixes and Improvements

**Release Date:** 2025-12-29

### Bug Fixes

- Fixed Configure New/Existing/TTW screens missing Activity tab and progress updates
- Fixed cancel/back buttons crashing in Configure workflows

### Improvements

- Install directory now auto-appends modlist name when selected from gallery

### Known Issues

- Mod filter temporarily disabled in gallery due to a technical issue (tag and game filters still work)

---
## v0.2.0.7 - Critical Auth Fix

**Release Date:** 2025-12-28

### Critical Bug Fixes

- **OAuth Token Loss**: Fixed version comparison bug that was deleting OAuth tokens every time settings were saved (affects users on v0.2.0.4+)
- Fixed internal import paths for improved stability

---
## v0.2.0.6 - Premium Detection and Engine Update

**Release Date:** 2025-12-28

**IMPORTANT:** If you are on v0.2.0.5, automatic updates will not work. You must manually download and install v0.2.0.6.

### Engine Updates

- **jackify-engine 0.4.4**: Latest engine version with improvements

### Critical Bug Fixes

- **Auto-Update System**: Fixed broken update dialog import that prevented automatic updates
- **Premium Detection**: Fixed false Premium errors caused by an overly broad detection pattern triggering on jackify-engine 0.4.3's userinfo JSON output
- **Custom Data Directory**: Fixed AppImage always creating ~/Jackify on startup, even when the user configured a custom jackify_data_dir
- **Proton Auto-Selection**: Fixed auto-selection writing an invalid "auto" string to config on detection failure

### Quality Improvements

- Added pre-build import validator to prevent broken imports from reaching production

---
## v0.2.0.5 - Emergency OAuth Fix

**Release Date:** 2025-12-24

### Critical Bug Fixes

- **OAuth Authentication**: Fixed regression in v0.2.0.4 that prevented OAuth token encryption/decryption, breaking Nexus authentication for users

---
## v0.2.0.4 - Bugfixes & Improvements

**Release Date:** 2025-12-23

### Engine Updates

- **jackify-engine 0.4.3**: Fixed case sensitivity issues and archive extraction crashes, and improved error messages

### Bug Fixes

- Fixed modlist gallery metadata showing outdated versions (now always fetches fresh data)
- Fixed hardcoded ~/Jackify paths preventing custom data directory settings
- Fixed update check blocking GUI startup
- Improved Steam restart reliability (3-minute timeout, better error handling)
- Fixed Protontricks Flatpak installation on Steam Deck

### Backend Changes

- GPU texture conversion now always enabled (config setting deprecated)

### UI Improvements

- Redesigned modlist detail view to show more of the hero image
- Improved gallery loading with animated feedback and faster initial load

---
## v0.2.0.3 - Engine Bugfix & Settings Cleanup

**Release Date:** 2025-12-21

### Engine Updates

- **jackify-engine 0.4.3**: Bugfix release

### UI Improvements

- **Settings Dialog**: Removed GPU disable toggle - GPU usage is now always enabled (the disable option was non-functional)

---
## v0.2.0.2 - Emergency Engine Bugfix

**Release Date:** 2025-12-18

### Engine Updates

- **jackify-engine 0.4.2**: Fixed an out-of-memory (OOM) issue in jackify-engine 0.4.1 caused by array sizing

---
## v0.2.0.1 - Critical Bugfix Release

**Release Date:** 2025-12-15

### Critical Bug Fixes

- **Directory Safety Validation**: Fixed data loss bug where directories containing only a `downloads/` folder were incorrectly identified as valid modlist directories
- **Flatpak Steam Restart**: Fixed Steam restart failures on Ubuntu/PopOS by removing the incompatible `-foreground` flag and increasing the startup wait

### Bug Fixes

- **External Links**: Fixed Ko-fi, GitHub, and Nexus links not opening on some distros by using xdg-open with a clean environment
- **TTW Console Output**: Filtered standalone "OK"/"DONE" noise messages from the TTW installation console
- **Activity Window**: Fixed progress display updates in the TTW Installer and other workflows
- **Wine Component Installation**: Added status feedback during component installation showing the component list
- **Progress Parser**: Added defensive checks to prevent segfaults from malformed engine output
- **Progress Parser Speed Info**: Fixed "'OperationType' object has no attribute 'lower'" error by converting the enum to its string value when extracting speed info from timestamp status patterns (see the sketch after this list)
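For illustration, the enum-to-string conversion the speed-info fix describes. The `OperationType` shown here is a minimal stand-in, not the engine's actual definition:

```python
from enum import Enum

class OperationType(Enum):  # illustrative stand-in for the engine's enum
    DOWNLOADING = "Downloading"
    EXTRACTING = "Extracting"

op = OperationType.DOWNLOADING
# Before: op.lower() raises AttributeError ('OperationType' object has no attribute 'lower')
# After: convert the enum to its string value first, then lowercase it
speed_label = op.value.lower()  # "downloading"
```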
### Improvements

- **Default Wine Components**: Added dxvk to the default component list for better graphics compatibility
- **TTW Installer UI**: Show version numbers in status displays

### Engine Updates

- **jackify-engine 0.4.1**: Download reliability fixes, BSA case sensitivity handling, external drive I/O limiting, GPU detection caching, and texture processing performance improvements

---
## v0.2.0 - Modlist Gallery, OAuth Authentication & Performance Improvements

**Release Date:** 2025-12-06

### Major Features

#### Modlist Selection Gallery

Complete overhaul of modlist selection (first pass):

**Core Features:**

- Card-based Modlist Selection browser with modlist images, titles, authors and metadata
- Game-specific filtering automatically applied based on the selected game type
- Details per card: download/install/total sizes, tags, version, badges
- Async image loading from GitHub with local 7-day caching
- Detail view with full descriptions, banner images, and external links
- Selected modlist automatically populates the Install Modlist workflow

**Search and Filtering:**

- Text search across modlist names and descriptions
- Multi-select tag filtering with normalized tags
- Show Official Only, Show NSFW, and Hide Unavailable toggles
- Mod search capability - find modlists containing specific Nexus mods
- Randomised card ordering

**Performance:**

- Gallery images load from cache (see the sketch after this list)
- Background metadata and image preloading when the Install Modlist screen opens
- Efficient rendering - cards are created once; filters only toggle visibility
- Non-blocking UI with concurrent image downloads
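A minimal sketch of the kind of freshness check a 7-day image cache implies. The cache directory and helper name are assumptions for illustration, not Jackify's actual implementation:

```python
import time
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "jackify" / "gallery_images"  # assumed location
MAX_AGE_SECONDS = 7 * 24 * 60 * 60  # 7 days

def cached_image(name: str) -> Path | None:
    """Return the cached image path if it exists and is younger than 7 days."""
    path = CACHE_DIR / name
    if path.is_file() and (time.time() - path.stat().st_mtime) < MAX_AGE_SECONDS:
        return path
    # Otherwise the caller downloads the image from GitHub and writes it back into the cache.
    return None
```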
**Steam Deck Optimized:**

- Dynamic card sizing (e.g. 250x270 on Steam Deck, larger on desktop)
- Responsive grid layout (up to 4 columns on large screens, 3 on Steam Deck)
- Optimized spacing and padding for 1280x800 displays
#### OAuth 2.0 Authentication

Modern authentication for Nexus Mods with secure token management:

- One-click browser-based authorization with PKCE security (a sketch of the verifier/challenge step follows this list)
- Automatic token refresh with encrypted storage
- Authorisation status indicator on the Install Modlist screen
- Works in both GUI and CLI workflows
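For reference, a minimal sketch of the PKCE verifier/challenge step used in an OAuth 2.0 authorization-code flow (RFC 7636, S256 method). The helper name is illustrative; Jackify's actual implementation may differ:

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    """Return (code_verifier, code_challenge) per RFC 7636 using the S256 method."""
    # 32 random bytes -> 43-character URL-safe verifier with padding stripped
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
# The challenge is sent with the browser authorization request; the verifier is
# presented later when exchanging the authorization code for tokens.
```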
#### Compact Mode UI Redesign

Streamlined interface with dynamic window management:

- Default compact mode with an optional Details view
- Activity window tab (default) across all workflow screens
- Process Monitor tab still available
- Show Details toggle for console output when needed
### Critical Fixes

#### Replaced TTW Installer

- Replaced the previous TTW Installer due to complexities with its config file

#### GPU Texture Conversion (jackify-engine 0.4.0)

- Fixed GPU not being used for BC7/BC6H texture conversions
- Previous versions fell back to CPU-only processing despite GPU availability
- Added GPU toggle in Settings (enabled by default)

#### Winetricks Compatibility & Protontricks

- Fixed bundled winetricks path incompatibility
- Attempted fix for cases where winetricks failed to download components
- For now, Jackify still defaults to bundled winetricks (Protontricks toggle in Settings)

#### Steam Restart Reliability

- Enhanced Steam restart so that it should now work more reliably on all distros
- Fixed Flatpak detection blocking normal Steam start methods
### Technical Improvements

- Proton version usage clarified: Install Proton for installation/texture processing, Game Proton for shortcuts
- Centralised Steam detection in SystemInfo
- ConfigHandler refactored to always read fresh from disk
- Removed obsolete dotnet4.x code
- Enhanced Flatpak Steam compatdata detection with proper VDF parsing

### Bug Fixes

- TTW installation UI performance (batched output processing, non-blocking operations)
- Activity window animations (removed custom timers, Qt native rendering)
- Timer reset when returning from the TTW screen
- Fixed bandwidth limit KB/s to bytes conversion
- Fixed AttributeError in AutomatedPrefixService.restart_steam()

### Engine Updates

- jackify-engine 0.4.0 with GPU texture conversion fixes and refactored file progress reporting

---
## v0.1.7.1 - Wine Component Verification & Flatpak Steam Fixes

**Release Date:** November 11, 2025

### Critical Bug Fixes

- **FIXED: Wine Component Installation Verification** - Jackify now verifies components are actually installed before reporting success

### Bug Fixes

- **Steam Deck SD Card Paths**: Fixed ModOrganizer.ini path corruption on SD card installs using regex-based stripping
- **Flatpak Steam Detection**: Fixed libraryfolders.vdf path detection for Flatpak Steam installations
- **Flatpak Steam Restart**: Steam restart service now properly detects and controls Flatpak Steam
- **Path Manipulation**: Fixed path corruption in Configure Existing/New Modlist (paths with spaces)

### Improvements

- Added network diagnostics before the winetricks fallback to protontricks
- Enhanced component installation logging with verification status
- Added GE-Proton 10-14 recommendation to the success message (ENB compatibility note for Valve's Proton 10)

### Engine Updates

- **jackify-engine 0.3.18**: Archive extraction fixes for Windows symlinks, bandwidth limiting fix, improved error messages

---
## v0.1.7 - TTW Automation & Bug Fixes

**Release Date:** November 1, 2025

### Major Features

- **TTW (Tale of Two Wastelands) Installation and Automation**
  - TTW installation using the Hoolamike application - https://github.com/Niedzwiedzw/hoolamike
  - Automated workflow for TTW installation and integration into FNV modlists, where possible
  - Automatic detection of TTW-compatible modlists
  - User prompt after modlist installation with the option to install TTW
  - Automated integration: file copying, load order updates, modlist.txt updates
  - Available in both CLI and GUI workflows

### Bug Fixes

- **Registry UTF-8 Decode Error**: Fixed crash during dotnet4.x installation when Wine outputs binary data
- **Python 3.10 Compatibility**: Fixed startup crash on Python 3.10 systems
- **TTW Steam Deck Layout**: Fixed window sizing issues on Steam Deck when entering/exiting the TTW screen
- **TTW Integration Status**: Added visible status banner updates during modlist integration for collapsed mode
- **TTW Accidental Input Protection**: Added a 3-second countdown to the TTW installation prompt to prevent accidental dismissal
- **Settings Persistence**: Settings changes now persist correctly across workflows
- **Steam Deck Keyboard Input**: Fixed keyboard input failure on Steam Deck
- **Application Close Crash**: Fixed crash when closing the application on Steam Deck
- **Winetricks Diagnostics**: Enhanced error detection with automatic fallback

---
## v0.1.6.6 - AppImage Bundling Fix

**Release Date:** October 29, 2025

### Bug Fixes

- **Fixed AppImage bundling issue** causing legacy code to be retained in rare circumstances

---
## v0.1.6.5 - Steam Deck SD Card Path Fix

**Release Date:** October 27, 2025

### Bug Fixes

- **Fixed Steam Deck SD card path manipulation** when jackify-engine is installed
- **Fixed Ubuntu Qt platform plugin errors** by bundling XCB libraries
- **Added Flatpak GE-Proton detection** and protontricks installation choices
- **Extended Steam Deck SD card timeouts** for slower I/O operations

---
## v0.1.6.4 - Flatpak Steam Detection Hotfix

**Release Date:** October 24, 2025

### Critical Bug Fixes

- **FIXED: Flatpak Steam Detection**: Added support for the `/data/Steam/` directory structure used by some Flatpak Steam installations
- **IMPROVED: Steam Path Detection**: Now checks all known Flatpak Steam directory structures for maximum compatibility

---
## v0.1.6.3 - Emergency Hotfix

**Release Date:** October 23, 2025
@@ -404,6 +703,23 @@
- **Clean Architecture**: Removed obsolete service imports, initializations, and cleanup methods
- **Code Quality**: Eliminated "tombstone comments" and unused service references

### Deferred Features (Available in a Future Release)

#### OAuth 2.0 Authentication for Nexus Mods

**Status:** Fully implemented but disabled pending Nexus Mods approval

The OAuth 2.0 authentication system has been fully developed and tested, but is temporarily disabled in v0.1.8 while we await approval from Nexus Mods for our OAuth application. The backend code remains intact and will be re-enabled immediately upon approval.

**Features (ready for deployment):**
- **Secure OAuth 2.0 + PKCE Flow**: Modern authentication to replace API key dependency
- **Encrypted Token Storage**: Tokens stored using Fernet encryption with automatic refresh (see the sketch after this list)
- **GUI Integration**: Clean status display on the Install Modlist screen with authorize/revoke functionality
- **CLI Integration**: OAuth menu in Additional Tasks for command-line users
- **API Key Fallback**: Optional legacy API key support (configurable in Settings)
- **Unified Auth Service**: Single authentication layer supporting both OAuth and API key methods
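A minimal sketch of what Fernet-backed token storage can look like, assuming the `cryptography` package; the file location and function names are illustrative, not Jackify's actual code:

```python
import json
from pathlib import Path
from cryptography.fernet import Fernet

TOKEN_FILE = Path.home() / ".config" / "jackify" / "nexus-oauth.json"  # assumed location

def save_tokens(tokens: dict, key: bytes) -> None:
    """Encrypt the token payload and write it with restrictive permissions."""
    blob = Fernet(key).encrypt(json.dumps(tokens).encode("utf-8"))
    TOKEN_FILE.parent.mkdir(parents=True, exist_ok=True)
    TOKEN_FILE.write_bytes(blob)
    TOKEN_FILE.chmod(0o600)

def load_tokens(key: bytes) -> dict:
    """Decrypt and return the stored token payload."""
    return json.loads(Fernet(key).decrypt(TOKEN_FILE.read_bytes()))
```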
**Current Limitation:** Awaiting Nexus approval for the `jackify://oauth/callback` custom URI. Once approved, OAuth will be enabled as the primary authentication method with API key as optional fallback.

### Technical Details
- **Single Shortcut Creation Path**: All workflows now use `run_working_workflow()` → `create_shortcut_with_native_service()`
- **Service Layer Cleanup**: Removed dual codepath architecture in favor of proven automated workflows
@@ -77,6 +77,9 @@ Currently, there are two main functions that Jackify will perform at this stage
- **FUSE** (required for AppImage execution)
  - Pre-installed on most Linux distributions
  - If the AppImage fails to run, install FUSE using your distribution's package manager
- **Ubuntu/Debian only**: Qt platform plugin library
  - `sudo apt install libxcb-cursor-dev`
  - Required for the Qt GUI to initialize properly

### Installation
@@ -5,4 +5,4 @@ This package provides both CLI and GUI interfaces for managing
Wabbajack modlists natively on Linux systems.
"""

__version__ = "0.1.6.3"
__version__ = "0.2.0.9"
@@ -30,6 +30,8 @@ def _get_user_proton_version():
|
||||
from jackify.backend.handlers.wine_utils import WineUtils
|
||||
|
||||
config_handler = ConfigHandler()
|
||||
# Use Install Proton (not Game Proton) for installation/texture processing
|
||||
# get_proton_path() returns the Install Proton path
|
||||
user_proton_path = config_handler.get_proton_path()
|
||||
|
||||
if user_proton_path == 'auto':
|
||||
@@ -90,15 +92,15 @@ def get_jackify_engine_path():
|
||||
logger.debug(f"Using engine from environment variable: {env_engine_path}")
|
||||
return env_engine_path
|
||||
|
||||
# Priority 2: PyInstaller bundle (most specific detection)
|
||||
# Priority 2: Frozen bundle (most specific detection)
|
||||
if getattr(sys, 'frozen', False) and hasattr(sys, '_MEIPASS'):
|
||||
# Running in a PyInstaller bundle
|
||||
# Running inside a frozen bundle
|
||||
# Engine is expected at <bundle_root>/jackify/engine/jackify-engine
|
||||
engine_path = os.path.join(sys._MEIPASS, 'jackify', 'engine', 'jackify-engine')
|
||||
if os.path.exists(engine_path):
|
||||
return engine_path
|
||||
# Fallback: log warning but continue to other detection methods
|
||||
logger.warning(f"PyInstaller engine not found at expected path: {engine_path}")
|
||||
logger.warning(f"Frozen-bundle engine not found at expected path: {engine_path}")
|
||||
|
||||
# Priority 3: Check if THIS process is actually running from Jackify AppImage
|
||||
# (not just inheriting APPDIR from another AppImage like Cursor)
|
||||
@@ -123,7 +125,7 @@ def get_jackify_engine_path():
|
||||
|
||||
# If all else fails, log error and return the source path anyway
|
||||
logger.error(f"jackify-engine not found in any expected location. Tried:")
|
||||
logger.error(f" PyInstaller: {getattr(sys, '_MEIPASS', 'N/A')}/jackify/engine/jackify-engine")
|
||||
logger.error(f" Frozen bundle: {getattr(sys, '_MEIPASS', 'N/A')}/jackify/engine/jackify-engine")
|
||||
logger.error(f" AppImage: {appdir or 'N/A'}/opt/jackify/engine/jackify-engine")
|
||||
logger.error(f" Source: {engine_path}")
|
||||
logger.error("This will likely cause installation failures.")
|
||||
@@ -481,53 +483,76 @@ class ModlistInstallCLI:
|
||||
self.context['download_dir'] = download_dir_path
|
||||
self.logger.debug(f"Download directory context set to: {self.context['download_dir']}")
|
||||
|
||||
# 5. Prompt for Nexus API key (skip if in context and valid)
|
||||
# 5. Get Nexus authentication (OAuth or API key)
|
||||
if 'nexus_api_key' not in self.context or not self.context.get('nexus_api_key'):
|
||||
from jackify.backend.services.api_key_service import APIKeyService
|
||||
api_key_service = APIKeyService()
|
||||
saved_key = api_key_service.get_saved_api_key()
|
||||
api_key = None
|
||||
if saved_key:
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_INFO}A Nexus API Key is already saved.{COLOR_RESET}")
|
||||
use_saved = input(f"{COLOR_PROMPT}Use the saved API key? [Y/n]: {COLOR_RESET}").strip().lower()
|
||||
if use_saved in ('', 'y', 'yes'):
|
||||
api_key = saved_key
|
||||
from jackify.backend.services.nexus_auth_service import NexusAuthService
|
||||
auth_service = NexusAuthService()
|
||||
|
||||
# Get current auth status
|
||||
authenticated, method, username = auth_service.get_auth_status()
|
||||
|
||||
if authenticated:
|
||||
# Already authenticated - use existing auth
|
||||
if method == 'oauth':
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_SUCCESS}Nexus Authentication: Authorized via OAuth{COLOR_RESET}")
|
||||
if username:
|
||||
print(f"{COLOR_INFO}Logged in as: {username}{COLOR_RESET}")
|
||||
elif method == 'api_key':
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_INFO}Nexus Authentication: Using API Key (Legacy){COLOR_RESET}")
|
||||
|
||||
# Get valid token/key
|
||||
api_key = auth_service.ensure_valid_auth()
|
||||
if api_key:
|
||||
self.context['nexus_api_key'] = api_key
|
||||
else:
|
||||
new_key = input(f"{COLOR_PROMPT}Enter a new Nexus API Key (or press Enter to keep the saved one): {COLOR_RESET}").strip()
|
||||
if new_key:
|
||||
api_key = new_key
|
||||
replace = input(f"{COLOR_PROMPT}Replace the saved key with this one? [y/N]: {COLOR_RESET}").strip().lower()
|
||||
if replace == 'y':
|
||||
if api_key_service.save_api_key(api_key):
|
||||
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
|
||||
else:
|
||||
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
|
||||
# Auth expired or invalid - prompt to set up
|
||||
print(f"\n{COLOR_WARNING}Your authentication has expired or is invalid.{COLOR_RESET}")
|
||||
authenticated = False
|
||||
|
||||
if not authenticated:
|
||||
# Not authenticated - offer to set up OAuth
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_WARNING}Nexus Mods authentication is required for downloading mods.{COLOR_RESET}")
|
||||
print(f"\n{COLOR_PROMPT}Would you like to authorize with Nexus now?{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}This will open your browser for secure OAuth authorization.{COLOR_RESET}")
|
||||
|
||||
authorize = input(f"{COLOR_PROMPT}Authorize now? [Y/n]: {COLOR_RESET}").strip().lower()
|
||||
|
||||
if authorize in ('', 'y', 'yes'):
|
||||
# Launch OAuth authorization
|
||||
print(f"\n{COLOR_INFO}Starting OAuth authorization...{COLOR_RESET}")
|
||||
print(f"{COLOR_WARNING}Your browser will open shortly.{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}Note: You may see a security warning about a self-signed certificate.{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}This is normal - click 'Advanced' and 'Proceed' to continue.{COLOR_RESET}")
|
||||
|
||||
def show_message(msg):
|
||||
print(f"\n{COLOR_INFO}{msg}{COLOR_RESET}")
|
||||
|
||||
success = auth_service.authorize_oauth(show_browser_message_callback=show_message)
|
||||
|
||||
if success:
|
||||
print(f"\n{COLOR_SUCCESS}OAuth authorization successful!{COLOR_RESET}")
|
||||
_, _, username = auth_service.get_auth_status()
|
||||
if username:
|
||||
print(f"{COLOR_INFO}Authorized as: {username}{COLOR_RESET}")
|
||||
|
||||
api_key = auth_service.ensure_valid_auth()
|
||||
if api_key:
|
||||
self.context['nexus_api_key'] = api_key
|
||||
else:
|
||||
print(f"{COLOR_INFO}Using new key for this session only. Saved key unchanged.{COLOR_RESET}")
|
||||
print(f"{COLOR_ERROR}Failed to retrieve auth token after authorization.{COLOR_RESET}")
|
||||
return None
|
||||
else:
|
||||
api_key = saved_key
|
||||
else:
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_INFO}A Nexus Mods API key is required for downloading mods.{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}You can get your personal key at: {COLOR_SELECTION}https://www.nexusmods.com/users/myaccount?tab=api{COLOR_RESET}")
|
||||
print(f"{COLOR_WARNING}Your API Key is NOT saved locally. It is used only for this session unless you choose to save it.{COLOR_RESET}")
|
||||
api_key = input(f"{COLOR_PROMPT}Enter Nexus API Key (or 'q' to cancel): {COLOR_RESET}").strip()
|
||||
if not api_key or api_key.lower() == 'q':
|
||||
self.logger.info("User cancelled or provided no API key.")
|
||||
return None
|
||||
save = input(f"{COLOR_PROMPT}Would you like to save this API key for future use? [y/N]: {COLOR_RESET}").strip().lower()
|
||||
if save == 'y':
|
||||
if api_key_service.save_api_key(api_key):
|
||||
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
|
||||
else:
|
||||
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
|
||||
print(f"\n{COLOR_ERROR}OAuth authorization failed.{COLOR_RESET}")
|
||||
return None
|
||||
else:
|
||||
print(f"{COLOR_INFO}Using API key for this session only. It will not be saved.{COLOR_RESET}")
|
||||
|
||||
# Set the API key in context regardless of which path was taken
|
||||
self.context['nexus_api_key'] = api_key
|
||||
self.logger.debug(f"NEXUS_API_KEY is set in environment for engine (presence check).")
|
||||
# User declined OAuth - cancelled
|
||||
print(f"\n{COLOR_INFO}Authorization required to proceed. Installation cancelled.{COLOR_RESET}")
|
||||
self.logger.info("User declined Nexus authorization.")
|
||||
return None
|
||||
self.logger.debug(f"Nexus authentication configured for engine.")
|
||||
|
||||
# Display summary and confirm
|
||||
self._display_summary() # Ensure this method exists or implement it
|
||||
@@ -622,11 +647,23 @@ class ModlistInstallCLI:
|
||||
if isinstance(download_dir_display, tuple):
|
||||
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
|
||||
print(f"Download Directory: {download_dir_display}")
|
||||
|
||||
if self.context.get('nexus_api_key'):
|
||||
print(f"Nexus API Key: [SET]")
|
||||
|
||||
# Show authentication method
|
||||
from jackify.backend.services.nexus_auth_service import NexusAuthService
|
||||
auth_service = NexusAuthService()
|
||||
authenticated, method, username = auth_service.get_auth_status()
|
||||
|
||||
if method == 'oauth':
|
||||
auth_display = f"Nexus Authentication: OAuth"
|
||||
if username:
|
||||
auth_display += f" ({username})"
|
||||
elif method == 'api_key':
|
||||
auth_display = "Nexus Authentication: API Key (Legacy)"
|
||||
else:
|
||||
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
|
||||
# Should never reach here since we validate auth before getting to summary
|
||||
auth_display = "Nexus Authentication: Unknown"
|
||||
|
||||
print(auth_display)
|
||||
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
|
||||
|
||||
def configuration_phase(self):
|
||||
@@ -643,7 +680,8 @@ class ModlistInstallCLI:
|
||||
start_time = time.time()
|
||||
|
||||
# --- BEGIN: TEE LOGGING SETUP & LOG ROTATION ---
|
||||
log_dir = Path.home() / "Jackify" / "logs"
|
||||
from jackify.shared.paths import get_jackify_logs_dir
|
||||
log_dir = get_jackify_logs_dir()
|
||||
log_dir.mkdir(parents=True, exist_ok=True)
|
||||
workflow_log_path = log_dir / "Modlist_Install_workflow.log"
|
||||
# Log rotation: keep last 3 logs, 1MB each (adjust as needed)
|
||||
@@ -719,7 +757,7 @@ class ModlistInstallCLI:
|
||||
# --- End Patch ---
|
||||
|
||||
# Build command
|
||||
cmd = [engine_path, 'install']
|
||||
cmd = [engine_path, 'install', '--show-file-progress']
|
||||
# Determine if this is a local .wabbajack file or an online modlist
|
||||
modlist_value = self.context.get('modlist_value')
|
||||
if modlist_value and modlist_value.endswith('.wabbajack') and os.path.isfile(modlist_value):
|
||||
@@ -771,9 +809,11 @@ class ModlistInstallCLI:
|
||||
else:
|
||||
self.logger.warning(f"File descriptor limit: {message}")
|
||||
|
||||
# Popen now inherits the modified os.environ because env=None
|
||||
# Use cleaned environment to prevent AppImage variable inheritance
|
||||
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
|
||||
clean_env = get_clean_subprocess_env()
|
||||
# Store process reference for cleanup
|
||||
self._current_process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)
|
||||
self._current_process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=clean_env, cwd=engine_dir)
|
||||
proc = self._current_process
|
||||
|
||||
# Read output in binary mode to properly handle carriage returns
|
||||
@@ -1512,9 +1552,21 @@ class ModlistInstallCLI:
|
||||
if isinstance(download_dir_display, tuple):
|
||||
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
|
||||
print(f"Download Directory: {download_dir_display}")
|
||||
|
||||
if self.context.get('nexus_api_key'):
|
||||
print(f"Nexus API Key: [SET]")
|
||||
|
||||
# Show authentication method
|
||||
from jackify.backend.services.nexus_auth_service import NexusAuthService
|
||||
auth_service = NexusAuthService()
|
||||
authenticated, method, username = auth_service.get_auth_status()
|
||||
|
||||
if method == 'oauth':
|
||||
auth_display = f"Nexus Authentication: OAuth"
|
||||
if username:
|
||||
auth_display += f" ({username})"
|
||||
elif method == 'api_key':
|
||||
auth_display = "Nexus Authentication: API Key (Legacy)"
|
||||
else:
|
||||
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
|
||||
# Should never reach here since we validate auth before getting to summary
|
||||
auth_display = "Nexus Authentication: Unknown"
|
||||
|
||||
print(auth_display)
|
||||
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
|
||||
jackify/backend/data/__init__.py (new file, 3 lines)
@@ -0,0 +1,3 @@
"""
Data package for static configuration and reference data.
"""
jackify/backend/data/ttw_compatible_modlists.py (new file, 46 lines)
@@ -0,0 +1,46 @@
"""
TTW-Compatible Modlists Configuration

Defines which Fallout New Vegas modlists support Tale of Two Wastelands.
This whitelist determines when Jackify should offer TTW installation after
a successful modlist installation.
"""

TTW_COMPATIBLE_MODLISTS = {
    # Exact modlist names that support/require TTW
    "exact_matches": [
        "Begin Again",
        "Uranium Fever",
        "The Badlands",
        "Wild Card TTW",
    ],

    # Pattern matching for modlist names (regex)
    "patterns": [
        r".*TTW.*",  # Any modlist with TTW in name
        r".*Tale.*Two.*Wastelands.*",
    ]
}


def is_ttw_compatible(modlist_name: str) -> bool:
    """Check if modlist name matches TTW compatibility criteria

    Args:
        modlist_name: Name of the modlist to check

    Returns:
        bool: True if modlist is TTW-compatible, False otherwise
    """
    import re

    # Check exact matches
    if modlist_name in TTW_COMPATIBLE_MODLISTS['exact_matches']:
        return True

    # Check pattern matches
    for pattern in TTW_COMPATIBLE_MODLISTS['patterns']:
        if re.match(pattern, modlist_name, re.IGNORECASE):
            return True

    return False
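For illustration, a quick usage check of the whitelist helper added in this file; expected results follow from the exact-match list and regex patterns above:

```python
from jackify.backend.data.ttw_compatible_modlists import is_ttw_compatible

assert is_ttw_compatible("Begin Again") is True         # exact match
assert is_ttw_compatible("My Custom TTW List") is True  # matches the ".*TTW.*" pattern
assert is_ttw_compatible("Living Skyrim") is False      # no exact match, no pattern match
```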
@@ -6,12 +6,15 @@ Handles application settings and configuration
|
||||
"""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import json
|
||||
import logging
|
||||
import shutil
|
||||
import re
|
||||
import base64
|
||||
import hashlib
|
||||
from pathlib import Path
|
||||
from typing import Optional
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger(__name__)
|
||||
@@ -20,14 +23,27 @@ logger = logging.getLogger(__name__)
|
||||
class ConfigHandler:
|
||||
"""
|
||||
Handles application configuration and settings
|
||||
Singleton pattern ensures all code shares the same instance
|
||||
"""
|
||||
|
||||
_instance = None
|
||||
_initialized = False
|
||||
|
||||
def __new__(cls):
|
||||
if cls._instance is None:
|
||||
cls._instance = super(ConfigHandler, cls).__new__(cls)
|
||||
return cls._instance
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize configuration handler with default settings"""
|
||||
# Only initialize once (singleton pattern)
|
||||
if ConfigHandler._initialized:
|
||||
return
|
||||
ConfigHandler._initialized = True
|
||||
|
||||
self.config_dir = os.path.expanduser("~/.config/jackify")
|
||||
self.config_file = os.path.join(self.config_dir, "config.json")
|
||||
self.settings = {
|
||||
"version": "0.0.5",
|
||||
"version": "0.2.0",
|
||||
"last_selected_modlist": None,
|
||||
"steam_libraries": [],
|
||||
"resolution": None,
|
||||
@@ -39,19 +55,27 @@ class ConfigHandler:
|
||||
"modlist_install_base_dir": os.path.expanduser("~/Games"), # Configurable base directory for modlist installations
|
||||
"modlist_downloads_base_dir": os.path.expanduser("~/Games/Modlist_Downloads"), # Configurable base directory for downloads
|
||||
"jackify_data_dir": None, # Configurable Jackify data directory (default: ~/Jackify)
|
||||
"use_winetricks_for_components": True, # True = use winetricks (faster), False = use protontricks for all (legacy)
|
||||
"game_proton_path": None # Proton version for game shortcuts (can be any Proton 9+), separate from install proton
|
||||
"use_winetricks_for_components": True, # DEPRECATED: Migrated to component_installation_method. Kept for backward compatibility.
|
||||
"component_installation_method": "winetricks", # "winetricks" (default) or "system_protontricks"
|
||||
"game_proton_path": None, # Proton version for game shortcuts (can be any Proton 9+), separate from install proton
|
||||
"steam_restart_strategy": "jackify", # "jackify" (default) or "nak_simple"
|
||||
"window_width": None, # Saved window width (None = use dynamic sizing)
|
||||
"window_height": None # Saved window height (None = use dynamic sizing)
|
||||
}
|
||||
|
||||
# Load configuration if exists
|
||||
self._load_config()
|
||||
|
||||
|
||||
# Perform version migrations
|
||||
self._migrate_config()
|
||||
|
||||
# If steam_path is not set, detect it
|
||||
if not self.settings["steam_path"]:
|
||||
self.settings["steam_path"] = self._detect_steam_path()
|
||||
|
||||
# Auto-detect and set Proton version on first run
|
||||
if not self.settings.get("proton_path"):
|
||||
# Auto-detect and set Proton version ONLY on first run (config file doesn't exist)
|
||||
# Do NOT overwrite user's saved settings!
|
||||
if not os.path.exists(self.config_file) and not self.settings.get("proton_path"):
|
||||
self._auto_detect_proton()
|
||||
|
||||
# If jackify_data_dir is not set, initialize it to default
|
||||
@@ -86,7 +110,8 @@ class ConfigHandler:
|
||||
libraryfolders_vdf_paths = [
|
||||
os.path.expanduser("~/.steam/steam/config/libraryfolders.vdf"),
|
||||
os.path.expanduser("~/.local/share/Steam/config/libraryfolders.vdf"),
|
||||
os.path.expanduser("~/.steam/root/config/libraryfolders.vdf")
|
||||
os.path.expanduser("~/.steam/root/config/libraryfolders.vdf"),
|
||||
os.path.expanduser("~/.var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf") # Flatpak
|
||||
]
|
||||
|
||||
for vdf_path in libraryfolders_vdf_paths:
|
||||
@@ -100,7 +125,10 @@ class ConfigHandler:
|
||||
return None
|
||||
|
||||
def _load_config(self):
|
||||
"""Load configuration from file"""
|
||||
"""
|
||||
Load configuration from file and update in-memory cache.
|
||||
For legacy compatibility with initialization code.
|
||||
"""
|
||||
try:
|
||||
if os.path.exists(self.config_file):
|
||||
with open(self.config_file, 'r') as f:
|
||||
@@ -113,6 +141,84 @@ class ConfigHandler:
|
||||
self._create_config_dir()
|
||||
except Exception as e:
|
||||
logger.error(f"Error loading configuration: {e}")
|
||||
|
||||
def _migrate_config(self):
|
||||
"""
|
||||
Migrate configuration between versions
|
||||
Handles breaking changes and data format updates
|
||||
"""
|
||||
current_version = self.settings.get("version", "0.0.0")
|
||||
target_version = "0.2.0"
|
||||
|
||||
if current_version == target_version:
|
||||
return
|
||||
|
||||
logger.info(f"Migrating config from {current_version} to {target_version}")
|
||||
|
||||
# Migration: v0.0.x -> v0.2.0
|
||||
# Encryption changed from cryptography (Fernet) to pycryptodome (AES-GCM)
|
||||
# Old encrypted API keys cannot be decrypted, must be re-entered
|
||||
from packaging import version
|
||||
if version.parse(current_version) < version.parse("0.2.0"):
|
||||
# Clear old encrypted credentials
|
||||
if self.settings.get("nexus_api_key"):
|
||||
logger.warning("Clearing saved API key due to encryption format change")
|
||||
logger.warning("Please re-enter your Nexus API key in Settings")
|
||||
self.settings["nexus_api_key"] = None
|
||||
|
||||
# Clear OAuth token file (different encryption format)
|
||||
oauth_token_file = Path(self.config_dir) / "nexus-oauth.json"
|
||||
if oauth_token_file.exists():
|
||||
logger.warning("Clearing saved OAuth token due to encryption format change")
|
||||
logger.warning("Please re-authorize with Nexus Mods")
|
||||
try:
|
||||
oauth_token_file.unlink()
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to remove old OAuth token: {e}")
|
||||
|
||||
# Remove obsolete keys
|
||||
obsolete_keys = [
|
||||
"hoolamike_install_path",
|
||||
"hoolamike_version",
|
||||
"api_key_fallback_enabled",
|
||||
"proton_version", # Display string only, path stored in proton_path
|
||||
"game_proton_version" # Display string only, path stored in game_proton_path
|
||||
]
|
||||
|
||||
removed_count = 0
|
||||
for key in obsolete_keys:
|
||||
if key in self.settings:
|
||||
del self.settings[key]
|
||||
removed_count += 1
|
||||
|
||||
if removed_count > 0:
|
||||
logger.info(f"Removed {removed_count} obsolete config keys")
|
||||
|
||||
# Update version
|
||||
self.settings["version"] = target_version
|
||||
self.save_config()
|
||||
logger.info("Config migration completed")
|
||||
|
||||
def _read_config_from_disk(self):
|
||||
"""
|
||||
Read configuration directly from disk without caching.
|
||||
Returns merged config (defaults + saved values).
|
||||
"""
|
||||
try:
|
||||
config = self.settings.copy() # Start with defaults
|
||||
if os.path.exists(self.config_file):
|
||||
with open(self.config_file, 'r') as f:
|
||||
saved_config = json.load(f)
|
||||
config.update(saved_config)
|
||||
return config
|
||||
except Exception as e:
|
||||
# Don't use logger here - can cause recursion if logger tries to access config
|
||||
print(f"Warning: Error reading configuration from disk: {e}", file=sys.stderr)
|
||||
return self.settings.copy()
|
||||
|
||||
def reload_config(self):
|
||||
"""Reload configuration from disk to pick up external changes"""
|
||||
self._load_config()
|
||||
|
||||
def _create_config_dir(self):
|
||||
"""Create configuration directory if it doesn't exist"""
|
||||
@@ -135,8 +241,12 @@ class ConfigHandler:
|
||||
return False
|
||||
|
||||
def get(self, key, default=None):
|
||||
"""Get a configuration value by key"""
|
||||
return self.settings.get(key, default)
|
||||
"""
|
||||
Get a configuration value by key.
|
||||
Always reads fresh from disk to avoid stale data.
|
||||
"""
|
||||
config = self._read_config_from_disk()
|
||||
return config.get(key, default)
|
||||
|
||||
def set(self, key, value):
|
||||
"""Set a configuration value"""
|
||||
@@ -195,48 +305,192 @@ class ConfigHandler:
|
||||
"""Get the path to protontricks executable"""
|
||||
return self.settings.get("protontricks_path")
|
||||
|
||||
def _get_encryption_key(self) -> bytes:
|
||||
"""
|
||||
Generate encryption key for API key storage using same method as OAuth tokens
|
||||
|
||||
Returns:
|
||||
Fernet-compatible encryption key
|
||||
"""
|
||||
import socket
|
||||
import getpass
|
||||
|
||||
try:
|
||||
hostname = socket.gethostname()
|
||||
username = getpass.getuser()
|
||||
|
||||
# Try to get machine ID
|
||||
machine_id = None
|
||||
try:
|
||||
with open('/etc/machine-id', 'r') as f:
|
||||
machine_id = f.read().strip()
|
||||
except:
|
||||
try:
|
||||
with open('/var/lib/dbus/machine-id', 'r') as f:
|
||||
machine_id = f.read().strip()
|
||||
except:
|
||||
pass
|
||||
|
||||
if machine_id:
|
||||
key_material = f"{hostname}:{username}:{machine_id}:jackify"
|
||||
else:
|
||||
key_material = f"{hostname}:{username}:jackify"
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to get machine info for encryption: {e}")
|
||||
key_material = "jackify:default:key"
|
||||
|
||||
# Generate Fernet-compatible key
|
||||
key_bytes = hashlib.sha256(key_material.encode('utf-8')).digest()
|
||||
return base64.urlsafe_b64encode(key_bytes)
|
||||
|
||||
def _encrypt_api_key(self, api_key: str) -> str:
|
||||
"""
|
||||
Encrypt API key using AES-GCM
|
||||
|
||||
Args:
|
||||
api_key: Plain text API key
|
||||
|
||||
Returns:
|
||||
Encrypted API key string
|
||||
"""
|
||||
try:
|
||||
from Crypto.Cipher import AES
|
||||
from Crypto.Random import get_random_bytes
|
||||
|
||||
# Derive 32-byte AES key
|
||||
key = base64.urlsafe_b64decode(self._get_encryption_key())
|
||||
|
||||
# Generate random nonce
|
||||
nonce = get_random_bytes(12)
|
||||
|
||||
# Encrypt with AES-GCM
|
||||
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
|
||||
ciphertext, tag = cipher.encrypt_and_digest(api_key.encode('utf-8'))
|
||||
|
||||
# Combine and encode
|
||||
combined = nonce + ciphertext + tag
|
||||
return base64.b64encode(combined).decode('utf-8')
|
||||
|
||||
except ImportError:
|
||||
# Fallback to base64 if pycryptodome not available
|
||||
logger.warning("pycryptodome not available, using base64 encoding (less secure)")
|
||||
return base64.b64encode(api_key.encode('utf-8')).decode('utf-8')
|
||||
except Exception as e:
|
||||
logger.error(f"Error encrypting API key: {e}")
|
||||
return ""
|
||||
|
||||
def _decrypt_api_key(self, encrypted_key: str) -> Optional[str]:
|
||||
"""
|
||||
Decrypt API key using AES-GCM
|
||||
|
||||
Args:
|
||||
encrypted_key: Encrypted API key string
|
||||
|
||||
Returns:
|
||||
Decrypted API key or None on failure
|
||||
"""
|
||||
try:
|
||||
from Crypto.Cipher import AES
|
||||
|
||||
# Check if MODE_GCM is available (pycryptodome has it, old pycrypto doesn't)
|
||||
if not hasattr(AES, 'MODE_GCM'):
|
||||
# Fallback to base64 decode if old pycrypto is installed
|
||||
try:
|
||||
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
|
||||
except:
|
||||
return None
|
||||
|
||||
# Derive 32-byte AES key
|
||||
key = base64.urlsafe_b64decode(self._get_encryption_key())
|
||||
|
||||
# Decode and split
|
||||
combined = base64.b64decode(encrypted_key.encode('utf-8'))
|
||||
nonce = combined[:12]
|
||||
tag = combined[-16:]
|
||||
ciphertext = combined[12:-16]
|
||||
|
||||
# Decrypt with AES-GCM
|
||||
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
|
||||
plaintext = cipher.decrypt_and_verify(ciphertext, tag)
|
||||
|
||||
return plaintext.decode('utf-8')
|
||||
|
||||
except ImportError:
|
||||
# Fallback to base64 decode
|
||||
try:
|
||||
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
|
||||
except:
|
||||
return None
|
||||
except AttributeError:
|
||||
# Old pycrypto doesn't have MODE_GCM, fallback to base64
|
||||
try:
|
||||
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
|
||||
except:
|
||||
return None
|
||||
except Exception as e:
|
||||
# Might be old base64-only format, try decoding
|
||||
try:
|
||||
return base64.b64decode(encrypted_key.encode('utf-8')).decode('utf-8')
|
||||
except:
|
||||
logger.error(f"Error decrypting API key: {e}")
|
||||
return None
|
||||
|
||||
def save_api_key(self, api_key):
|
||||
"""
|
||||
Save Nexus API key with base64 encoding
|
||||
|
||||
Save Nexus API key with Fernet encryption
|
||||
|
||||
Args:
|
||||
api_key (str): Plain text API key
|
||||
|
||||
|
||||
Returns:
|
||||
bool: True if saved successfully, False otherwise
|
||||
"""
|
||||
try:
|
||||
if api_key:
|
||||
# Encode the API key using base64
|
||||
encoded_key = base64.b64encode(api_key.encode('utf-8')).decode('utf-8')
|
||||
self.settings["nexus_api_key"] = encoded_key
|
||||
logger.debug("API key saved successfully")
|
||||
# Encrypt the API key using Fernet
|
||||
encrypted_key = self._encrypt_api_key(api_key)
|
||||
if not encrypted_key:
|
||||
logger.error("Failed to encrypt API key")
|
||||
return False
|
||||
|
||||
self.settings["nexus_api_key"] = encrypted_key
|
||||
logger.debug("API key encrypted and saved successfully")
|
||||
else:
|
||||
# Clear the API key if empty
|
||||
self.settings["nexus_api_key"] = None
|
||||
logger.debug("API key cleared")
|
||||
|
||||
return self.save_config()
|
||||
|
||||
result = self.save_config()
|
||||
|
||||
# Set restrictive permissions on config file
|
||||
if result:
|
||||
try:
|
||||
os.chmod(self.config_file, 0o600)
|
||||
except Exception as e:
|
||||
logger.warning(f"Could not set restrictive permissions on config: {e}")
|
||||
|
||||
return result
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error saving API key: {e}")
|
||||
return False
|
||||
|
||||
|
||||
def get_api_key(self):
|
||||
"""
|
||||
Retrieve and decode the saved Nexus API key
|
||||
Always reads fresh from disk to pick up changes from other instances
|
||||
|
||||
Retrieve and decrypt the saved Nexus API key.
|
||||
Always reads fresh from disk.
|
||||
|
||||
Returns:
|
||||
str: Decoded API key or None if not saved
|
||||
str: Decrypted API key or None if not saved
|
||||
"""
|
||||
try:
|
||||
# Reload config from disk to pick up changes from Settings dialog
|
||||
self._load_config()
|
||||
encoded_key = self.settings.get("nexus_api_key")
|
||||
if encoded_key:
|
||||
# Decode the base64 encoded key
|
||||
decoded_key = base64.b64decode(encoded_key.encode('utf-8')).decode('utf-8')
|
||||
return decoded_key
|
||||
config = self._read_config_from_disk()
|
||||
encrypted_key = config.get("nexus_api_key")
|
||||
if encrypted_key:
|
||||
# Decrypt the API key
|
||||
decrypted_key = self._decrypt_api_key(encrypted_key)
|
||||
return decrypted_key
|
||||
return None
|
||||
except Exception as e:
|
||||
logger.error(f"Error retrieving API key: {e}")
|
||||
@@ -244,15 +498,14 @@ class ConfigHandler:
|
||||
|
||||
def has_saved_api_key(self):
|
||||
"""
|
||||
Check if an API key is saved in configuration
|
||||
Always reads fresh from disk to pick up changes from other instances
|
||||
|
||||
Check if an API key is saved in configuration.
|
||||
Always reads fresh from disk.
|
||||
|
||||
Returns:
|
||||
bool: True if API key exists, False otherwise
|
||||
"""
|
||||
# Reload config from disk to pick up changes from Settings dialog
|
||||
self._load_config()
|
||||
return self.settings.get("nexus_api_key") is not None
|
||||
config = self._read_config_from_disk()
|
||||
return config.get("nexus_api_key") is not None
|
||||
|
||||
def clear_api_key(self):
|
||||
"""
|
||||
@@ -500,16 +753,15 @@ class ConfigHandler:
|
||||
|
||||
def get_proton_path(self):
|
||||
"""
|
||||
Retrieve the saved Install Proton path from configuration (for jackify-engine)
|
||||
Always reads fresh from disk to pick up changes from Settings dialog
|
||||
Retrieve the saved Install Proton path from configuration (for jackify-engine).
|
||||
Always reads fresh from disk.
|
||||
|
||||
Returns:
|
||||
str: Saved Install Proton path or 'auto' if not saved
|
||||
"""
|
||||
try:
|
||||
# Reload config from disk to pick up changes from Settings dialog
|
||||
self._load_config()
|
||||
proton_path = self.settings.get("proton_path", "auto")
|
||||
config = self._read_config_from_disk()
|
||||
proton_path = config.get("proton_path", "auto")
|
||||
logger.debug(f"Retrieved fresh install proton_path from config: {proton_path}")
|
||||
return proton_path
|
||||
except Exception as e:
|
||||
@@ -518,21 +770,20 @@ class ConfigHandler:
|
||||
|
||||
def get_game_proton_path(self):
|
||||
"""
|
||||
Retrieve the saved Game Proton path from configuration (for game shortcuts)
|
||||
Falls back to install Proton path if game Proton not set
|
||||
Always reads fresh from disk to pick up changes from Settings dialog
|
||||
Retrieve the saved Game Proton path from configuration (for game shortcuts).
|
||||
Falls back to install Proton path if game Proton not set.
|
||||
Always reads fresh from disk.
|
||||
|
||||
Returns:
|
||||
str: Saved Game Proton path, Install Proton path, or 'auto' if not saved
|
||||
"""
|
||||
try:
|
||||
# Reload config from disk to pick up changes from Settings dialog
|
||||
self._load_config()
|
||||
game_proton_path = self.settings.get("game_proton_path")
|
||||
config = self._read_config_from_disk()
|
||||
game_proton_path = config.get("game_proton_path")
|
||||
|
||||
# If game proton not set or set to same_as_install, use install proton
|
||||
if not game_proton_path or game_proton_path == "same_as_install":
|
||||
game_proton_path = self.settings.get("proton_path", "auto")
|
||||
game_proton_path = config.get("proton_path", "auto")
|
||||
|
||||
logger.debug(f"Retrieved fresh game proton_path from config: {game_proton_path}")
|
||||
return game_proton_path
|
||||
@@ -542,16 +793,15 @@ class ConfigHandler:
|
||||
|
||||
def get_proton_version(self):
|
||||
"""
|
||||
Retrieve the saved Proton version from configuration
|
||||
Always reads fresh from disk to pick up changes from Settings dialog
|
||||
Retrieve the saved Proton version from configuration.
|
||||
Always reads fresh from disk.
|
||||
|
||||
Returns:
|
||||
str: Saved Proton version or 'auto' if not saved
|
||||
"""
|
||||
try:
|
||||
# Reload config from disk to pick up changes from Settings dialog
|
||||
self._load_config()
|
||||
proton_version = self.settings.get("proton_version", "auto")
|
||||
config = self._read_config_from_disk()
|
||||
proton_version = config.get("proton_version", "auto")
|
||||
logger.debug(f"Retrieved fresh proton_version from config: {proton_version}")
|
||||
return proton_version
|
||||
except Exception as e:
|
||||
|
||||
@@ -784,7 +784,8 @@ class FileSystemHandler:
|
||||
possible_vdf_paths = [
|
||||
Path.home() / ".steam/steam/config/libraryfolders.vdf",
|
||||
Path.home() / ".local/share/Steam/config/libraryfolders.vdf",
|
||||
Path.home() / ".steam/root/config/libraryfolders.vdf"
|
||||
Path.home() / ".steam/root/config/libraryfolders.vdf",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf" # Flatpak
|
||||
]
|
||||
|
||||
libraryfolders_vdf_path: Optional[Path] = None
|
||||
|
||||
@@ -1,994 +0,0 @@
|
||||
import logging
|
||||
import os
|
||||
import subprocess
|
||||
import zipfile
|
||||
import tarfile
|
||||
from pathlib import Path
|
||||
import yaml # Assuming PyYAML is installed
|
||||
from typing import Dict, Optional, List
|
||||
import requests
|
||||
|
||||
# Import necessary handlers from the current Jackify structure
|
||||
from .path_handler import PathHandler
|
||||
from .vdf_handler import VDFHandler # Keeping just in case
|
||||
from .filesystem_handler import FileSystemHandler
|
||||
from .config_handler import ConfigHandler
|
||||
# Import color constants needed for print statements in this module
|
||||
from .ui_colors import COLOR_ERROR, COLOR_SUCCESS, COLOR_WARNING, COLOR_RESET, COLOR_INFO, COLOR_PROMPT, COLOR_SELECTION
|
||||
# Standard logging (no file handler) - LoggingHandler import removed
|
||||
from .status_utils import show_status, clear_status
|
||||
from .subprocess_utils import get_clean_subprocess_env
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Define default Hoolamike AppIDs for relevant games
|
||||
TARGET_GAME_APPIDS = {
|
||||
'Fallout 3': '22370', # GOTY Edition
|
||||
'Fallout New Vegas': '22380', # Base game
|
||||
'Skyrim Special Edition': '489830',
|
||||
'Oblivion': '22330', # GOTY Edition
|
||||
'Fallout 4': '377160'
|
||||
}
|
||||
|
||||
# Define the expected name of the native Hoolamike executable
|
||||
HOOLAMIKE_EXECUTABLE_NAME = "hoolamike" # Assuming this is the binary name
|
||||
# Keep consistent with logs directory - use ~/Jackify/ for user-visible managed components
|
||||
JACKIFY_BASE_DIR = Path.home() / "Jackify"
|
||||
# Use Jackify base directory for ALL Hoolamike-related files to centralize management
|
||||
DEFAULT_HOOLAMIKE_APP_INSTALL_DIR = JACKIFY_BASE_DIR / "Hoolamike"
|
||||
HOOLAMIKE_CONFIG_DIR = DEFAULT_HOOLAMIKE_APP_INSTALL_DIR
|
||||
HOOLAMIKE_CONFIG_FILENAME = "hoolamike.yaml"
|
||||
# Default dirs for other components
|
||||
DEFAULT_HOOLAMIKE_DOWNLOADS_DIR = JACKIFY_BASE_DIR / "Mod_Downloads"
|
||||
DEFAULT_MODLIST_INSTALL_BASE_DIR = Path.home() / "ModdedGames"
|
||||
|
||||
class HoolamikeHandler:
|
||||
"""Handles discovery, configuration, and execution of Hoolamike tasks.
|
||||
Assumes Hoolamike is a native Linux CLI application.
|
||||
"""
|
||||
|
||||
def __init__(self, steamdeck: bool, verbose: bool, filesystem_handler: FileSystemHandler, config_handler: ConfigHandler, menu_handler=None):
|
||||
"""Initialize the handler and perform initial discovery."""
|
||||
self.steamdeck = steamdeck
|
||||
self.verbose = verbose
|
||||
self.path_handler = PathHandler()
|
||||
self.filesystem_handler = filesystem_handler
|
||||
self.config_handler = config_handler
|
||||
self.menu_handler = menu_handler
|
||||
# Use standard logging (no file handler)
|
||||
self.logger = logging.getLogger(__name__)
|
||||
|
||||
# --- Discovered/Managed State ---
|
||||
self.game_install_paths: Dict[str, Path] = {}
|
||||
# Allow user override for Hoolamike app install path later
|
||||
self.hoolamike_app_install_path: Path = DEFAULT_HOOLAMIKE_APP_INSTALL_DIR
|
||||
self.hoolamike_executable_path: Optional[Path] = None # Path to the binary
|
||||
self.hoolamike_installed: bool = False
|
||||
self.hoolamike_config_path: Path = HOOLAMIKE_CONFIG_DIR / HOOLAMIKE_CONFIG_FILENAME
|
||||
self.hoolamike_config: Optional[Dict] = None
|
||||
|
||||
# Load Hoolamike install path from Jackify config if it exists
|
||||
saved_path_str = self.config_handler.get('hoolamike_install_path')
|
||||
if saved_path_str and Path(saved_path_str).is_dir(): # Basic check if path exists
|
||||
self.hoolamike_app_install_path = Path(saved_path_str)
|
||||
self.logger.info(f"Loaded Hoolamike install path from Jackify config: {self.hoolamike_app_install_path}")
|
||||
|
||||
self._load_hoolamike_config()
|
||||
self._run_discovery()
|
||||
|
||||
def _ensure_hoolamike_dirs_exist(self):
|
||||
"""Ensure base directories for Hoolamike exist."""
|
||||
try:
|
||||
HOOLAMIKE_CONFIG_DIR.mkdir(parents=True, exist_ok=True) # Separate Hoolamike config
|
||||
self.hoolamike_app_install_path.mkdir(parents=True, exist_ok=True) # Install dir (~/Jackify/Hoolamike)
|
||||
# Default downloads dir also needs to exist if we reference it
|
||||
DEFAULT_HOOLAMIKE_DOWNLOADS_DIR.mkdir(parents=True, exist_ok=True)
|
||||
except OSError as e:
|
||||
self.logger.error(f"Error creating Hoolamike directories: {e}", exc_info=True)
|
||||
# Decide how to handle this - maybe raise an exception?
|
||||
|
||||
def _check_hoolamike_installation(self):
|
||||
"""Check if Hoolamike executable exists at the expected location.
|
||||
Prioritizes path stored in config if available.
|
||||
"""
|
||||
potential_exe_path = self.hoolamike_app_install_path / HOOLAMIKE_EXECUTABLE_NAME
|
||||
check_path = None
|
||||
if potential_exe_path.is_file() and os.access(potential_exe_path, os.X_OK):
|
||||
check_path = potential_exe_path
|
||||
self.logger.info(f"Found Hoolamike at current path: {check_path}")
|
||||
else:
|
||||
self.logger.info(f"Hoolamike executable ({HOOLAMIKE_EXECUTABLE_NAME}) not found or not executable at current path {self.hoolamike_app_install_path}.")
|
||||
|
||||
# Update state based on whether we found a valid path
|
||||
if check_path:
|
||||
self.hoolamike_installed = True
|
||||
self.hoolamike_executable_path = check_path
|
||||
else:
|
||||
self.hoolamike_installed = False
|
||||
self.hoolamike_executable_path = None
|
||||
|
||||
    def _generate_default_config(self) -> Dict:
        """Generates the default configuration dictionary."""
        self.logger.info("Generating default Hoolamike config structure.")
        # Detection is also run separately after loading config; this pass only seeds the defaults
        detected_paths = self.path_handler.find_game_install_paths(TARGET_GAME_APPIDS)

        config = {
            "downloaders": {
                "downloads_directory": str(DEFAULT_HOOLAMIKE_DOWNLOADS_DIR),
                "nexus": {"api_key": "YOUR_API_KEY_HERE"}
            },
            "installation": {
                "wabbajack_file_path": "",  # Placeholder, set per-run
                "installation_path": ""  # Placeholder, set per-run
            },
            "games": {  # Only include detected games with consistent formatting (no spaces)
                self._format_game_name(game_name): {"root_directory": str(path)}
                for game_name, path in detected_paths.items()
            },
            "fixup": {
                "game_resolution": "1920x1080"
            },
            "extras": {
                "tale_of_two_wastelands": {
                    "path_to_ttw_mpi_file": "",  # Placeholder
                    "variables": {
                        "DESTINATION": ""  # Placeholder
                    }
                }
            }
        }
        # A "no games detected" note cannot be carried in this dict (plain YAML data has no
        # comments); save_hoolamike_config() writes that comment into the file instead.
        return config
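    # Illustrative sketch (not from the original source; paths are hypothetical): with only
    # Skyrim Special Edition detected, save_hoolamike_config() would dump the dictionary
    # above roughly as:
    #
    #   downloaders:
    #     downloads_directory: /home/user/Jackify/Hoolamike/Downloads   # actual value comes from DEFAULT_HOOLAMIKE_DOWNLOADS_DIR
    #     nexus:
    #       api_key: YOUR_API_KEY_HERE
    #   installation:
    #     wabbajack_file_path: ''
    #     installation_path: ''
    #   games:
    #     SkyrimSpecialEdition:
    #       root_directory: /home/user/.local/share/Steam/steamapps/common/Skyrim Special Edition
    #   fixup:
    #     game_resolution: 1920x1080
    #   extras:
    #     tale_of_two_wastelands:
    #       path_to_ttw_mpi_file: ''
    #       variables:
    #         DESTINATION: ''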
    def _format_game_name(self, game_name: str) -> str:
        """Formats game name for Hoolamike configuration (removes spaces).

        Hoolamike expects game names without spaces like: Fallout3, FalloutNewVegas, SkyrimSpecialEdition
        """
        # Handle specific game name formats that Hoolamike expects
        game_name_map = {
            "Fallout 3": "Fallout3",
            "Fallout New Vegas": "FalloutNewVegas",
            "Skyrim Special Edition": "SkyrimSpecialEdition",
            "Fallout 4": "Fallout4",
            "Oblivion": "Oblivion"  # No change needed
        }

        # Use predefined mapping if available
        if game_name in game_name_map:
            return game_name_map[game_name]

        # Otherwise, just remove spaces as fallback
        return game_name.replace(" ", "")
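    # Illustrative note (not part of the original flow): a title missing from the map above
    # simply has its spaces stripped by the fallback branch, e.g. a hypothetical
    # "Fallout 4 VR" would be formatted as "Fallout4VR".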
    def _load_hoolamike_config(self):
        """Load hoolamike.yaml if it exists, or generate a default one.

        Returns True if a usable configuration was loaded or generated, False if an
        existing file could not be parsed.
        """
        self._ensure_hoolamike_dirs_exist()  # Ensure parent dir exists

        if self.hoolamike_config_path.is_file():
            self.logger.info(f"Found existing hoolamike.yaml at {self.hoolamike_config_path}. Loading...")
            try:
                with open(self.hoolamike_config_path, 'r', encoding='utf-8') as f:
                    self.hoolamike_config = yaml.safe_load(f)
                if not isinstance(self.hoolamike_config, dict):
                    self.logger.warning("Failed to parse hoolamike.yaml as a dictionary. Generating default.")
                    self.hoolamike_config = self._generate_default_config()
                    self.save_hoolamike_config()  # Save the newly generated default
                else:
                    self.logger.info("Successfully loaded hoolamike.yaml configuration.")
                    # Game path merging is handled in _run_discovery now
            except yaml.YAMLError as e:
                self.logger.error(f"Error parsing hoolamike.yaml: {e}. The file may be corrupted.")
                # Don't automatically overwrite - let the user decide
                self.hoolamike_config = None
                return False
            except Exception as e:
                self.logger.error(f"Error reading hoolamike.yaml: {e}.", exc_info=True)
                # Don't automatically overwrite - let the user decide
                self.hoolamike_config = None
                return False
        else:
            self.logger.info(f"hoolamike.yaml not found at {self.hoolamike_config_path}. Generating default configuration.")
            self.hoolamike_config = self._generate_default_config()
            self.save_hoolamike_config()

        return True
    def save_hoolamike_config(self):
        """Saves the current configuration dictionary to hoolamike.yaml."""
        if self.hoolamike_config is None:
            self.logger.error("Cannot save config, internal config dictionary is None.")
            return False

        self._ensure_hoolamike_dirs_exist()  # Ensure parent dir exists
        self.logger.info(f"Saving configuration to {self.hoolamike_config_path}")
        try:
            with open(self.hoolamike_config_path, 'w', encoding='utf-8') as f:
                # Add comments conditionally
                f.write("# Configuration file created or updated by Jackify\n")
                if not self.hoolamike_config.get("games"):
                    f.write("# No games were detected by Jackify. Add game paths manually if needed.\n")
                # Dump the actual YAML
                yaml.dump(self.hoolamike_config, f, default_flow_style=False, sort_keys=False)
            self.logger.info("Configuration saved successfully.")
            return True
        except Exception as e:
            self.logger.error(f"Error saving hoolamike.yaml: {e}", exc_info=True)
            return False
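    # Note on the dump settings: sort_keys=False keeps the written YAML in the same order
    # as the dictionary built by _generate_default_config, and the header comments are
    # written by hand because yaml.dump cannot emit YAML comments.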
    def _run_discovery(self):
        """Execute all discovery steps."""
        self.logger.info("Starting Hoolamike feature discovery phase...")

        # Detect game paths and update internal state + config
        self._detect_and_update_game_paths()

        self.logger.info("Hoolamike discovery phase complete.")
    def _detect_and_update_game_paths(self):
        """Detect game install paths and update state and config."""
        self.logger.info("Detecting game install paths...")
        # Always run detection
        detected_paths = self.path_handler.find_game_install_paths(TARGET_GAME_APPIDS)
        self.game_install_paths = detected_paths  # Update internal state
        self.logger.info(f"Detected game paths: {detected_paths}")

        # Update the loaded config if it exists
        if self.hoolamike_config is not None:
            self.logger.debug("Updating loaded hoolamike.yaml with detected game paths.")
            if "games" not in self.hoolamike_config or not isinstance(self.hoolamike_config.get("games"), dict):
                self.hoolamike_config["games"] = {}  # Ensure games section exists

            # Define a unified format for game names in config - no spaces
            # Clear existing entries first to avoid duplicates
            self.hoolamike_config["games"] = {}

            # Add detected paths with proper formatting - no spaces
            for game_name, detected_path in detected_paths.items():
                formatted_name = self._format_game_name(game_name)
                self.hoolamike_config["games"][formatted_name] = {"root_directory": str(detected_path)}

            self.logger.info(f"Updated config with {len(detected_paths)} game paths using correct naming format (no spaces)")
        else:
            self.logger.warning("Cannot update game paths in config because config is not loaded.")

    # --- Methods for Hoolamike Tasks (To be implemented later) ---
    # TODO: Update these methods to accept necessary parameters and update/save config
def install_update_hoolamike(self, context=None) -> bool:
|
||||
"""Install or update Hoolamike application.
|
||||
|
||||
Returns:
|
||||
bool: True if installation/update was successful or process was properly cancelled,
|
||||
False if a critical error occurred.
|
||||
"""
|
||||
self.logger.info("Starting Hoolamike Installation/Update...")
|
||||
print("\nStarting Hoolamike Installation/Update...")
|
||||
|
||||
# 1. Prompt user to install/reinstall/update
|
||||
try:
|
||||
# Check if Hoolamike is already installed at the expected path
|
||||
self._check_hoolamike_installation()
|
||||
if self.hoolamike_installed:
|
||||
self.logger.info(f"Hoolamike appears to be installed at: {self.hoolamike_executable_path}")
|
||||
print(f"{COLOR_INFO}Hoolamike is already installed at:{COLOR_RESET}")
|
||||
print(f" {self.hoolamike_executable_path}")
|
||||
# Use a menu-style prompt for reinstall/update
|
||||
print(f"\n{COLOR_PROMPT}Choose an action for Hoolamike:{COLOR_RESET}")
|
||||
print(f" 1. Reinstall/Update Hoolamike")
|
||||
print(f" 2. Keep existing installation (return to menu)")
|
||||
while True:
|
||||
choice = input(f"Select an option [1-2]: ").strip()
|
||||
if choice == '1':
|
||||
self.logger.info("User chose to reinstall/update Hoolamike.")
|
||||
break
|
||||
elif choice == '2' or choice.lower() == 'q':
|
||||
self.logger.info("User chose to keep existing Hoolamike installation.")
|
||||
print("Skipping Hoolamike installation/update.")
|
||||
return True
|
||||
else:
|
||||
print(f"{COLOR_WARNING}Invalid choice. Please enter 1 or 2.{COLOR_RESET}")
|
||||
# 2. Get installation directory from user (allow override)
|
||||
self.logger.info(f"Default install path: {self.hoolamike_app_install_path}")
|
||||
print("\nHoolamike Installation Directory:")
|
||||
print(f"Default: {self.hoolamike_app_install_path}")
|
||||
install_dir = self.menu_handler.get_directory_path(
|
||||
prompt_message=f"Specify where to install Hoolamike (or press Enter for default)",
|
||||
default_path=self.hoolamike_app_install_path,
|
||||
create_if_missing=True,
|
||||
no_header=True
|
||||
)
|
||||
if install_dir is None:
|
||||
self.logger.warning("User cancelled Hoolamike installation path selection.")
|
||||
print("Installation cancelled.")
|
||||
return True
|
||||
# Check if hoolamike already exists at this specific path
|
||||
potential_existing_exe = install_dir / HOOLAMIKE_EXECUTABLE_NAME
|
||||
if potential_existing_exe.is_file() and os.access(potential_existing_exe, os.X_OK):
|
||||
self.logger.info(f"Hoolamike executable found at the chosen path: {potential_existing_exe}")
|
||||
print(f"{COLOR_INFO}Hoolamike appears to already be installed at:{COLOR_RESET}")
|
||||
print(f" {install_dir}")
|
||||
# Use menu-style prompt for overwrite
|
||||
print(f"{COLOR_PROMPT}Choose an action for the existing installation:{COLOR_RESET}")
|
||||
print(f" 1. Download and overwrite (update)")
|
||||
print(f" 2. Keep existing installation (return to menu)")
|
||||
while True:
|
||||
overwrite_choice = input(f"Select an option [1-2]: ").strip()
|
||||
if overwrite_choice == '1':
|
||||
self.logger.info("User chose to update (overwrite) existing Hoolamike installation.")
|
||||
break
|
||||
elif overwrite_choice == '2' or overwrite_choice.lower() == 'q':
|
||||
self.logger.info("User chose to keep existing Hoolamike installation at chosen path.")
|
||||
print("Update cancelled. Using existing installation for this session.")
|
||||
self.hoolamike_app_install_path = install_dir
|
||||
self.hoolamike_executable_path = potential_existing_exe
|
||||
self.hoolamike_installed = True
|
||||
return True
|
||||
else:
|
||||
print(f"{COLOR_WARNING}Invalid choice. Please enter 1 or 2.{COLOR_RESET}")
|
||||
# Proceed with install/update
|
||||
self.logger.info(f"Proceeding with installation to directory: {install_dir}")
|
||||
self.hoolamike_app_install_path = install_dir
|
||||
# Get latest release info from GitHub
|
||||
release_url = "https://api.github.com/repos/Niedzwiedzw/hoolamike/releases/latest"
|
||||
download_url = None
|
||||
asset_name = None
|
||||
try:
|
||||
self.logger.info(f"Fetching latest release info from {release_url}")
|
||||
show_status("Fetching latest Hoolamike release info...")
|
||||
response = requests.get(release_url, timeout=15, verify=True)
|
||||
response.raise_for_status()
|
||||
release_data = response.json()
|
||||
self.logger.debug(f"GitHub Release Data: {release_data}")
|
||||
linux_tar_asset = None
|
||||
linux_zip_asset = None
|
||||
for asset in release_data.get('assets', []):
|
||||
name = asset.get('name', '').lower()
|
||||
self.logger.debug(f"Checking asset: {name}")
|
||||
is_linux = 'linux' in name
|
||||
is_x64 = 'x86_64' in name or 'amd64' in name
|
||||
is_incompatible_arch = 'arm' in name or 'aarch64' in name or 'darwin' in name
|
||||
if is_linux and is_x64 and not is_incompatible_arch:
|
||||
if name.endswith(('.tar.gz', '.tgz')):
|
||||
linux_tar_asset = asset
|
||||
self.logger.debug(f"Found potential tar asset: {name}")
|
||||
break
|
||||
elif name.endswith('.zip') and not linux_tar_asset:
|
||||
linux_zip_asset = asset
|
||||
self.logger.debug(f"Found potential zip asset: {name}")
|
||||
chosen_asset = linux_tar_asset or linux_zip_asset
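            # Descriptive note: the loop above prefers a linux x86_64 tar.gz asset (it breaks
            # as soon as one is found); a zip asset is only kept as a fallback when no
            # tarball was published in the release.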
|
||||
if not chosen_asset:
|
||||
clear_status()
|
||||
self.logger.error("Could not find a suitable Linux x86_64 download asset (tar.gz/zip) in the latest release.")
|
||||
print(f"{COLOR_ERROR}Error: Could not find a linux x86_64 download asset in the latest Hoolamike release.{COLOR_RESET}")
|
||||
return False
|
||||
download_url = chosen_asset.get('browser_download_url')
|
||||
asset_name = chosen_asset.get('name')
|
||||
if not download_url or not asset_name:
|
||||
clear_status()
|
||||
self.logger.error(f"Chosen asset is missing URL or name: {chosen_asset}")
|
||||
print(f"{COLOR_ERROR}Error: Found asset but could not get download details.{COLOR_RESET}")
|
||||
return False
|
||||
self.logger.info(f"Found asset '{asset_name}' for download: {download_url}")
|
||||
clear_status()
|
||||
except requests.exceptions.RequestException as e:
|
||||
clear_status()
|
||||
self.logger.error(f"Failed to fetch release info from GitHub: {e}")
|
||||
print(f"Error: Failed to contact GitHub to check for Hoolamike updates: {e}")
|
||||
return False
|
||||
except Exception as e:
|
||||
clear_status()
|
||||
self.logger.error(f"Error parsing release info: {e}", exc_info=True)
|
||||
print("Error: Failed to understand release information from GitHub.")
|
||||
return False
|
||||
# Download the asset
|
||||
show_status(f"Downloading {asset_name}...")
|
||||
temp_download_path = self.hoolamike_app_install_path / asset_name
|
||||
if not self.filesystem_handler.download_file(download_url, temp_download_path, overwrite=True, quiet=True):
|
||||
clear_status()
|
||||
self.logger.error(f"Failed to download {asset_name} from {download_url}")
|
||||
print(f"{COLOR_ERROR}Error: Failed to download Hoolamike asset.{COLOR_RESET}")
|
||||
return False
|
||||
clear_status()
|
||||
self.logger.info(f"Downloaded {asset_name} successfully to {temp_download_path}")
|
||||
show_status("Extracting Hoolamike archive...")
|
||||
# Extract the asset
|
||||
try:
|
||||
if asset_name.lower().endswith(('.tar.gz', '.tgz')):
|
||||
self.logger.debug(f"Extracting tar file: {temp_download_path}")
|
||||
with tarfile.open(temp_download_path, 'r:*') as tar:
|
||||
tar.extractall(path=self.hoolamike_app_install_path)
|
||||
self.logger.info("Extracted tar file successfully.")
|
||||
elif asset_name.lower().endswith('.zip'):
|
||||
self.logger.debug(f"Extracting zip file: {temp_download_path}")
|
||||
with zipfile.ZipFile(temp_download_path, 'r') as zip_ref:
|
||||
zip_ref.extractall(self.hoolamike_app_install_path)
|
||||
self.logger.info("Extracted zip file successfully.")
|
||||
else:
|
||||
clear_status()
|
||||
self.logger.error(f"Unknown archive format for asset: {asset_name}")
|
||||
print(f"{COLOR_ERROR}Error: Unknown file type '{asset_name}'. Cannot extract.{COLOR_RESET}")
|
||||
return False
|
||||
clear_status()
|
||||
print("Extraction complete. Setting permissions...")
|
||||
except (tarfile.TarError, zipfile.BadZipFile, EOFError) as e:
|
||||
clear_status()
|
||||
self.logger.error(f"Failed to extract archive {temp_download_path}: {e}", exc_info=True)
|
||||
print(f"{COLOR_ERROR}Error: Failed to extract downloaded file: {e}{COLOR_RESET}")
|
||||
return False
|
||||
except Exception as e:
|
||||
clear_status()
|
||||
self.logger.error(f"An unexpected error occurred during extraction: {e}", exc_info=True)
|
||||
print(f"{COLOR_ERROR}An unexpected error occurred during extraction.{COLOR_RESET}")
|
||||
return False
|
||||
finally:
|
||||
# Clean up downloaded archive
|
||||
if temp_download_path.exists():
|
||||
try:
|
||||
temp_download_path.unlink()
|
||||
self.logger.debug(f"Removed temporary download file: {temp_download_path}")
|
||||
except OSError as e:
|
||||
self.logger.warning(f"Could not remove temporary download file {temp_download_path}: {e}")
|
||||
# Set execute permissions on the binary
|
||||
executable_path = self.hoolamike_app_install_path / HOOLAMIKE_EXECUTABLE_NAME
|
||||
if executable_path.is_file():
|
||||
try:
|
||||
show_status("Setting permissions on Hoolamike executable...")
|
||||
os.chmod(executable_path, 0o755)
|
||||
self.logger.info(f"Set execute permissions (+x) on {executable_path}")
|
||||
clear_status()
|
||||
print("Permissions set successfully.")
|
||||
except OSError as e:
|
||||
clear_status()
|
||||
self.logger.error(f"Failed to set execute permission on {executable_path}: {e}")
|
||||
print(f"{COLOR_ERROR}Error: Could not set execute permission on Hoolamike executable.{COLOR_RESET}")
|
||||
else:
|
||||
clear_status()
|
||||
self.logger.error(f"Hoolamike executable not found after extraction at {executable_path}")
|
||||
print(f"{COLOR_ERROR}Error: Hoolamike executable missing after extraction!{COLOR_RESET}")
|
||||
return False
|
||||
# Update self.hoolamike_installed and self.hoolamike_executable_path state
|
||||
self.logger.info("Refreshing Hoolamike installation status...")
|
||||
self._check_hoolamike_installation()
|
||||
if not self.hoolamike_installed:
|
||||
self.logger.error("Hoolamike check failed after apparent successful install/extract.")
|
||||
print(f"{COLOR_ERROR}Error: Installation completed, but failed final verification check.{COLOR_RESET}")
|
||||
return False
|
||||
# Save install path to Jackify config
|
||||
self.logger.info(f"Saving Hoolamike install path to Jackify config: {self.hoolamike_app_install_path}")
|
||||
self.config_handler.set('hoolamike_install_path', str(self.hoolamike_app_install_path))
|
||||
if not self.config_handler.save_config():
|
||||
self.logger.warning("Failed to save Jackify config file after updating Hoolamike path.")
|
||||
# Non-fatal, but warn user?
|
||||
print(f"{COLOR_WARNING}Warning: Could not save installation path to main Jackify config file.{COLOR_RESET}")
|
||||
print(f"{COLOR_SUCCESS}Hoolamike installation/update successful!{COLOR_RESET}")
|
||||
self.logger.info("Hoolamike install/update process completed successfully.")
|
||||
return True
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error during Hoolamike installation/update: {e}", exc_info=True)
|
||||
print(f"{COLOR_ERROR}Error: An unexpected error occurred during Hoolamike installation/update: {e}{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
def install_modlist(self, wabbajack_path=None, install_path=None, downloads_path=None, premium=False, api_key=None, game_resolution=None, context=None):
|
||||
"""
|
||||
Install a Wabbajack modlist using Hoolamike, following Jackify's Discovery/Configuration/Confirmation pattern.
|
||||
"""
|
||||
self.logger.info("Starting Hoolamike modlist install (Discovery Phase)")
|
||||
self._check_hoolamike_installation()
|
||||
menu = self.menu_handler
|
||||
print(f"\n{'='*60}")
|
||||
print(f"{COLOR_INFO}Hoolamike Modlist Installation{COLOR_RESET}")
|
||||
print(f"{'='*60}\n")
|
||||
|
||||
# --- Discovery Phase ---
|
||||
# 1. Auto-detect games (robust, multi-library)
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
# 2. Prompt for .wabbajack file (custom prompt, only accept .wabbajack, q to exit, with tab-completion)
|
||||
print()
|
||||
while not wabbajack_path:
|
||||
print(f"{COLOR_WARNING}This option requires a Nexus Mods Premium account for automatic downloads.{COLOR_RESET}")
|
||||
print(f"If you don't have a premium account, please use the '{COLOR_SELECTION}Non-Premium Installation{COLOR_RESET}' option from the previous menu instead.\n")
|
||||
print(f"Before continuing, you'll need a .wabbajack file. You can usually find these at:")
|
||||
print(f" 1. {COLOR_INFO}https://build.wabbajack.org/authored_files{COLOR_RESET} - Official Wabbajack modlist repository")
|
||||
print(f" 2. {COLOR_INFO}https://www.nexusmods.com/{COLOR_RESET} - Some modlist authors publish on Nexus Mods")
|
||||
print(f" 3. Various Discord communities for specific modlists\n")
|
||||
print(f"{COLOR_WARNING}NOTE: Download the .wabbajack file first, then continue. Enter 'q' to exit.{COLOR_RESET}\n")
|
||||
# Use menu.get_existing_file_path for tab-completion
|
||||
candidate = menu.get_existing_file_path(
|
||||
prompt_message="Enter the path to your .wabbajack file (or 'q' to cancel):",
|
||||
extension_filter=".wabbajack",
|
||||
no_header=True
|
||||
)
|
||||
if candidate is None:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# If user literally typed 'q', treat as cancel
|
||||
if str(candidate).strip().lower() == 'q':
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
wabbajack_path = candidate
|
||||
# 3. Prompt for install directory
|
||||
print()
|
||||
while True:
|
||||
install_path_result = menu.get_directory_path(
|
||||
prompt_message="Select the directory where the modlist should be installed:",
|
||||
default_path=DEFAULT_MODLIST_INSTALL_BASE_DIR / wabbajack_path.stem,
|
||||
create_if_missing=True,
|
||||
no_header=False
|
||||
)
|
||||
if not install_path_result:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# Handle tuple (path, should_create)
|
||||
if isinstance(install_path_result, tuple):
|
||||
install_path, install_should_create = install_path_result
|
||||
else:
|
||||
install_path, install_should_create = install_path_result, False
|
||||
# Check if directory exists and is not empty
|
||||
if install_path.exists() and any(install_path.iterdir()):
|
||||
print(f"{COLOR_WARNING}Warning: The selected directory '{install_path}' already exists and is not empty. Its contents may be overwritten!{COLOR_RESET}")
|
||||
confirm = input(f"{COLOR_PROMPT}This directory is not empty and may be overwritten. Proceed? (y/N): {COLOR_RESET}").strip().lower()
|
||||
if not confirm.startswith('y'):
|
||||
print(f"{COLOR_INFO}Please select a different directory.\n{COLOR_RESET}")
|
||||
continue
|
||||
break
|
||||
# 4. Prompt for downloads directory
|
||||
print()
|
||||
if not downloads_path:
|
||||
downloads_path_result = menu.get_directory_path(
|
||||
prompt_message="Select the directory for mod downloads:",
|
||||
default_path=DEFAULT_HOOLAMIKE_DOWNLOADS_DIR,
|
||||
create_if_missing=True,
|
||||
no_header=False
|
||||
)
|
||||
if not downloads_path_result:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# Handle tuple (path, should_create)
|
||||
if isinstance(downloads_path_result, tuple):
|
||||
downloads_path, downloads_should_create = downloads_path_result
|
||||
else:
|
||||
downloads_path, downloads_should_create = downloads_path_result, False
|
||||
else:
|
||||
downloads_should_create = False
|
||||
# 5. Nexus API key
|
||||
print()
|
||||
current_api_key = self.hoolamike_config.get('downloaders', {}).get('nexus', {}).get('api_key') if self.hoolamike_config else None
|
||||
if not current_api_key or current_api_key == 'YOUR_API_KEY_HERE':
|
||||
api_key = menu.get_nexus_api_key(current_api_key)
|
||||
if not api_key:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
else:
|
||||
api_key = current_api_key
|
||||
|
||||
# --- Summary & Confirmation ---
|
||||
print(f"\n{'-'*60}")
|
||||
print(f"{COLOR_INFO}Summary of configuration:{COLOR_RESET}")
|
||||
print(f"- Wabbajack file: {wabbajack_path}")
|
||||
print(f"- Install directory: {install_path}")
|
||||
print(f"- Downloads directory: {downloads_path}")
|
||||
print(f"- Nexus API key: [{'Set' if api_key else 'Not Set'}]")
|
||||
print("- Games:")
|
||||
for game in ["Fallout 3", "Fallout New Vegas", "Skyrim Special Edition", "Oblivion", "Fallout 4"]:
|
||||
found = detected_games.get(game)
|
||||
print(f" {game}: {found if found else 'Not Found'}")
|
||||
print(f"{'-'*60}")
|
||||
print(f"{COLOR_WARNING}Proceed with these settings and start Hoolamike install? (Warning: This can take MANY HOURS){COLOR_RESET}")
|
||||
confirm = input(f"{COLOR_PROMPT}[Y/n]: {COLOR_RESET}").strip().lower()
|
||||
if confirm and not confirm.startswith('y'):
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# --- Actually create directories if needed ---
|
||||
if install_should_create and not install_path.exists():
|
||||
try:
|
||||
install_path.mkdir(parents=True, exist_ok=True)
|
||||
print(f"{COLOR_SUCCESS}Install directory created: {install_path}{COLOR_RESET}")
|
||||
except Exception as e:
|
||||
print(f"{COLOR_ERROR}Failed to create install directory: {e}{COLOR_RESET}")
|
||||
return False
|
||||
if downloads_should_create and not downloads_path.exists():
|
||||
try:
|
||||
downloads_path.mkdir(parents=True, exist_ok=True)
|
||||
print(f"{COLOR_SUCCESS}Downloads directory created: {downloads_path}{COLOR_RESET}")
|
||||
except Exception as e:
|
||||
print(f"{COLOR_ERROR}Failed to create downloads directory: {e}{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
# --- Configuration Phase ---
|
||||
# Prepare config dict
|
||||
config = {
|
||||
"downloaders": {
|
||||
"downloads_directory": str(downloads_path),
|
||||
"nexus": {"api_key": api_key}
|
||||
},
|
||||
"installation": {
|
||||
"wabbajack_file_path": str(wabbajack_path),
|
||||
"installation_path": str(install_path)
|
||||
},
|
||||
"games": {
|
||||
self._format_game_name(game): {"root_directory": str(path)}
|
||||
for game, path in detected_games.items()
|
||||
},
|
||||
"fixup": {
|
||||
"game_resolution": "1920x1080"
|
||||
},
|
||||
# Resolution intentionally omitted
|
||||
# "extras": {},
|
||||
# No 'jackify_managed' key here
|
||||
}
|
||||
self.hoolamike_config = config
|
||||
if not self.save_hoolamike_config():
|
||||
print(f"{COLOR_ERROR}Failed to save hoolamike.yaml. Aborting.{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
# --- Run Hoolamike ---
|
||||
print(f"\n{COLOR_INFO}Starting Hoolamike...{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}Streaming output below. Press Ctrl+C to cancel and return to Jackify menu.{COLOR_RESET}\n")
|
||||
# Defensive: Ensure executable path is set and valid
|
||||
if not self.hoolamike_executable_path or not Path(self.hoolamike_executable_path).is_file():
|
||||
print(f"{COLOR_ERROR}Error: Hoolamike executable not found or not set. Please (re)install Hoolamike from the menu before continuing.{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return False
|
||||
try:
|
||||
cmd = [str(self.hoolamike_executable_path), "install"]
|
||||
ret = subprocess.call(cmd, cwd=str(self.hoolamike_app_install_path), env=get_clean_subprocess_env())
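            # Descriptive note: subprocess.call inherits Jackify's stdout/stderr, so
            # hoolamike's output streams straight to the terminal; the integer return
            # code alone decides whether the install is reported as a success.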
|
||||
if ret == 0:
|
||||
print(f"\n{COLOR_SUCCESS}Hoolamike completed successfully!{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return True
|
||||
else:
|
||||
print(f"\n{COLOR_ERROR}Hoolamike process failed with exit code {ret}.{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return False
|
||||
except KeyboardInterrupt:
|
||||
print(f"\n{COLOR_WARNING}Hoolamike install interrupted by user. Returning to menu.{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return False
|
||||
except Exception as e:
|
||||
print(f"\n{COLOR_ERROR}Error running Hoolamike: {e}{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
def install_ttw(self, ttw_mpi_path=None, ttw_output_path=None, context=None):
|
||||
"""Install Tale of Two Wastelands (TTW) using Hoolamike.
|
||||
|
||||
Args:
|
||||
ttw_mpi_path: Path to the TTW installer .mpi file
|
||||
ttw_output_path: Target installation directory for TTW
|
||||
|
||||
Returns:
|
||||
bool: True if successful, False otherwise
|
||||
"""
|
||||
self.logger.info(f"Starting Tale of Two Wastelands installation via Hoolamike")
|
||||
self._check_hoolamike_installation()
|
||||
menu = self.menu_handler
|
||||
print(f"\n{'='*60}")
|
||||
print(f"{COLOR_INFO}Hoolamike: Tale of Two Wastelands Installation{COLOR_RESET}")
|
||||
print(f"{'='*60}\n")
|
||||
print(f"This feature will install Tale of Two Wastelands (TTW) using Hoolamike.")
|
||||
print(f"Requirements:")
|
||||
print(f" • Fallout 3 and Fallout New Vegas must be installed and detected.")
|
||||
print(f" • You must provide the path to your TTW .mpi installer file.")
|
||||
print(f" • You must select an output directory for the TTW install.\n")
|
||||
|
||||
# Ensure config is loaded
|
||||
if self.hoolamike_config is None:
|
||||
loaded = self._load_hoolamike_config()
|
||||
if not loaded or self.hoolamike_config is None:
|
||||
self.logger.error("Failed to load or generate hoolamike.yaml configuration.")
|
||||
print(f"{COLOR_ERROR}Error: Could not load or generate Hoolamike configuration. Aborting TTW install.{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
# Verify required games are in configuration
|
||||
required_games = ['Fallout 3', 'Fallout New Vegas']
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
missing_games = [game for game in required_games if game not in detected_games]
|
||||
if missing_games:
|
||||
self.logger.error(f"Missing required games for TTW installation: {', '.join(missing_games)}")
|
||||
print(f"{COLOR_ERROR}Error: The following required games were not found: {', '.join(missing_games)}{COLOR_RESET}")
|
||||
print("TTW requires both Fallout 3 and Fallout New Vegas to be installed.")
|
||||
return False
|
||||
|
||||
# Prompt for TTW .mpi file
|
||||
print(f"{COLOR_INFO}Please provide the path to your TTW .mpi installer file.{COLOR_RESET}")
|
||||
print(f"You can download this from: {COLOR_INFO}https://mod.pub/ttw/133/files{COLOR_RESET}")
|
||||
print(f"(Extract the .mpi file from the downloaded archive.)\n")
|
||||
while not ttw_mpi_path:
|
||||
candidate = menu.get_existing_file_path(
|
||||
prompt_message="Enter the path to your TTW .mpi file (or 'q' to cancel):",
|
||||
extension_filter=".mpi",
|
||||
no_header=True
|
||||
)
|
||||
if candidate is None:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
if str(candidate).strip().lower() == 'q':
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
ttw_mpi_path = candidate
|
||||
|
||||
# Prompt for output directory
|
||||
print(f"\n{COLOR_INFO}Please select the output directory where TTW will be installed.{COLOR_RESET}")
|
||||
print(f"(This should be an empty or new directory.)\n")
|
||||
while not ttw_output_path:
|
||||
ttw_output_path = menu.get_directory_path(
|
||||
prompt_message="Select the TTW output directory:",
|
||||
default_path=self.hoolamike_app_install_path / "TTW_Output",
|
||||
create_if_missing=True,
|
||||
no_header=False
|
||||
)
|
||||
if not ttw_output_path:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
if ttw_output_path.exists() and any(ttw_output_path.iterdir()):
|
||||
print(f"{COLOR_WARNING}Warning: The selected directory '{ttw_output_path}' already exists and is not empty. Its contents may be overwritten!{COLOR_RESET}")
|
||||
confirm = input(f"{COLOR_PROMPT}This directory is not empty and may be overwritten. Proceed? (y/N): {COLOR_RESET}").strip().lower()
|
||||
if not confirm.startswith('y'):
|
||||
print(f"{COLOR_INFO}Please select a different directory.\n{COLOR_RESET}")
|
||||
ttw_output_path = None
|
||||
continue
|
||||
|
||||
# --- Summary & Confirmation ---
|
||||
print(f"\n{'-'*60}")
|
||||
print(f"{COLOR_INFO}Summary of configuration:{COLOR_RESET}")
|
||||
print(f"- TTW .mpi file: {ttw_mpi_path}")
|
||||
print(f"- Output directory: {ttw_output_path}")
|
||||
print("- Games:")
|
||||
for game in required_games:
|
||||
found = detected_games.get(game)
|
||||
print(f" {game}: {found if found else 'Not Found'}")
|
||||
print(f"{'-'*60}")
|
||||
print(f"{COLOR_WARNING}Proceed with these settings and start TTW installation? (This can take MANY HOURS){COLOR_RESET}")
|
||||
confirm = input(f"{COLOR_PROMPT}[Y/n]: {COLOR_RESET}").strip().lower()
|
||||
if confirm and not confirm.startswith('y'):
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
# --- Always re-detect games before updating config ---
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
if not detected_games:
|
||||
print(f"{COLOR_ERROR}No supported games were detected on your system. TTW requires Fallout 3 and Fallout New Vegas to be installed.{COLOR_RESET}")
|
||||
return False
|
||||
# Update the games section with correct keys
|
||||
if self.hoolamike_config is None:
|
||||
self.hoolamike_config = {}
|
||||
self.hoolamike_config['games'] = {
|
||||
self._format_game_name(game): {"root_directory": str(path)}
|
||||
for game, path in detected_games.items()
|
||||
}
|
||||
|
||||
# Update TTW configuration
|
||||
self._update_hoolamike_config_for_ttw(ttw_mpi_path, ttw_output_path)
|
||||
if not self.save_hoolamike_config():
|
||||
self.logger.error("Failed to save hoolamike.yaml configuration.")
|
||||
print(f"{COLOR_ERROR}Error: Failed to save Hoolamike configuration.{COLOR_RESET}")
|
||||
print("Attempting to continue anyway...")
|
||||
|
||||
# Construct command to execute
|
||||
cmd = [
|
||||
str(self.hoolamike_executable_path),
|
||||
"tale-of-two-wastelands"
|
||||
]
|
||||
self.logger.info(f"Executing Hoolamike command: {' '.join(cmd)}")
|
||||
print(f"\n{COLOR_INFO}Executing Hoolamike for TTW Installation...{COLOR_RESET}")
|
||||
print(f"Command: {' '.join(cmd)}")
|
||||
print(f"{COLOR_INFO}Streaming output below. Press Ctrl+C to cancel and return to Jackify menu.{COLOR_RESET}\n")
|
||||
try:
|
||||
ret = subprocess.call(cmd, cwd=str(self.hoolamike_app_install_path), env=get_clean_subprocess_env())
|
||||
if ret == 0:
|
||||
self.logger.info("TTW installation completed successfully.")
|
||||
print(f"\n{COLOR_SUCCESS}TTW installation completed successfully!{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return True
|
||||
else:
|
||||
self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
|
||||
print(f"\n{COLOR_ERROR}Error: TTW installation failed with exit code {ret}.{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return False
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error executing Hoolamike TTW installation: {e}", exc_info=True)
|
||||
print(f"\n{COLOR_ERROR}Error executing Hoolamike TTW installation: {e}{COLOR_RESET}")
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
    def _update_hoolamike_config_for_ttw(self, ttw_mpi_path: Path, ttw_output_path: Path):
        """Update the Hoolamike configuration with settings for TTW installation."""
        # Ensure extras and TTW sections exist
        if "extras" not in self.hoolamike_config:
            self.hoolamike_config["extras"] = {}
        if "tale_of_two_wastelands" not in self.hoolamike_config["extras"]:
            self.hoolamike_config["extras"]["tale_of_two_wastelands"] = {
                "variables": {}
            }

        # Update TTW configuration
        ttw_config = self.hoolamike_config["extras"]["tale_of_two_wastelands"]
        ttw_config["path_to_ttw_mpi_file"] = str(ttw_mpi_path)

        # Ensure variables section exists
        if "variables" not in ttw_config:
            ttw_config["variables"] = {}

        # Set destination variable
        ttw_config["variables"]["DESTINATION"] = str(ttw_output_path)

        # Set USERPROFILE to a Jackify-managed directory for TTW
        ttw_config["variables"]["USERPROFILE"] = str(self.hoolamike_app_install_path / "USERPROFILE")

        # Make sure game paths are set correctly, using Hoolamike's no-space key format
        for game in ['Fallout 3', 'Fallout New Vegas']:
            if game in self.game_install_paths:
                game_key = self._format_game_name(game)
                if "games" not in self.hoolamike_config:
                    self.hoolamike_config["games"] = {}
                if game_key not in self.hoolamike_config["games"]:
                    self.hoolamike_config["games"][game_key] = {}
                self.hoolamike_config["games"][game_key]["root_directory"] = str(self.game_install_paths[game])

        self.logger.info("Updated Hoolamike configuration with TTW settings.")
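    # Illustrative sketch (paths are hypothetical): after the update above, hoolamike.yaml
    # carries a TTW section roughly like:
    #
    #   extras:
    #     tale_of_two_wastelands:
    #       path_to_ttw_mpi_file: /home/user/Downloads/TTW_Installer.mpi
    #       variables:
    #         DESTINATION: /home/user/Jackify/Hoolamike/TTW_Output
    #         USERPROFILE: /home/user/Jackify/Hoolamike/USERPROFILE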
    def reset_config(self):
        """Resets the hoolamike.yaml to default settings, backing up any existing file."""
        if self.hoolamike_config_path.is_file():
            # Create a backup with timestamp
            import datetime
            timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
            backup_path = self.hoolamike_config_path.with_suffix(f".{timestamp}.bak")
            try:
                import shutil
                shutil.copy2(self.hoolamike_config_path, backup_path)
                self.logger.info(f"Created backup of existing config at {backup_path}")
                print(f"{COLOR_INFO}Created backup of existing config at {backup_path}{COLOR_RESET}")
            except Exception as e:
                self.logger.error(f"Failed to create backup of config: {e}")
                print(f"{COLOR_WARNING}Warning: Failed to create backup of config: {e}{COLOR_RESET}")

        # Generate and save a fresh default config
        self.logger.info("Generating new default configuration")
        self.hoolamike_config = self._generate_default_config()
        if self.save_hoolamike_config():
            self.logger.info("Successfully reset config to defaults")
            print(f"{COLOR_SUCCESS}Successfully reset configuration to defaults.{COLOR_RESET}")
            return True
        else:
            self.logger.error("Failed to save new default config")
            print(f"{COLOR_ERROR}Failed to save new default configuration.{COLOR_RESET}")
            return False
def edit_hoolamike_config(self):
|
||||
"""Opens the hoolamike.yaml file in a chosen editor, with a 0 option to return to menu."""
|
||||
self.logger.info("Task: Edit Hoolamike Config started...")
|
||||
self._check_hoolamike_installation()
|
||||
if not self.hoolamike_installed:
|
||||
self.logger.warning("Cannot edit config - Hoolamike not installed")
|
||||
print(f"\n{COLOR_WARNING}Hoolamike is not installed through Jackify yet.{COLOR_RESET}")
|
||||
print(f"Please use option 1 from the Hoolamike menu to install Hoolamike first.")
|
||||
print(f"This will ensure that Jackify can properly manage the Hoolamike configuration.")
|
||||
return False
|
||||
if self.hoolamike_config is None:
|
||||
self.logger.warning("Config is not loaded properly. Will attempt to fix or create.")
|
||||
print(f"\n{COLOR_WARNING}Configuration file may be corrupted or not accessible.{COLOR_RESET}")
|
||||
print("Options:")
|
||||
print("1. Reset to default configuration (backup will be created)")
|
||||
print("2. Try to edit the file anyway (may be corrupted)")
|
||||
print("0. Cancel and return to menu")
|
||||
choice = input("\nEnter your choice (0-2): ").strip()
|
||||
if choice == "1":
|
||||
if not self.reset_config():
|
||||
self.logger.error("Failed to reset configuration")
|
||||
print(f"{COLOR_ERROR}Failed to reset configuration. See logs for details.{COLOR_RESET}")
|
||||
return
|
||||
elif choice == "2":
|
||||
self.logger.warning("User chose to edit potentially corrupted config")
|
||||
# Continue to editing
|
||||
elif choice == "0":
|
||||
self.logger.info("User cancelled editing corrupted config")
|
||||
print("Edit cancelled.")
|
||||
return
|
||||
else:
|
||||
self.logger.info("User cancelled editing corrupted config")
|
||||
print("Edit cancelled.")
|
||||
return
|
||||
if not self.hoolamike_config_path.exists():
|
||||
self.logger.warning(f"Hoolamike config file does not exist at {self.hoolamike_config_path}. Generating default before editing.")
|
||||
self.hoolamike_config = self._generate_default_config()
|
||||
self.save_hoolamike_config()
|
||||
if not self.hoolamike_config_path.exists():
|
||||
self.logger.error("Failed to create config file for editing.")
|
||||
print("Error: Could not create configuration file.")
|
||||
return
|
||||
available_editors = ["nano", "vim", "vi", "gedit", "kate", "micro"]
|
||||
preferred_editor = os.environ.get("EDITOR")
|
||||
found_editors = {}
|
||||
import shutil
|
||||
for editor_name in available_editors:
|
||||
editor_path = shutil.which(editor_name)
|
||||
if editor_path and editor_path not in found_editors.values():
|
||||
found_editors[editor_name] = editor_path
|
||||
if preferred_editor:
|
||||
preferred_editor_path = shutil.which(preferred_editor)
|
||||
if preferred_editor_path and preferred_editor_path not in found_editors.values():
|
||||
display_name = os.path.basename(preferred_editor) if '/' in preferred_editor else preferred_editor
|
||||
if display_name not in found_editors:
|
||||
found_editors[display_name] = preferred_editor_path
|
||||
if not found_editors:
|
||||
self.logger.error("No suitable text editors found on the system.")
|
||||
print(f"{COLOR_ERROR}Error: No common text editors (nano, vim, gedit, kate, micro) found.{COLOR_RESET}")
|
||||
return
|
||||
sorted_editor_names = sorted(found_editors.keys())
|
||||
print("\nSelect an editor to open the configuration file:")
|
||||
print(f"(System default EDITOR is: {preferred_editor if preferred_editor else 'Not set'})")
|
||||
for i, name in enumerate(sorted_editor_names):
|
||||
print(f" {i + 1}. {name}")
|
||||
print(f" 0. Return to Hoolamike Menu")
|
||||
while True:
|
||||
try:
|
||||
choice = input(f"Enter choice (0-{len(sorted_editor_names)}): ").strip()
|
||||
if choice == "0":
|
||||
print("Edit cancelled.")
|
||||
return
|
||||
choice_index = int(choice) - 1
|
||||
if 0 <= choice_index < len(sorted_editor_names):
|
||||
chosen_name = sorted_editor_names[choice_index]
|
||||
editor_to_use_path = found_editors[chosen_name]
|
||||
break
|
||||
else:
|
||||
print("Invalid choice.")
|
||||
except ValueError:
|
||||
print("Invalid input. Please enter a number.")
|
||||
except KeyboardInterrupt:
|
||||
print("\nEdit cancelled.")
|
||||
return
|
||||
if editor_to_use_path:
|
||||
self.logger.info(f"Launching editor '{editor_to_use_path}' for {self.hoolamike_config_path}")
|
||||
try:
|
||||
process = subprocess.Popen([editor_to_use_path, str(self.hoolamike_config_path)])
|
||||
process.wait()
|
||||
self.logger.info(f"Editor '{editor_to_use_path}' closed. Reloading config...")
|
||||
if not self._load_hoolamike_config():
|
||||
self.logger.error("Failed to load config after editing. It may still be corrupted.")
|
||||
print(f"{COLOR_ERROR}Warning: The configuration file could not be parsed after editing.{COLOR_RESET}")
|
||||
print("You may need to fix it manually or reset it to defaults.")
|
||||
return False
|
||||
else:
|
||||
self.logger.info("Successfully reloaded config after editing.")
|
||||
print(f"{COLOR_SUCCESS}Configuration file successfully updated.{COLOR_RESET}")
|
||||
return True
|
||||
except FileNotFoundError:
|
||||
self.logger.error(f"Editor '{editor_to_use_path}' not found unexpectedly.")
|
||||
print(f"{COLOR_ERROR}Error: Editor command '{editor_to_use_path}' not found.{COLOR_RESET}")
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error launching or waiting for editor: {e}")
|
||||
print(f"{COLOR_ERROR}An error occurred while launching the editor: {e}{COLOR_RESET}")
|
||||
|
||||
# Example usage (for testing, remove later)
if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    print("Running HoolamikeHandler discovery...")
    # __init__ requires filesystem and config handlers; default-constructed instances are
    # assumed to be sufficient for this standalone test run.
    handler = HoolamikeHandler(
        steamdeck=False,
        verbose=True,
        filesystem_handler=FileSystemHandler(),
        config_handler=ConfigHandler(),
    )
    print("\n--- Discovery Results ---")
    print(f"Game Paths: {handler.game_install_paths}")
    print(f"Hoolamike App Install Path: {handler.hoolamike_app_install_path}")
    print(f"Hoolamike Executable: {handler.hoolamike_executable_path}")
    print(f"Hoolamike Installed: {handler.hoolamike_installed}")
    print(f"Hoolamike Config Path: {handler.hoolamike_config_path}")
    config_loaded = isinstance(handler.hoolamike_config, dict)
    print(f"Hoolamike Config Loaded: {config_loaded}")
    if config_loaded:
        print(f"  Downloads Dir: {handler.hoolamike_config.get('downloaders', {}).get('downloads_directory')}")
        print(f"  API Key Set: {'Yes' if handler.hoolamike_config.get('downloaders', {}).get('nexus', {}).get('api_key') != 'YOUR_API_KEY_HERE' else 'No'}")
    print("-------------------------")
    # Test edit config (example)
    # handler.edit_hoolamike_config()
@@ -1196,7 +1196,8 @@ class InstallWabbajackHandler:
         """Displays the final success message and next steps."""
         # Basic log file path (assuming standard location)
         # TODO: Get log file path more reliably if needed
-        log_path = Path.home() / "Jackify" / "logs" / "jackify-cli.log"
+        from jackify.shared.paths import get_jackify_logs_dir
+        log_path = get_jackify_logs_dir() / "jackify-cli.log"
 
         print("\n───────────────────────────────────────────────────────────────────")
         print(f"{COLOR_INFO}Wabbajack Installation Completed Successfully!{COLOR_RESET}")
@@ -21,7 +21,8 @@ class LoggingHandler:
         logger = LoggingHandler().setup_logger('install_wabbajack', 'jackify-install-wabbajack.log')
     """
     def __init__(self):
-        self.log_dir = Path.home() / "Jackify" / "logs"
+        from jackify.shared.paths import get_jackify_logs_dir
+        self.log_dir = get_jackify_logs_dir()
         self.ensure_log_directory()
 
     def ensure_log_directory(self) -> None:
@@ -152,8 +152,10 @@ class ModlistMenuHandler:
         self.path_handler = PathHandler()
         self.vdf_handler = VDFHandler()
 
-        # Determine Steam Deck status (already done by ConfigHandler, use it)
-        self.steamdeck = config_handler.settings.get('steamdeck', False)
+        # Determine Steam Deck status using centralized detection
+        from ..services.platform_detection_service import PlatformDetectionService
+        platform_service = PlatformDetectionService.get_instance()
+        self.steamdeck = platform_service.is_steamdeck
 
         # Create the resolution handler
         self.resolution_handler = ResolutionHandler()
@@ -178,7 +180,13 @@ class ModlistMenuHandler:
             self.logger.error(f"Error initializing ModlistMenuHandler: {e}")
             # Initialize with defaults/empty to prevent errors
             self.filesystem_handler = FileSystemHandler()
-            self.steamdeck = False
+            # Use centralized detection even in fallback
+            try:
+                from ..services.platform_detection_service import PlatformDetectionService
+                platform_service = PlatformDetectionService.get_instance()
+                self.steamdeck = platform_service.is_steamdeck
+            except:
+                self.steamdeck = False # Final fallback
             self.modlist_handler = None
 
     def show_modlist_menu(self):
@@ -563,15 +571,19 @@ class ModlistMenuHandler:
|
||||
self.logger.warning(f"[DEBUG] Could not find AppID for {context['name']} with exe {context['mo2_exe_path']}")
|
||||
set_modlist_result = self.modlist_handler.set_modlist(context)
|
||||
self.logger.debug(f"[DEBUG] set_modlist returned: {set_modlist_result}")
|
||||
|
||||
# Check GUI mode early to avoid input() calls in GUI context
|
||||
import os
|
||||
gui_mode = os.environ.get('JACKIFY_GUI_MODE') == '1'
|
||||
|
||||
if not set_modlist_result:
|
||||
print(f"{COLOR_ERROR}\nError setting up context for configuration.{COLOR_RESET}")
|
||||
self.logger.error(f"set_modlist failed for {context.get('name')}")
|
||||
input(f"\n{COLOR_PROMPT}Press Enter to continue...{COLOR_RESET}")
|
||||
if not gui_mode:
|
||||
input(f"\n{COLOR_PROMPT}Press Enter to continue...{COLOR_RESET}")
|
||||
return False
|
||||
|
||||
# --- Resolution selection logic for GUI mode ---
|
||||
import os
|
||||
gui_mode = os.environ.get('JACKIFY_GUI_MODE') == '1'
|
||||
selected_resolution = context.get('resolution', None)
|
||||
if gui_mode:
|
||||
# If resolution is provided, set it and do not prompt
|
||||
@@ -642,7 +654,12 @@ class ModlistMenuHandler:
|
||||
print("Modlist Install and Configuration complete!")
|
||||
print(f"• You should now be able to Launch '{context.get('name')}' through Steam")
|
||||
print("• Congratulations and enjoy the game!")
|
||||
print("Detailed log available at: ~/Jackify/logs/Configure_New_Modlist_workflow.log")
|
||||
print("")
|
||||
print("NOTE: If you experience ENB issues, consider using GE-Proton 10-14 instead of")
|
||||
print(" Valve's Proton 10 (known ENB compatibility issues in Valve's Proton 10).")
|
||||
print("")
|
||||
from jackify.shared.paths import get_jackify_logs_dir
|
||||
print(f"Detailed log available at: {get_jackify_logs_dir()}/Configure_New_Modlist_workflow.log")
|
||||
# Only wait for input in CLI mode, not GUI mode
|
||||
if not gui_mode:
|
||||
input(f"{COLOR_PROMPT}Press Enter to return to the menu...{COLOR_RESET}")
|
||||
@@ -851,60 +868,6 @@ class MenuHandler:
|
||||
self.logger.debug("_clear_screen: Clearing screen for POSIX by printing 100 newlines.")
|
||||
print("\n" * 100, flush=True)
|
||||
|
||||
def show_hoolamike_menu(self, cli_instance):
|
||||
"""Show the Hoolamike Modlist Management menu"""
|
||||
if not hasattr(cli_instance, 'hoolamike_handler') or cli_instance.hoolamike_handler is None:
|
||||
try:
|
||||
from .hoolamike_handler import HoolamikeHandler
|
||||
cli_instance.hoolamike_handler = HoolamikeHandler(
|
||||
steamdeck=getattr(cli_instance, 'steamdeck', False),
|
||||
verbose=getattr(cli_instance, 'verbose', False),
|
||||
filesystem_handler=getattr(cli_instance, 'filesystem_handler', None),
|
||||
config_handler=getattr(cli_instance, 'config_handler', None),
|
||||
menu_handler=self
|
||||
)
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to initialize Hoolamike features: {e}", exc_info=True)
|
||||
print(f"{COLOR_ERROR}Error: Failed to initialize Hoolamike features. Check logs.{COLOR_RESET}")
|
||||
input("\nPress Enter to return to the main menu...")
|
||||
return # Exit this menu if handler fails
|
||||
|
||||
while True:
|
||||
self._clear_screen()
|
||||
# Banner display handled by frontend
|
||||
# Use print_section_header for consistency if available, otherwise manual with COLOR_SELECTION
|
||||
if hasattr(self, 'print_section_header'): # Check if method exists (it's from ui_utils)
|
||||
print_section_header("Hoolamike Modlist Management")
|
||||
else: # Fallback if not imported or available directly on self
|
||||
print(f"{COLOR_SELECTION}Hoolamike Modlist Management{COLOR_RESET}")
|
||||
print(f"{COLOR_SELECTION}{'-'*30}{COLOR_RESET}")
|
||||
|
||||
print(f"{COLOR_SELECTION}1.{COLOR_RESET} Install or Update Hoolamike App")
|
||||
print(f"{COLOR_SELECTION}2.{COLOR_RESET} Install Modlist (Nexus Premium)")
|
||||
print(f"{COLOR_SELECTION}3.{COLOR_RESET} Install Modlist (Non-Premium) {COLOR_DISABLED}(Not Implemented){COLOR_RESET}")
|
||||
print(f"{COLOR_SELECTION}4.{COLOR_RESET} Install Tale of Two Wastelands (TTW)")
|
||||
print(f"{COLOR_SELECTION}5.{COLOR_RESET} Edit Hoolamike Configuration")
|
||||
print(f"{COLOR_SELECTION}0.{COLOR_RESET} Return to Main Menu")
|
||||
selection = input(f"\n{COLOR_PROMPT}Enter your selection (0-5): {COLOR_RESET}").strip()
|
||||
|
||||
if selection.lower() == 'q': # Allow 'q' to re-display menu
|
||||
continue
|
||||
if selection == "1":
|
||||
cli_instance.hoolamike_handler.install_update_hoolamike()
|
||||
elif selection == "2":
|
||||
cli_instance.hoolamike_handler.install_modlist(premium=True)
|
||||
elif selection == "3":
|
||||
print(f"{COLOR_INFO}Install Modlist (Non-Premium) is not yet implemented.{COLOR_RESET}")
|
||||
input("\nPress Enter to return to the Hoolamike menu...")
|
||||
elif selection == "4":
|
||||
cli_instance.hoolamike_handler.install_ttw()
|
||||
elif selection == "5":
|
||||
cli_instance.hoolamike_handler.edit_hoolamike_config()
|
||||
elif selection == "0":
|
||||
break
|
||||
else:
|
||||
print("Invalid selection. Please try again.")
|
||||
time.sleep(1)
|
||||
|
||||
|
||||
|
||||
|
||||
@@ -109,6 +109,12 @@ class ModlistHandler:
|
||||
self.logger = logging.getLogger(__name__)
|
||||
self.logger.propagate = False
|
||||
self.steamdeck = steamdeck
|
||||
|
||||
# DEBUG: Log ModlistHandler instantiation details for SD card path debugging
|
||||
import traceback
|
||||
caller_info = traceback.extract_stack()[-2] # Get caller info
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] ModlistHandler created: id={id(self)}, steamdeck={steamdeck}")
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] Created from: {caller_info.filename}:{caller_info.lineno} in {caller_info.name}()")
|
||||
self.steam_path: Optional[Path] = None
|
||||
self.verbose = verbose # Store verbose flag
|
||||
self.mo2_path: Optional[Path] = None
|
||||
@@ -321,12 +327,19 @@ class ModlistHandler:
|
||||
|
||||
# Determine if modlist is on SD card (Steam Deck only)
|
||||
# On non-Steam Deck systems, /media mounts should use Z: drive, not D: drive
|
||||
if (str(self.modlist_dir).startswith("/run/media") or str(self.modlist_dir).startswith("/media")) and self.steamdeck:
|
||||
is_on_sdcard_path = str(self.modlist_dir).startswith("/run/media") or str(self.modlist_dir).startswith("/media")
|
||||
|
||||
# Log SD card detection for debugging
|
||||
self.logger.debug(f"SD card detection: modlist_dir={self.modlist_dir}, is_sdcard_path={is_on_sdcard_path}, steamdeck={self.steamdeck}")
|
||||
|
||||
if is_on_sdcard_path and self.steamdeck:
|
||||
self.modlist_sdcard = True
|
||||
self.logger.info("Modlist appears to be on an SD card (Steam Deck).")
|
||||
self.logger.debug(f"Set modlist_sdcard=True")
|
||||
else:
|
||||
self.modlist_sdcard = False
|
||||
if (str(self.modlist_dir).startswith("/run/media") or str(self.modlist_dir).startswith("/media")) and not self.steamdeck:
|
||||
self.logger.debug(f"Set modlist_sdcard=False (is_on_sdcard_path={is_on_sdcard_path}, steamdeck={self.steamdeck})")
|
||||
if is_on_sdcard_path and not self.steamdeck:
|
||||
self.logger.info("Modlist on /media mount detected on non-Steam Deck system - using Z: drive mapping.")
|
||||
|
||||
# Find and set compatdata path now that we have appid
|
||||
@@ -558,15 +571,19 @@ class ModlistHandler:
|
||||
status_callback (callable, optional): A function to call with status updates during configuration.
|
||||
manual_steps_completed (bool): If True, skip the manual steps prompt (used for new modlist flow).
|
||||
"""
|
||||
# Store status_callback for Configuration Summary
|
||||
self._current_status_callback = status_callback
|
||||
|
||||
self.logger.info("Executing configuration steps...")
|
||||
|
||||
# Ensure required context is set
|
||||
if not all([self.modlist_dir, self.appid, self.game_var, self.steamdeck is not None]):
|
||||
self.logger.error("Cannot execute configuration steps: Missing required context (modlist_dir, appid, game_var, steamdeck status).")
|
||||
print("Error: Missing required information to start configuration.")
|
||||
try:
|
||||
# Store status_callback for Configuration Summary
|
||||
self._current_status_callback = status_callback
|
||||
|
||||
self.logger.info("Executing configuration steps...")
|
||||
|
||||
# Ensure required context is set
|
||||
if not all([self.modlist_dir, self.appid, self.game_var, self.steamdeck is not None]):
|
||||
self.logger.error("Cannot execute configuration steps: Missing required context (modlist_dir, appid, game_var, steamdeck status).")
|
||||
print("Error: Missing required information to start configuration.")
|
||||
return False
|
||||
except Exception as e:
|
||||
self.logger.error(f"Exception in _execute_configuration_steps initialization: {e}", exc_info=True)
|
||||
return False
|
||||
|
||||
# Step 1: Set protontricks permissions
|
||||
@@ -672,25 +689,6 @@ class ModlistHandler:
|
||||
return False
|
||||
self.logger.info("Step 3: Curated user.reg.modlist and system.reg.modlist applied successfully.")
|
||||
|
||||
# Step 3.5: Apply universal dotnet4.x compatibility registry fixes
|
||||
if status_callback:
|
||||
status_callback(f"{self._get_progress_timestamp()} Applying universal dotnet4.x compatibility fixes")
|
||||
self.logger.info("Step 3.5: Applying universal dotnet4.x compatibility registry fixes...")
|
||||
registry_success = False
|
||||
try:
|
||||
registry_success = self._apply_universal_dotnet_fixes()
|
||||
except Exception as e:
|
||||
self.logger.error(f"CRITICAL: Registry fixes failed - modlist may have .NET compatibility issues: {e}")
|
||||
registry_success = False
|
||||
|
||||
if not registry_success:
|
||||
self.logger.error("=" * 80)
|
||||
self.logger.error("WARNING: Universal dotnet4.x registry fixes FAILED!")
|
||||
self.logger.error("This modlist may experience .NET Framework compatibility issues.")
|
||||
self.logger.error("Consider manually setting mscoree=native in winecfg if problems occur.")
|
||||
self.logger.error("=" * 80)
|
||||
# Continue but user should be aware of potential issues
|
||||
|
||||
# Step 4: Install Wine Components
|
||||
if status_callback:
|
||||
status_callback(f"{self._get_progress_timestamp()} Installing Wine components (this may take a while)")
|
||||
@@ -712,18 +710,23 @@ class ModlistHandler:
|
||||
target_appid = self.appid
|
||||
|
||||
# Use user's preferred component installation method (respects settings toggle)
|
||||
self.logger.debug(f"Getting WINEPREFIX for AppID {target_appid}...")
|
||||
wineprefix = self.protontricks_handler.get_wine_prefix_path(target_appid)
|
||||
if not wineprefix:
|
||||
self.logger.error("Failed to get WINEPREFIX path for component installation.")
|
||||
print("Error: Could not determine wine prefix location.")
|
||||
return False
|
||||
self.logger.debug(f"WINEPREFIX obtained: {wineprefix}")
|
||||
|
||||
# Use the winetricks handler which respects the user's toggle setting
|
||||
try:
|
||||
self.logger.info("Installing Wine components using user's preferred method...")
|
||||
success = self.winetricks_handler.install_wine_components(wineprefix, self.game_var_full, specific_components=components)
|
||||
self.logger.debug(f"Calling winetricks_handler.install_wine_components with wineprefix={wineprefix}, game_var={self.game_var_full}, components={components}")
|
||||
success = self.winetricks_handler.install_wine_components(wineprefix, self.game_var_full, specific_components=components, status_callback=status_callback)
|
||||
if success:
|
||||
self.logger.info("Wine component installation completed successfully")
|
||||
if status_callback:
|
||||
status_callback(f"{self._get_progress_timestamp()} Wine components verified and installed successfully")
|
||||
else:
|
||||
self.logger.error("Wine component installation failed")
|
||||
print("Error: Failed to install necessary Wine components.")
|
||||
@@ -734,6 +737,39 @@ class ModlistHandler:
|
||||
return False
|
||||
self.logger.info("Step 4: Installing Wine components... Done")
|
||||
|
||||
# Step 4.5: Apply universal dotnet4.x compatibility registry fixes AFTER wine components
|
||||
# This ensures the fixes are not overwritten by component installation processes
|
||||
if status_callback:
|
||||
status_callback(f"{self._get_progress_timestamp()} Applying universal dotnet4.x compatibility fixes")
|
||||
self.logger.info("Step 4.5: Applying universal dotnet4.x compatibility registry fixes...")
|
||||
registry_success = False
|
||||
try:
|
||||
registry_success = self._apply_universal_dotnet_fixes()
|
||||
except Exception as e:
|
||||
self.logger.error(f"CRITICAL: Registry fixes failed - modlist may have .NET compatibility issues: {e}")
|
||||
registry_success = False
|
||||
|
||||
if not registry_success:
|
||||
self.logger.error("=" * 80)
|
||||
self.logger.error("WARNING: Universal dotnet4.x registry fixes FAILED!")
|
||||
self.logger.error("This modlist may experience .NET Framework compatibility issues.")
|
||||
self.logger.error("Consider manually setting mscoree=native in winecfg if problems occur.")
|
||||
self.logger.error("=" * 80)
|
||||
# Continue but user should be aware of potential issues
|
||||
|
||||
# Step 4.6: Enable dotfiles visibility for Wine prefix
|
||||
if status_callback:
|
||||
status_callback(f"{self._get_progress_timestamp()} Enabling dotfiles visibility")
|
||||
self.logger.info("Step 4.6: Enabling dotfiles visibility in Wine prefix...")
|
||||
try:
|
||||
if self.protontricks_handler.enable_dotfiles(self.appid):
|
||||
self.logger.info("Dotfiles visibility enabled successfully")
|
||||
else:
|
||||
self.logger.warning("Failed to enable dotfiles visibility (non-critical, continuing)")
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Error enabling dotfiles visibility: {e} (non-critical, continuing)")
|
||||
self.logger.info("Step 4.6: Enabling dotfiles visibility... Done")
|
||||
|
||||
# Step 5: Ensure permissions of Modlist directory
|
||||
if status_callback:
|
||||
status_callback(f"{self._get_progress_timestamp()} Setting ownership and permissions for modlist directory")
|
||||
@@ -812,6 +848,15 @@ class ModlistHandler:
|
||||
# Conditionally update binary and working directory paths
|
||||
# Skip for jackify-engine workflows since paths are already correct
|
||||
# Exception: Always run for SD card installs to fix Z:/run/media/... to D:/... paths
|
||||
|
||||
# DEBUG: Add comprehensive logging to identify Steam Deck SD card path manipulation issues
|
||||
engine_installed = getattr(self, 'engine_installed', False)
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] ModlistHandler instance: id={id(self)}")
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] engine_installed: {engine_installed}")
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] modlist_sdcard: {self.modlist_sdcard}")
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] steamdeck parameter passed to constructor: {getattr(self, 'steamdeck', 'NOT_SET')}")
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] Path manipulation condition: not {engine_installed} or {self.modlist_sdcard} = {not engine_installed or self.modlist_sdcard}")
|
||||
|
||||
if not getattr(self, 'engine_installed', False) or self.modlist_sdcard:
|
||||
# Convert steamapps/common path to library root path
|
||||
steam_libraries = None
|
||||
@@ -831,7 +876,8 @@ class ModlistHandler:
|
||||
print("Error: Failed to update binary and working directory paths in ModOrganizer.ini.")
|
||||
return False # Abort on failure
|
||||
else:
|
||||
self.logger.debug("Skipping path manipulation - jackify-engine already set correct paths in ModOrganizer.ini")
|
||||
self.logger.debug("[SD_CARD_DEBUG] Skipping path manipulation - jackify-engine already set correct paths in ModOrganizer.ini")
|
||||
self.logger.debug(f"[SD_CARD_DEBUG] SKIPPED because: engine_installed={engine_installed} and modlist_sdcard={self.modlist_sdcard}")
|
||||
self.logger.info("Step 8: Updating ModOrganizer.ini paths... Done")
|
||||
|
||||
# Step 9: Update Resolution Settings (if applicable)
|
||||
@@ -881,16 +927,25 @@ class ModlistHandler:
|
||||
if self.steam_library and self.game_var_full:
|
||||
vanilla_game_dir = str(Path(self.steam_library) / "steamapps" / "common" / self.game_var_full)
|
||||
|
||||
if not self.path_handler.create_dxvk_conf(
|
||||
dxvk_created = self.path_handler.create_dxvk_conf(
|
||||
modlist_dir=self.modlist_dir,
|
||||
modlist_sdcard=self.modlist_sdcard,
|
||||
steam_library=str(self.steam_library) if self.steam_library else None, # Pass as string or None
|
||||
basegame_sdcard=self.basegame_sdcard,
|
||||
game_var_full=self.game_var_full,
|
||||
vanilla_game_dir=vanilla_game_dir
|
||||
):
|
||||
self.logger.warning("Failed to create dxvk.conf file.")
|
||||
print("Warning: Failed to create dxvk.conf file.")
|
||||
vanilla_game_dir=vanilla_game_dir,
|
||||
stock_game_path=self.stock_game_path
|
||||
)
|
||||
dxvk_verified = self.path_handler.verify_dxvk_conf_exists(
|
||||
modlist_dir=self.modlist_dir,
|
||||
steam_library=str(self.steam_library) if self.steam_library else None,
|
||||
game_var_full=self.game_var_full,
|
||||
vanilla_game_dir=vanilla_game_dir,
|
||||
stock_game_path=self.stock_game_path
|
||||
)
|
||||
if not dxvk_created or not dxvk_verified:
|
||||
self.logger.warning("DXVK configuration file is missing or incomplete after post-install steps.")
|
||||
print("Warning: Failed to verify dxvk.conf file (required for AMD GPUs).")
|
||||
self.logger.info("Step 10: Creating dxvk.conf... Done")
|
||||
|
||||
# Step 11a: Small Tasks - Delete Incompatible Plugins
|
||||
@@ -1501,14 +1556,18 @@ class ModlistHandler:
|
||||
return False
|
||||
|
||||
def _apply_universal_dotnet_fixes(self):
|
||||
"""Apply universal dotnet4.x compatibility registry fixes to ALL modlists"""
|
||||
"""
|
||||
Apply universal dotnet4.x compatibility registry fixes to ALL modlists.
|
||||
Now called AFTER wine component installation to prevent overwrites.
|
||||
Includes wineserver shutdown/flush to ensure persistence.
|
||||
"""
|
||||
try:
|
||||
prefix_path = os.path.join(str(self.compat_data_path), "pfx")
|
||||
if not os.path.exists(prefix_path):
|
||||
self.logger.warning(f"Prefix path not found: {prefix_path}")
|
||||
return False
|
||||
|
||||
self.logger.info("Applying universal dotnet4.x compatibility registry fixes...")
|
||||
self.logger.info("Applying universal dotnet4.x compatibility registry fixes (post-component installation)...")
|
||||
|
||||
# Find the appropriate Wine binary to use for registry operations
|
||||
wine_binary = self._find_wine_binary_for_registry()
|
||||
@@ -1516,11 +1575,27 @@ class ModlistHandler:
|
||||
self.logger.error("Could not find Wine binary for registry operations")
|
||||
return False
|
||||
|
||||
# Find wineserver binary for flushing registry changes
|
||||
wine_dir = os.path.dirname(wine_binary)
|
||||
wineserver_binary = os.path.join(wine_dir, 'wineserver')
|
||||
if not os.path.exists(wineserver_binary):
|
||||
self.logger.warning(f"wineserver not found at {wineserver_binary}, registry flush may not work")
|
||||
wineserver_binary = None
|
||||
|
||||
# Set environment for Wine registry operations
|
||||
env = os.environ.copy()
|
||||
env['WINEPREFIX'] = prefix_path
|
||||
env['WINEDEBUG'] = '-all' # Suppress Wine debug output
|
||||
|
||||
# Shutdown any running wineserver processes to ensure clean slate
|
||||
if wineserver_binary:
|
||||
self.logger.debug("Shutting down wineserver before applying registry fixes...")
|
||||
try:
|
||||
subprocess.run([wineserver_binary, '-w'], env=env, timeout=30, capture_output=True)
|
||||
self.logger.debug("Wineserver shutdown complete")
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Wineserver shutdown failed (non-critical): {e}")
|
||||
|
||||
# Registry fix 1: Set mscoree=native DLL override
|
||||
# This tells Wine to use native .NET runtime instead of Wine's implementation
|
||||
self.logger.debug("Setting mscoree=native DLL override...")
|
||||
@@ -1530,7 +1605,7 @@ class ModlistHandler:
|
||||
'/v', 'mscoree', '/t', 'REG_SZ', '/d', 'native', '/f'
|
||||
]
|
||||
|
||||
result1 = subprocess.run(cmd1, env=env, capture_output=True, text=True)
|
||||
result1 = subprocess.run(cmd1, env=env, capture_output=True, text=True, errors='replace', timeout=30)
|
||||
if result1.returncode == 0:
|
||||
self.logger.info("Successfully applied mscoree=native DLL override")
|
||||
else:
|
||||
@@ -1545,18 +1620,57 @@ class ModlistHandler:
|
||||
'/v', 'OnlyUseLatestCLR', '/t', 'REG_DWORD', '/d', '1', '/f'
|
||||
]
|
||||
|
||||
result2 = subprocess.run(cmd2, env=env, capture_output=True, text=True)
|
||||
result2 = subprocess.run(cmd2, env=env, capture_output=True, text=True, errors='replace', timeout=30)
|
||||
if result2.returncode == 0:
|
||||
self.logger.info("Successfully applied OnlyUseLatestCLR=1 registry entry")
|
||||
else:
|
||||
self.logger.warning(f"Failed to set OnlyUseLatestCLR: {result2.stderr}")
|
||||
|
||||
# Both fixes applied - this should eliminate dotnet4.x installation requirements
|
||||
if result1.returncode == 0 and result2.returncode == 0:
|
||||
self.logger.info("Universal dotnet4.x compatibility fixes applied successfully")
|
||||
# Force wineserver to flush registry changes to disk
|
||||
if wineserver_binary:
|
||||
self.logger.debug("Flushing registry changes to disk via wineserver shutdown...")
|
||||
try:
|
||||
subprocess.run([wineserver_binary, '-w'], env=env, timeout=30, capture_output=True)
|
||||
self.logger.debug("Registry changes flushed to disk")
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Registry flush failed (non-critical): {e}")
|
||||
|
||||
# VERIFICATION: Confirm the registry entries persisted
|
||||
self.logger.info("Verifying registry entries were applied and persisted...")
|
||||
verification_passed = True
|
||||
|
||||
# Verify mscoree=native
|
||||
verify_cmd1 = [
|
||||
wine_binary, 'reg', 'query',
|
||||
'HKEY_CURRENT_USER\\Software\\Wine\\DllOverrides',
|
||||
'/v', 'mscoree'
|
||||
]
|
||||
verify_result1 = subprocess.run(verify_cmd1, env=env, capture_output=True, text=True, errors='replace', timeout=30)
|
||||
if verify_result1.returncode == 0 and 'native' in verify_result1.stdout:
|
||||
self.logger.info("VERIFIED: mscoree=native is set correctly")
|
||||
else:
|
||||
self.logger.error(f"VERIFICATION FAILED: mscoree=native not found in registry. Query output: {verify_result1.stdout}")
|
||||
verification_passed = False
|
||||
|
||||
# Verify OnlyUseLatestCLR=1
|
||||
verify_cmd2 = [
|
||||
wine_binary, 'reg', 'query',
|
||||
'HKEY_LOCAL_MACHINE\\Software\\Microsoft\\.NETFramework',
|
||||
'/v', 'OnlyUseLatestCLR'
|
||||
]
|
||||
verify_result2 = subprocess.run(verify_cmd2, env=env, capture_output=True, text=True, errors='replace', timeout=30)
|
||||
if verify_result2.returncode == 0 and ('0x1' in verify_result2.stdout or 'REG_DWORD' in verify_result2.stdout):
|
||||
self.logger.info("VERIFIED: OnlyUseLatestCLR=1 is set correctly")
|
||||
else:
|
||||
self.logger.error(f"VERIFICATION FAILED: OnlyUseLatestCLR=1 not found in registry. Query output: {verify_result2.stdout}")
|
||||
verification_passed = False
|
||||
|
||||
# Both fixes applied and verified
|
||||
if result1.returncode == 0 and result2.returncode == 0 and verification_passed:
|
||||
self.logger.info("Universal dotnet4.x compatibility fixes applied, flushed, and verified successfully")
|
||||
return True
|
||||
else:
|
||||
self.logger.warning("Some dotnet4.x registry fixes failed, but continuing...")
|
||||
self.logger.error("Registry fixes failed verification - fixes may not persist across prefix restarts")
|
||||
return False
|
||||
|
||||
except Exception as e:
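For reference, the registry work above reduces to two `reg add` calls bracketed by `wineserver -w` waits so the changes are flushed to `user.reg`/`system.reg`. A minimal standalone sketch (binary and prefix paths are placeholders; this is not the handler's exact code):

```python
import os
import subprocess

def apply_dotnet_overrides(wine_binary: str, wineserver_binary: str, prefix_path: str) -> bool:
    """Sketch: set mscoree=native and OnlyUseLatestCLR=1 in a Wine prefix, then flush."""
    env = os.environ.copy()
    env["WINEPREFIX"] = prefix_path
    env["WINEDEBUG"] = "-all"

    # Let any running wineserver for this prefix exit first (clean slate)
    subprocess.run([wineserver_binary, "-w"], env=env, timeout=30, capture_output=True)

    commands = [
        # DLL override: use the native .NET runtime instead of Wine's mscoree
        [wine_binary, "reg", "add", r"HKEY_CURRENT_USER\Software\Wine\DllOverrides",
         "/v", "mscoree", "/t", "REG_SZ", "/d", "native", "/f"],
        # Force the latest CLR so dotnet4.x installers are not required
        [wine_binary, "reg", "add", r"HKEY_LOCAL_MACHINE\Software\Microsoft\.NETFramework",
         "/v", "OnlyUseLatestCLR", "/t", "REG_DWORD", "/d", "1", "/f"],
    ]
    ok = all(
        subprocess.run(cmd, env=env, capture_output=True, text=True,
                       errors="replace", timeout=30).returncode == 0
        for cmd in commands
    )

    # Flush registry changes to disk before anything else touches the prefix
    subprocess.run([wineserver_binary, "-w"], env=env, timeout=30, capture_output=True)
    return ok
```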
@@ -49,7 +49,7 @@ logger = logging.getLogger(__name__) # Standard logger init
|
||||
# Helper function to get path to jackify-install-engine
|
||||
def get_jackify_engine_path():
|
||||
if getattr(sys, 'frozen', False) and hasattr(sys, '_MEIPASS'):
|
||||
# Running in a PyInstaller bundle
|
||||
# Running inside the bundled AppImage (frozen)
|
||||
# Engine is expected at <bundle_root>/jackify/engine/jackify-engine
|
||||
return os.path.join(sys._MEIPASS, 'jackify', 'engine', 'jackify-engine')
|
||||
else:
|
||||
@@ -408,51 +408,76 @@ class ModlistInstallCLI:
|
||||
self.context['download_dir'] = download_dir_path
|
||||
self.logger.debug(f"Download directory context set to: {self.context['download_dir']}")
|
||||
|
||||
# 5. Prompt for Nexus API key (skip if in context)
|
||||
# 5. Get Nexus authentication (OAuth or API key)
|
||||
if 'nexus_api_key' not in self.context:
|
||||
from jackify.backend.services.api_key_service import APIKeyService
|
||||
api_key_service = APIKeyService()
|
||||
saved_key = api_key_service.get_saved_api_key()
|
||||
api_key = None
|
||||
if saved_key:
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_INFO}A Nexus API Key is already saved.{COLOR_RESET}")
|
||||
use_saved = input(f"{COLOR_PROMPT}Use the saved API key? [Y/n]: {COLOR_RESET}").strip().lower()
|
||||
if use_saved in ('', 'y', 'yes'):
|
||||
api_key = saved_key
|
||||
from jackify.backend.services.nexus_auth_service import NexusAuthService
|
||||
auth_service = NexusAuthService()
|
||||
|
||||
# Get current auth status
|
||||
authenticated, method, username = auth_service.get_auth_status()
|
||||
|
||||
if authenticated:
|
||||
# Already authenticated - use existing auth
|
||||
if method == 'oauth':
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_SUCCESS}Nexus Authentication: Authorized via OAuth{COLOR_RESET}")
|
||||
if username:
|
||||
print(f"{COLOR_INFO}Logged in as: {username}{COLOR_RESET}")
|
||||
elif method == 'api_key':
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_INFO}Nexus Authentication: Using API Key (Legacy){COLOR_RESET}")
|
||||
|
||||
# Get valid token/key
|
||||
api_key = auth_service.ensure_valid_auth()
|
||||
if api_key:
|
||||
self.context['nexus_api_key'] = api_key
|
||||
else:
|
||||
new_key = input(f"{COLOR_PROMPT}Enter a new Nexus API Key (or press Enter to keep the saved one): {COLOR_RESET}").strip()
|
||||
if new_key:
|
||||
api_key = new_key
|
||||
replace = input(f"{COLOR_PROMPT}Replace the saved key with this one? [y/N]: {COLOR_RESET}").strip().lower()
|
||||
if replace == 'y':
|
||||
if api_key_service.save_api_key(api_key):
|
||||
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
|
||||
else:
|
||||
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
|
||||
# Auth expired or invalid - prompt to set up
|
||||
print(f"\n{COLOR_WARNING}Your authentication has expired or is invalid.{COLOR_RESET}")
|
||||
authenticated = False
|
||||
|
||||
if not authenticated:
|
||||
# Not authenticated - offer to set up OAuth
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_WARNING}Nexus Mods authentication is required for downloading mods.{COLOR_RESET}")
|
||||
print(f"\n{COLOR_PROMPT}Would you like to authorize with Nexus now?{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}This will open your browser for secure OAuth authorization.{COLOR_RESET}")
|
||||
|
||||
authorize = input(f"{COLOR_PROMPT}Authorize now? [Y/n]: {COLOR_RESET}").strip().lower()
|
||||
|
||||
if authorize in ('', 'y', 'yes'):
|
||||
# Launch OAuth authorization
|
||||
print(f"\n{COLOR_INFO}Starting OAuth authorization...{COLOR_RESET}")
|
||||
print(f"{COLOR_WARNING}Your browser will open shortly.{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}Note: Your browser may ask permission to open 'xdg-open' or{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}Jackify's protocol handler - please click 'Open' or 'Allow'.{COLOR_RESET}")
|
||||
|
||||
def show_message(msg):
|
||||
print(f"\n{COLOR_INFO}{msg}{COLOR_RESET}")
|
||||
|
||||
success = auth_service.authorize_oauth(show_browser_message_callback=show_message)
|
||||
|
||||
if success:
|
||||
print(f"\n{COLOR_SUCCESS}OAuth authorization successful!{COLOR_RESET}")
|
||||
_, _, username = auth_service.get_auth_status()
|
||||
if username:
|
||||
print(f"{COLOR_INFO}Authorized as: {username}{COLOR_RESET}")
|
||||
|
||||
api_key = auth_service.ensure_valid_auth()
|
||||
if api_key:
|
||||
self.context['nexus_api_key'] = api_key
|
||||
else:
|
||||
print(f"{COLOR_INFO}Using new key for this session only. Saved key unchanged.{COLOR_RESET}")
|
||||
print(f"{COLOR_ERROR}Failed to retrieve auth token after authorization.{COLOR_RESET}")
|
||||
return None
|
||||
else:
|
||||
api_key = saved_key
|
||||
else:
|
||||
print("\n" + "-" * 28)
|
||||
print(f"{COLOR_INFO}A Nexus Mods API key is required for downloading mods.{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}You can get your personal key at: {COLOR_SELECTION}https://www.nexusmods.com/users/myaccount?tab=api{COLOR_RESET}")
|
||||
print(f"{COLOR_WARNING}Your API Key is NOT saved locally. It is used only for this session unless you choose to save it.{COLOR_RESET}")
|
||||
api_key = input(f"{COLOR_PROMPT}Enter Nexus API Key (or 'q' to cancel): {COLOR_RESET}").strip()
|
||||
if not api_key or api_key.lower() == 'q':
|
||||
self.logger.info("User cancelled or provided no API key.")
|
||||
return None
|
||||
save = input(f"{COLOR_PROMPT}Would you like to save this API key for future use? [y/N]: {COLOR_RESET}").strip().lower()
|
||||
if save == 'y':
|
||||
if api_key_service.save_api_key(api_key):
|
||||
print(f"{COLOR_SUCCESS}API key saved successfully.{COLOR_RESET}")
|
||||
else:
|
||||
print(f"{COLOR_WARNING}Failed to save API key. Using for this session only.{COLOR_RESET}")
|
||||
print(f"\n{COLOR_ERROR}OAuth authorization failed.{COLOR_RESET}")
|
||||
return None
|
||||
else:
|
||||
print(f"{COLOR_INFO}Using API key for this session only. It will not be saved.{COLOR_RESET}")
|
||||
self.context['nexus_api_key'] = api_key
|
||||
self.logger.debug(f"NEXUS_API_KEY is set in environment for engine (presence check).")
|
||||
# User declined OAuth - cancelled
|
||||
print(f"\n{COLOR_INFO}Authorization required to proceed. Installation cancelled.{COLOR_RESET}")
|
||||
self.logger.info("User declined Nexus authorization.")
|
||||
return None
|
||||
self.logger.debug(f"Nexus authentication configured for engine.")
|
||||
|
||||
# Display summary and confirm
|
||||
self._display_summary() # Ensure this method exists or implement it
|
||||
@@ -501,11 +526,23 @@ class ModlistInstallCLI:
|
||||
if isinstance(download_dir_display, tuple):
|
||||
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
|
||||
print(f"Download Directory: {download_dir_display}")
|
||||
|
||||
if self.context.get('nexus_api_key'):
|
||||
print(f"Nexus API Key: [SET]")
|
||||
|
||||
# Show authentication method
|
||||
from jackify.backend.services.nexus_auth_service import NexusAuthService
|
||||
auth_service = NexusAuthService()
|
||||
authenticated, method, username = auth_service.get_auth_status()
|
||||
|
||||
if method == 'oauth':
|
||||
auth_display = f"Nexus Authentication: OAuth"
|
||||
if username:
|
||||
auth_display += f" ({username})"
|
||||
elif method == 'api_key':
|
||||
auth_display = "Nexus Authentication: API Key (Legacy)"
|
||||
else:
|
||||
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
|
||||
# Should never reach here since we validate auth before getting to summary
|
||||
auth_display = "Nexus Authentication: Unknown"
|
||||
|
||||
print(auth_display)
|
||||
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
|
||||
|
||||
def configuration_phase(self):
|
||||
@@ -521,7 +558,8 @@ class ModlistInstallCLI:
|
||||
start_time = time.time()
|
||||
|
||||
# --- BEGIN: TEE LOGGING SETUP & LOG ROTATION ---
|
||||
log_dir = Path.home() / "Jackify" / "logs"
|
||||
from jackify.shared.paths import get_jackify_logs_dir
|
||||
log_dir = get_jackify_logs_dir()
|
||||
log_dir.mkdir(parents=True, exist_ok=True)
|
||||
workflow_log_path = log_dir / "Modlist_Install_workflow.log"
|
||||
# Log rotation: keep last 3 logs, 1MB each (adjust as needed)
|
||||
@@ -597,8 +635,8 @@ class ModlistInstallCLI:
|
||||
# --- End Patch ---
|
||||
|
||||
# Build command
|
||||
cmd = [engine_path, 'install']
|
||||
|
||||
cmd = [engine_path, 'install', '--show-file-progress']
|
||||
|
||||
# Check for debug mode and pass --debug to engine if needed
|
||||
from jackify.backend.handlers.config_handler import ConfigHandler
|
||||
config_handler = ConfigHandler()
|
||||
@@ -606,7 +644,7 @@ class ModlistInstallCLI:
|
||||
if debug_mode:
|
||||
cmd.append('--debug')
|
||||
self.logger.info("Debug mode enabled in config - passing --debug flag to jackify-engine")
|
||||
|
||||
|
||||
# Determine if this is a local .wabbajack file or an online modlist
|
||||
modlist_value = self.context.get('modlist_value')
|
||||
machineid = self.context.get('machineid')
|
||||
@@ -667,8 +705,10 @@ class ModlistInstallCLI:
|
||||
else:
|
||||
self.logger.warning(f"File descriptor limit: {message}")
|
||||
|
||||
# Popen now inherits the modified os.environ because env=None
|
||||
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)
|
||||
# Use cleaned environment to prevent AppImage variable inheritance
|
||||
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
|
||||
clean_env = get_clean_subprocess_env()
|
||||
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=clean_env, cwd=engine_dir)
|
||||
|
||||
# Start performance monitoring for the engine process
|
||||
# Adjust monitoring based on debug mode
|
||||
@@ -945,6 +985,9 @@ class ModlistInstallCLI:
|
||||
|
||||
if configuration_success:
|
||||
self.logger.info("Post-installation configuration completed successfully")
|
||||
|
||||
# Check for TTW integration eligibility
|
||||
self._check_and_prompt_ttw_integration(install_dir_str, detected_game, modlist_name)
|
||||
else:
|
||||
self.logger.warning("Post-installation configuration had issues")
|
||||
else:
|
||||
@@ -1098,11 +1141,23 @@ class ModlistInstallCLI:
|
||||
if isinstance(download_dir_display, tuple):
|
||||
download_dir_display = download_dir_display[0] # Get the Path object from (Path, bool)
|
||||
print(f"Download Directory: {download_dir_display}")
|
||||
|
||||
if self.context.get('nexus_api_key'):
|
||||
print(f"Nexus API Key: [SET]")
|
||||
|
||||
# Show authentication method
|
||||
from jackify.backend.services.nexus_auth_service import NexusAuthService
|
||||
auth_service = NexusAuthService()
|
||||
authenticated, method, username = auth_service.get_auth_status()
|
||||
|
||||
if method == 'oauth':
|
||||
auth_display = f"Nexus Authentication: OAuth"
|
||||
if username:
|
||||
auth_display += f" ({username})"
|
||||
elif method == 'api_key':
|
||||
auth_display = "Nexus Authentication: API Key (Legacy)"
|
||||
else:
|
||||
print(f"Nexus API Key: [NOT SET - WILL LIKELY FAIL]")
|
||||
# Should never reach here since we validate auth before getting to summary
|
||||
auth_display = "Nexus Authentication: Unknown"
|
||||
|
||||
print(auth_display)
|
||||
print(f"{COLOR_INFO}----------------------------------------{COLOR_RESET}")
|
||||
|
||||
def _enhance_nexus_error(self, line: str) -> str:
|
||||
@@ -1134,5 +1189,173 @@ class ModlistInstallCLI:
|
||||
|
||||
# Add URL on next line for easier debugging
|
||||
return f"{line}\n Nexus URL: {mod_url}"
|
||||
|
||||
return line
|
||||
|
||||
return line
|
||||
|
||||
def _check_and_prompt_ttw_integration(self, install_dir: str, game_type: str, modlist_name: str):
|
||||
"""Check if modlist is eligible for TTW integration and prompt user"""
|
||||
try:
|
||||
# Check eligibility: FNV game, TTW-compatible modlist, no existing TTW
|
||||
if not self._is_ttw_eligible(install_dir, game_type, modlist_name):
|
||||
return
|
||||
|
||||
# Prompt user for TTW installation
|
||||
print(f"\n{COLOR_PROMPT}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}TTW Integration Available{COLOR_RESET}")
|
||||
print(f"{COLOR_PROMPT}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
|
||||
print(f"\nThis modlist ({modlist_name}) supports Tale of Two Wastelands (TTW).")
|
||||
print(f"TTW combines Fallout 3 and New Vegas into a single game.")
|
||||
print(f"\nWould you like to install TTW now?")
|
||||
|
||||
user_input = input(f"{COLOR_PROMPT}Install TTW? (yes/no): {COLOR_RESET}").strip().lower()
|
||||
|
||||
if user_input in ['yes', 'y']:
|
||||
self._launch_ttw_installation(modlist_name, install_dir)
|
||||
else:
|
||||
print(f"{COLOR_INFO}Skipping TTW installation. You can install it later from the main menu.{COLOR_RESET}")
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error during TTW eligibility check: {e}", exc_info=True)
|
||||
|
||||
def _is_ttw_eligible(self, install_dir: str, game_type: str, modlist_name: str) -> bool:
|
||||
"""Check if modlist is eligible for TTW integration"""
|
||||
try:
|
||||
from pathlib import Path
|
||||
|
||||
# Check 1: Must be Fallout New Vegas
|
||||
if not game_type or game_type.lower() not in ['falloutnv', 'fallout new vegas', 'fallout_new_vegas']:
|
||||
return False
|
||||
|
||||
# Check 2: Must be on TTW compatibility whitelist
|
||||
from jackify.backend.data.ttw_compatible_modlists import is_ttw_compatible
|
||||
if not is_ttw_compatible(modlist_name):
|
||||
return False
|
||||
|
||||
# Check 3: TTW must not already be installed
|
||||
if self._detect_existing_ttw(install_dir):
|
||||
self.logger.info(f"TTW already installed in {install_dir}, skipping prompt")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error checking TTW eligibility: {e}")
|
||||
return False
|
||||
|
||||
def _detect_existing_ttw(self, install_dir: str) -> bool:
|
||||
"""Detect if TTW is already installed in the modlist"""
|
||||
try:
|
||||
from pathlib import Path
|
||||
|
||||
install_path = Path(install_dir)
|
||||
|
||||
# Search for TTW indicators in common locations
|
||||
search_paths = [
|
||||
install_path,
|
||||
install_path / "mods",
|
||||
install_path / "Stock Game",
|
||||
install_path / "Game Root"
|
||||
]
|
||||
|
||||
for search_path in search_paths:
|
||||
if not search_path.exists():
|
||||
continue
|
||||
|
||||
# Look for folders containing "tale" and "two" and "wastelands"
|
||||
for folder in search_path.iterdir():
|
||||
if not folder.is_dir():
|
||||
continue
|
||||
|
||||
folder_name_lower = folder.name.lower()
|
||||
if all(keyword in folder_name_lower for keyword in ['tale', 'two', 'wastelands']):
|
||||
# Verify it has the TTW ESM file
|
||||
for file in folder.rglob('*.esm'):
|
||||
if 'taleoftwowastelands' in file.name.lower():
|
||||
self.logger.info(f"Found existing TTW installation: {file}")
|
||||
return True
|
||||
|
||||
return False
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error detecting existing TTW: {e}")
|
||||
return False
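Condensed, the detection heuristic above amounts to the following sketch (single directory level shown, illustrative only):

```python
from pathlib import Path

def contains_ttw_install(root: str) -> bool:
    """Sketch: a 'Tale of Two Wastelands' folder that actually holds the TTW .esm."""
    for folder in Path(root).iterdir():
        if folder.is_dir() and all(k in folder.name.lower() for k in ("tale", "two", "wastelands")):
            if any("taleoftwowastelands" in f.name.lower() for f in folder.rglob("*.esm")):
                return True
    return False
```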
|
||||
|
||||
def _launch_ttw_installation(self, modlist_name: str, install_dir: str):
|
||||
"""Launch TTW installation workflow"""
|
||||
try:
|
||||
print(f"\n{COLOR_INFO}Starting TTW installation workflow...{COLOR_RESET}")
|
||||
|
||||
# Import TTW installation handler
|
||||
from jackify.backend.handlers.ttw_installer_handler import TTWInstallerHandler
|
||||
from jackify.backend.models.configuration import SystemInfo
|
||||
from pathlib import Path
|
||||
|
||||
system_info = SystemInfo()
|
||||
ttw_installer_handler = TTWInstallerHandler(
|
||||
steamdeck=system_info.is_steamdeck if hasattr(system_info, 'is_steamdeck') else False,
|
||||
verbose=self.verbose if hasattr(self, 'verbose') else False,
|
||||
filesystem_handler=self.filesystem_handler if hasattr(self, 'filesystem_handler') else None,
|
||||
config_handler=self.config_handler if hasattr(self, 'config_handler') else None
|
||||
)
|
||||
|
||||
# Check if TTW_Linux_Installer is installed
|
||||
ttw_installer_handler._check_installation()
|
||||
|
||||
if not ttw_installer_handler.ttw_installer_installed:
|
||||
print(f"{COLOR_INFO}TTW_Linux_Installer is not installed.{COLOR_RESET}")
|
||||
user_input = input(f"{COLOR_PROMPT}Install TTW_Linux_Installer? (yes/no): {COLOR_RESET}").strip().lower()
|
||||
|
||||
if user_input not in ['yes', 'y']:
|
||||
print(f"{COLOR_INFO}TTW installation cancelled.{COLOR_RESET}")
|
||||
return
|
||||
|
||||
# Install TTW_Linux_Installer
|
||||
print(f"{COLOR_INFO}Installing TTW_Linux_Installer...{COLOR_RESET}")
|
||||
success, message = ttw_installer_handler.install_ttw_installer()
|
||||
|
||||
if not success:
|
||||
print(f"{COLOR_ERROR}Failed to install TTW_Linux_Installer: {message}{COLOR_RESET}")
|
||||
return
|
||||
|
||||
print(f"{COLOR_INFO}TTW_Linux_Installer installed successfully.{COLOR_RESET}")
|
||||
|
||||
# Prompt for TTW .mpi file
|
||||
print(f"\n{COLOR_PROMPT}TTW Installer File (.mpi){COLOR_RESET}")
|
||||
mpi_path = input(f"{COLOR_PROMPT}Path to TTW .mpi file: {COLOR_RESET}").strip()
|
||||
if not mpi_path:
|
||||
print(f"{COLOR_WARNING}No .mpi file specified. Cancelling.{COLOR_RESET}")
|
||||
return
|
||||
|
||||
mpi_path = Path(mpi_path).expanduser()
|
||||
if not mpi_path.exists() or not mpi_path.is_file():
|
||||
print(f"{COLOR_ERROR}TTW .mpi file not found: {mpi_path}{COLOR_RESET}")
|
||||
return
|
||||
|
||||
# Prompt for TTW installation directory
|
||||
print(f"\n{COLOR_PROMPT}TTW Installation Directory{COLOR_RESET}")
|
||||
default_ttw_dir = os.path.join(install_dir, 'TTW')
|
||||
print(f"Default: {default_ttw_dir}")
|
||||
ttw_install_dir = input(f"{COLOR_PROMPT}TTW install directory (Enter for default): {COLOR_RESET}").strip()
|
||||
|
||||
if not ttw_install_dir:
|
||||
ttw_install_dir = default_ttw_dir
|
||||
|
||||
# Run TTW installation
|
||||
print(f"\n{COLOR_INFO}Installing TTW using TTW_Linux_Installer...{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}This may take a while (15-30 minutes depending on your system).{COLOR_RESET}")
|
||||
|
||||
success, message = ttw_installer_handler.install_ttw_backend(Path(mpi_path), Path(ttw_install_dir))
|
||||
|
||||
if success:
|
||||
print(f"\n{COLOR_INFO}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
|
||||
print(f"{COLOR_INFO}TTW Installation Complete!{COLOR_RESET}")
|
||||
print(f"{COLOR_PROMPT}═══════════════════════════════════════════════════════════════{COLOR_RESET}")
|
||||
print(f"\nTTW has been installed to: {ttw_install_dir}")
|
||||
print(f"The modlist '{modlist_name}' is now ready to use with TTW.")
|
||||
else:
|
||||
print(f"\n{COLOR_ERROR}TTW installation failed. Check the logs for details.{COLOR_RESET}")
|
||||
print(f"{COLOR_ERROR}Error: {message}{COLOR_RESET}")
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error during TTW installation: {e}", exc_info=True)
|
||||
print(f"{COLOR_ERROR}Error during TTW installation: {e}{COLOR_RESET}")
|
||||
jackify/backend/handlers/oauth_token_handler.py (new file, 442 lines)
@@ -0,0 +1,442 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
OAuth Token Handler
|
||||
Handles encrypted storage and retrieval of OAuth tokens
|
||||
"""
|
||||
|
||||
import os
|
||||
import json
|
||||
import base64
|
||||
import hashlib
|
||||
import logging
|
||||
import time
|
||||
from typing import Optional, Dict
|
||||
from pathlib import Path
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class OAuthTokenHandler:
|
||||
"""
|
||||
Handles OAuth token storage with simple encryption
|
||||
Stores tokens in ~/.config/jackify/nexus-oauth.json
|
||||
"""
|
||||
|
||||
def __init__(self, config_dir: Optional[str] = None):
|
||||
"""
|
||||
Initialize token handler
|
||||
|
||||
Args:
|
||||
config_dir: Optional custom config directory (defaults to ~/.config/jackify)
|
||||
"""
|
||||
if config_dir:
|
||||
self.config_dir = Path(config_dir)
|
||||
else:
|
||||
self.config_dir = Path.home() / ".config" / "jackify"
|
||||
|
||||
self.token_file = self.config_dir / "nexus-oauth.json"
|
||||
|
||||
# Ensure config directory exists
|
||||
self.config_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Generate encryption key based on machine-specific data
|
||||
self._encryption_key = self._generate_encryption_key()
|
||||
|
||||
def _generate_encryption_key(self) -> bytes:
|
||||
"""
|
||||
Generate encryption key based on machine-specific data using Fernet
|
||||
|
||||
Uses hostname + username + machine ID as key material, similar to DPAPI approach.
|
||||
This provides proper symmetric encryption while remaining machine-specific.
|
||||
|
||||
Returns:
|
||||
Fernet-compatible 32-byte encryption key
|
||||
"""
|
||||
import socket
|
||||
import getpass
|
||||
|
||||
try:
|
||||
hostname = socket.gethostname()
|
||||
username = getpass.getuser()
|
||||
|
||||
# Try to get machine ID for additional entropy
|
||||
machine_id = None
|
||||
try:
|
||||
# Linux machine-id
|
||||
with open('/etc/machine-id', 'r') as f:
|
||||
machine_id = f.read().strip()
|
||||
except:
|
||||
try:
|
||||
# Alternative locations
|
||||
with open('/var/lib/dbus/machine-id', 'r') as f:
|
||||
machine_id = f.read().strip()
|
||||
except:
|
||||
pass
|
||||
|
||||
# Combine multiple sources of machine-specific data
|
||||
if machine_id:
|
||||
key_material = f"{hostname}:{username}:{machine_id}:jackify"
|
||||
else:
|
||||
key_material = f"{hostname}:{username}:jackify"
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to get machine info for encryption: {e}")
|
||||
key_material = "jackify:default:key"
|
||||
|
||||
# Generate 32-byte key using SHA256 for Fernet
|
||||
# Fernet requires base64-encoded 32-byte key
|
||||
key_bytes = hashlib.sha256(key_material.encode('utf-8')).digest()
|
||||
return base64.urlsafe_b64encode(key_bytes)
|
||||
|
||||
def _encrypt_data(self, data: str) -> str:
|
||||
"""
|
||||
Encrypt data using AES-GCM (authenticated encryption)
|
||||
|
||||
Uses pycryptodome for cross-platform compatibility.
|
||||
AES-GCM provides authenticated encryption similar to Fernet.
|
||||
|
||||
Args:
|
||||
data: Plain text data
|
||||
|
||||
Returns:
|
||||
Encrypted data as base64 string (nonce:ciphertext:tag format)
|
||||
"""
|
||||
try:
|
||||
from Crypto.Cipher import AES
|
||||
from Crypto.Random import get_random_bytes
|
||||
|
||||
# Derive 32-byte AES key from encryption_key (which is base64-encoded)
|
||||
key = base64.urlsafe_b64decode(self._encryption_key)
|
||||
|
||||
# Generate random nonce (12 bytes for GCM)
|
||||
nonce = get_random_bytes(12)
|
||||
|
||||
# Create AES-GCM cipher
|
||||
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
|
||||
|
||||
# Encrypt and get authentication tag
|
||||
data_bytes = data.encode('utf-8')
|
||||
ciphertext, tag = cipher.encrypt_and_digest(data_bytes)
|
||||
|
||||
# Combine nonce:ciphertext:tag and base64 encode
|
||||
combined = nonce + ciphertext + tag
|
||||
return base64.b64encode(combined).decode('utf-8')
|
||||
|
||||
except ImportError:
|
||||
logger.error("pycryptodome package not available for token encryption")
|
||||
return ""
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to encrypt data: {e}")
|
||||
return ""
|
||||
|
||||
def _decrypt_data(self, encrypted_data: str) -> Optional[str]:
|
||||
"""
|
||||
Decrypt data using AES-GCM (authenticated encryption)
|
||||
|
||||
Args:
|
||||
encrypted_data: Encrypted data string (base64-encoded nonce:ciphertext:tag)
|
||||
|
||||
Returns:
|
||||
Decrypted plain text or None on failure
|
||||
"""
|
||||
try:
|
||||
from Crypto.Cipher import AES
|
||||
|
||||
# Check if MODE_GCM is available (pycryptodome has it, old pycrypto doesn't)
|
||||
if not hasattr(AES, 'MODE_GCM'):
|
||||
logger.error("pycryptodome required for token decryption (pycrypto doesn't support MODE_GCM)")
|
||||
return None
|
||||
|
||||
# Derive 32-byte AES key from encryption_key
|
||||
key = base64.urlsafe_b64decode(self._encryption_key)
|
||||
|
||||
# Decode base64 and split nonce:ciphertext:tag
|
||||
combined = base64.b64decode(encrypted_data.encode('utf-8'))
|
||||
nonce = combined[:12]
|
||||
tag = combined[-16:]
|
||||
ciphertext = combined[12:-16]
|
||||
|
||||
# Create AES-GCM cipher
|
||||
cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
|
||||
|
||||
# Decrypt and verify authentication tag
|
||||
plaintext = cipher.decrypt_and_verify(ciphertext, tag)
|
||||
|
||||
return plaintext.decode('utf-8')
|
||||
|
||||
except ImportError:
|
||||
logger.error("pycryptodome package not available for token decryption")
|
||||
return None
|
||||
except AttributeError:
|
||||
logger.error("pycryptodome required for token decryption (pycrypto doesn't support MODE_GCM)")
|
||||
return None
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to decrypt data: {e}")
|
||||
return None
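The encrypt/decrypt pair above uses pycryptodome's AES-GCM with a 12-byte nonce and a 16-byte tag packed as nonce + ciphertext + tag before base64 encoding. A minimal round-trip sketch of the same layout (illustrative only; a throwaway key is used instead of the machine-derived one):

```python
import base64
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

key = get_random_bytes(32)  # the handler derives this from hostname/username/machine-id instead

def encrypt(plaintext: str) -> str:
    nonce = get_random_bytes(12)
    cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
    ciphertext, tag = cipher.encrypt_and_digest(plaintext.encode("utf-8"))
    return base64.b64encode(nonce + ciphertext + tag).decode("utf-8")

def decrypt(blob: str) -> str:
    raw = base64.b64decode(blob)
    nonce, ciphertext, tag = raw[:12], raw[12:-16], raw[-16:]
    cipher = AES.new(key, AES.MODE_GCM, nonce=nonce)
    # decrypt_and_verify raises if the tag does not match (tampered or wrong key)
    return cipher.decrypt_and_verify(ciphertext, tag).decode("utf-8")

assert decrypt(encrypt('{"access_token": "example"}')) == '{"access_token": "example"}'
```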
|
||||
|
||||
def save_token(self, token_data: Dict) -> bool:
|
||||
"""
|
||||
Save OAuth token to encrypted file with proper permissions
|
||||
|
||||
Args:
|
||||
token_data: Token data dict from OAuth response
|
||||
|
||||
Returns:
|
||||
True if saved successfully
|
||||
"""
|
||||
try:
|
||||
# Add timestamp for tracking
|
||||
token_data['_saved_at'] = int(time.time())
|
||||
|
||||
# Convert to JSON
|
||||
json_data = json.dumps(token_data, indent=2)
|
||||
|
||||
# Encrypt using AES-GCM (see _encrypt_data)
|
||||
encrypted = self._encrypt_data(json_data)
|
||||
|
||||
if not encrypted:
|
||||
logger.error("Encryption failed, cannot save token")
|
||||
return False
|
||||
|
||||
# Save to file with restricted permissions
|
||||
# Write to temp file first, then move (atomic operation)
|
||||
import tempfile
|
||||
fd, temp_path = tempfile.mkstemp(dir=self.config_dir, prefix='.oauth_tmp_')
|
||||
|
||||
try:
|
||||
with os.fdopen(fd, 'w') as f:
|
||||
json.dump({'encrypted_data': encrypted}, f, indent=2)
|
||||
|
||||
# Set restrictive permissions (owner read/write only)
|
||||
os.chmod(temp_path, 0o600)
|
||||
|
||||
# Atomic move
|
||||
os.replace(temp_path, self.token_file)
|
||||
|
||||
logger.info(f"Saved encrypted OAuth token to {self.token_file}")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
# Clean up temp file on error
|
||||
try:
|
||||
os.unlink(temp_path)
|
||||
except:
|
||||
pass
|
||||
raise e
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to save OAuth token: {e}")
|
||||
return False
|
||||
|
||||
def load_token(self) -> Optional[Dict]:
|
||||
"""
|
||||
Load OAuth token from encrypted file
|
||||
|
||||
Returns:
|
||||
Token data dict or None if not found or invalid
|
||||
"""
|
||||
if not self.token_file.exists():
|
||||
logger.debug("No OAuth token file found")
|
||||
return None
|
||||
|
||||
try:
|
||||
# Load encrypted data
|
||||
with open(self.token_file, 'r') as f:
|
||||
data = json.load(f)
|
||||
|
||||
encrypted = data.get('encrypted_data')
|
||||
if not encrypted:
|
||||
logger.error("Token file missing encrypted_data field")
|
||||
return None
|
||||
|
||||
# Decrypt
|
||||
decrypted = self._decrypt_data(encrypted)
|
||||
if not decrypted:
|
||||
logger.error("Failed to decrypt token data")
|
||||
return None
|
||||
|
||||
# Parse JSON
|
||||
token_data = json.loads(decrypted)
|
||||
|
||||
logger.debug("Successfully loaded OAuth token")
|
||||
return token_data
|
||||
|
||||
except json.JSONDecodeError as e:
|
||||
logger.error(f"Token file contains invalid JSON: {e}")
|
||||
return None
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to load OAuth token: {e}")
|
||||
return None
|
||||
|
||||
def delete_token(self) -> bool:
|
||||
"""
|
||||
Delete OAuth token file
|
||||
|
||||
Returns:
|
||||
True if deleted successfully
|
||||
"""
|
||||
try:
|
||||
if self.token_file.exists():
|
||||
self.token_file.unlink()
|
||||
logger.info("Deleted OAuth token file")
|
||||
return True
|
||||
else:
|
||||
logger.debug("No OAuth token file to delete")
|
||||
return False
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to delete OAuth token: {e}")
|
||||
return False
|
||||
|
||||
def has_token(self) -> bool:
|
||||
"""
|
||||
Check if OAuth token file exists
|
||||
|
||||
Returns:
|
||||
True if token file exists
|
||||
"""
|
||||
return self.token_file.exists()
|
||||
|
||||
def is_token_expired(self, token_data: Optional[Dict] = None, buffer_minutes: int = 5) -> bool:
|
||||
"""
|
||||
Check if token is expired or close to expiring
|
||||
|
||||
Args:
|
||||
token_data: Optional token data dict (loads from file if not provided)
|
||||
buffer_minutes: Minutes before expiry to consider token expired (default 5)
|
||||
|
||||
Returns:
|
||||
True if token is expired or will expire within buffer_minutes
|
||||
"""
|
||||
if token_data is None:
|
||||
token_data = self.load_token()
|
||||
|
||||
if not token_data:
|
||||
return True
|
||||
|
||||
# Extract OAuth data if nested
|
||||
oauth_data = token_data.get('oauth', token_data)
|
||||
|
||||
# Get expiry information
|
||||
expires_in = oauth_data.get('expires_in')
|
||||
saved_at = token_data.get('_saved_at')
|
||||
|
||||
if not expires_in or not saved_at:
|
||||
logger.debug("Token missing expiry information, assuming valid")
|
||||
return False # Assume token is valid if no expiry info
|
||||
|
||||
# Calculate expiry time
|
||||
expires_at = saved_at + expires_in
|
||||
buffer_seconds = buffer_minutes * 60
|
||||
now = int(time.time())
|
||||
|
||||
# Check if expired or within buffer
|
||||
is_expired = (expires_at - buffer_seconds) < now
|
||||
|
||||
if is_expired:
|
||||
remaining = expires_at - now
|
||||
if remaining < 0:
|
||||
logger.debug(f"Token expired {-remaining} seconds ago")
|
||||
else:
|
||||
logger.debug(f"Token expires in {remaining} seconds (within buffer)")
|
||||
|
||||
return is_expired
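As a worked example of the buffer logic above: with `expires_in=3600` and the default 5-minute buffer, a token is treated as expired once fewer than 300 seconds remain. A quick sketch of the same check:

```python
import time

def is_expired(saved_at: int, expires_in: int, buffer_minutes: int = 5) -> bool:
    # Expired if (expiry time - buffer) is already in the past
    return (saved_at + expires_in) - buffer_minutes * 60 < int(time.time())

now = int(time.time())
print(is_expired(saved_at=now - 3000, expires_in=3600))  # False: ~600 s left, outside the buffer
print(is_expired(saved_at=now - 3400, expires_in=3600))  # True: ~200 s left, inside the 5-minute buffer
print(is_expired(saved_at=now - 4000, expires_in=3600))  # True: already expired
```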
|
||||
|
||||
def get_access_token(self) -> Optional[str]:
|
||||
"""
|
||||
Get access token from storage
|
||||
|
||||
Returns:
|
||||
Access token string or None if not found or expired
|
||||
"""
|
||||
token_data = self.load_token()
|
||||
|
||||
if not token_data:
|
||||
return None
|
||||
|
||||
# Check if expired
|
||||
if self.is_token_expired(token_data):
|
||||
logger.debug("Stored token is expired")
|
||||
return None
|
||||
|
||||
# Extract access token from OAuth structure
|
||||
oauth_data = token_data.get('oauth', token_data)
|
||||
access_token = oauth_data.get('access_token')
|
||||
|
||||
if not access_token:
|
||||
logger.error("Token data missing access_token field")
|
||||
return None
|
||||
|
||||
return access_token
|
||||
|
||||
def get_refresh_token(self) -> Optional[str]:
|
||||
"""
|
||||
Get refresh token from storage
|
||||
|
||||
Returns:
|
||||
Refresh token string or None if not found
|
||||
"""
|
||||
token_data = self.load_token()
|
||||
|
||||
if not token_data:
|
||||
return None
|
||||
|
||||
# Extract refresh token from OAuth structure
|
||||
oauth_data = token_data.get('oauth', token_data)
|
||||
refresh_token = oauth_data.get('refresh_token')
|
||||
|
||||
return refresh_token
|
||||
|
||||
def get_token_info(self) -> Dict:
|
||||
"""
|
||||
Get diagnostic information about current token
|
||||
|
||||
Returns:
|
||||
Dict with token status information
|
||||
"""
|
||||
token_data = self.load_token()
|
||||
|
||||
if not token_data:
|
||||
return {
|
||||
'has_token': False,
|
||||
'error': 'No token file found'
|
||||
}
|
||||
|
||||
oauth_data = token_data.get('oauth', token_data)
|
||||
expires_in = oauth_data.get('expires_in')
|
||||
saved_at = token_data.get('_saved_at')
|
||||
|
||||
# Check if refresh token is likely expired (30 days since last auth)
|
||||
# Nexus doesn't provide refresh token expiry, so we estimate conservatively
|
||||
REFRESH_TOKEN_LIFETIME_DAYS = 30
|
||||
now = int(time.time())
|
||||
refresh_token_age_days = (now - saved_at) / 86400 if saved_at else 0
|
||||
refresh_token_likely_expired = refresh_token_age_days > REFRESH_TOKEN_LIFETIME_DAYS
|
||||
|
||||
if expires_in and saved_at:
|
||||
expires_at = saved_at + expires_in
|
||||
remaining_seconds = expires_at - now
|
||||
|
||||
return {
|
||||
'has_token': True,
|
||||
'has_refresh_token': bool(oauth_data.get('refresh_token')),
|
||||
'expires_in_seconds': remaining_seconds,
|
||||
'expires_in_minutes': remaining_seconds / 60,
|
||||
'expires_in_hours': remaining_seconds / 3600,
|
||||
'is_expired': remaining_seconds < 0,
|
||||
'expires_soon_5min': remaining_seconds < 300,
|
||||
'expires_soon_15min': remaining_seconds < 900,
|
||||
'saved_at': saved_at,
|
||||
'expires_at': expires_at,
|
||||
'refresh_token_age_days': refresh_token_age_days,
|
||||
'refresh_token_likely_expired': refresh_token_likely_expired,
|
||||
}
|
||||
else:
|
||||
return {
|
||||
'has_token': True,
|
||||
'has_refresh_token': bool(oauth_data.get('refresh_token')),
|
||||
'refresh_token_age_days': refresh_token_age_days,
|
||||
'refresh_token_likely_expired': refresh_token_likely_expired,
|
||||
'error': 'Token missing expiry information'
|
||||
}
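A short usage sketch of the handler as defined above (the token payload and its values are illustrative, shaped like a typical OAuth response with a nested `oauth` block):

```python
from jackify.backend.handlers.oauth_token_handler import OAuthTokenHandler

handler = OAuthTokenHandler()  # defaults to ~/.config/jackify/nexus-oauth.json

# Hypothetical token payload; real data comes from the Nexus OAuth flow
handler.save_token({
    "oauth": {
        "access_token": "example-access-token",
        "refresh_token": "example-refresh-token",
        "expires_in": 3600,
    }
})

if handler.has_token() and not handler.is_token_expired():
    token = handler.get_access_token()
else:
    token = None  # caller would refresh via the refresh token or re-authorize

print(handler.get_token_info())
```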
|
||||
@@ -12,6 +12,7 @@ import shutil
|
||||
from pathlib import Path
|
||||
from typing import Optional, Union, Dict, Any, List, Tuple
|
||||
from datetime import datetime
|
||||
import vdf
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger(__name__)
|
||||
@@ -258,7 +259,7 @@ class PathHandler:
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def create_dxvk_conf(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full, vanilla_game_dir=None):
|
||||
def create_dxvk_conf(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full, vanilla_game_dir=None, stock_game_path=None):
|
||||
"""
|
||||
Create dxvk.conf file in the appropriate location
|
||||
|
||||
@@ -269,6 +270,7 @@ class PathHandler:
|
||||
basegame_sdcard (bool): Whether the base game is on an SD card
|
||||
game_var_full (str): Full name of the game (e.g., "Skyrim Special Edition")
|
||||
vanilla_game_dir (str): Optional path to vanilla game directory for fallback
|
||||
stock_game_path (str): Direct path to detected stock game directory (if available)
|
||||
|
||||
Returns:
|
||||
bool: True on success, False on failure
|
||||
@@ -276,49 +278,45 @@ class PathHandler:
|
||||
try:
|
||||
logger.info("Creating dxvk.conf file...")
|
||||
|
||||
# Determine the location for dxvk.conf
|
||||
dxvk_conf_path = None
|
||||
candidate_dirs = PathHandler._build_dxvk_candidate_dirs(
|
||||
modlist_dir=modlist_dir,
|
||||
stock_game_path=stock_game_path,
|
||||
steam_library=steam_library,
|
||||
game_var_full=game_var_full,
|
||||
vanilla_game_dir=vanilla_game_dir
|
||||
)
|
||||
|
||||
# Check for common stock game directories first, then vanilla as fallback
|
||||
stock_game_paths = [
|
||||
os.path.join(modlist_dir, "Stock Game"),
|
||||
os.path.join(modlist_dir, "Game Root"),
|
||||
os.path.join(modlist_dir, "STOCK GAME"),
|
||||
os.path.join(modlist_dir, "Stock Game Folder"),
|
||||
os.path.join(modlist_dir, "Stock Folder"),
|
||||
os.path.join(modlist_dir, "Skyrim Stock"),
|
||||
os.path.join(modlist_dir, "root", "Skyrim Special Edition")
|
||||
]
|
||||
if not candidate_dirs:
|
||||
logger.error("Could not determine location for dxvk.conf (no candidate directories found)")
|
||||
return False
|
||||
|
||||
# Add vanilla game directory as fallback if steam_library and game_var_full are provided
|
||||
if steam_library and game_var_full:
|
||||
stock_game_paths.append(os.path.join(steam_library, "steamapps", "common", game_var_full))
|
||||
|
||||
for path in stock_game_paths:
|
||||
if os.path.exists(path):
|
||||
dxvk_conf_path = os.path.join(path, "dxvk.conf")
|
||||
target_dir = None
|
||||
for directory in candidate_dirs:
|
||||
if directory.is_dir():
|
||||
target_dir = directory
|
||||
break
|
||||
|
||||
if not dxvk_conf_path:
|
||||
# Fallback: Try vanilla game directory if provided
|
||||
if vanilla_game_dir and os.path.exists(vanilla_game_dir):
|
||||
logger.info(f"Attempting fallback to vanilla game directory: {vanilla_game_dir}")
|
||||
dxvk_conf_path = os.path.join(vanilla_game_dir, "dxvk.conf")
|
||||
logger.info(f"Using vanilla game directory for dxvk.conf: {dxvk_conf_path}")
|
||||
if target_dir is None:
|
||||
fallback_dir = Path(modlist_dir) if modlist_dir and Path(modlist_dir).is_dir() else None
|
||||
if fallback_dir:
|
||||
logger.warning(f"No stock/vanilla directories found; falling back to modlist directory: {fallback_dir}")
|
||||
target_dir = fallback_dir
|
||||
else:
|
||||
logger.error("Could not determine location for dxvk.conf")
|
||||
logger.error("All candidate directories for dxvk.conf are missing.")
|
||||
return False
|
||||
|
||||
dxvk_conf_path = target_dir / "dxvk.conf"
|
||||
|
||||
# The required line that Jackify needs
|
||||
required_line = "dxvk.enableGraphicsPipelineLibrary = False"
|
||||
|
||||
# Check if dxvk.conf already exists
|
||||
if os.path.exists(dxvk_conf_path):
|
||||
if dxvk_conf_path.exists():
|
||||
logger.info(f"Found existing dxvk.conf at {dxvk_conf_path}")
|
||||
|
||||
# Read existing content
|
||||
try:
|
||||
with open(dxvk_conf_path, 'r') as f:
|
||||
with open(dxvk_conf_path, 'r', encoding='utf-8') as f:
|
||||
existing_content = f.read().strip()
|
||||
|
||||
# Check if our required line is already present
|
||||
@@ -339,7 +337,7 @@ class PathHandler:
|
||||
updated_content = required_line + '\n'
|
||||
logger.info("Adding required DXVK setting to empty file")
|
||||
|
||||
with open(dxvk_conf_path, 'w') as f:
|
||||
with open(dxvk_conf_path, 'w', encoding='utf-8') as f:
|
||||
f.write(updated_content)
|
||||
|
||||
logger.info(f"dxvk.conf updated successfully at {dxvk_conf_path}")
|
||||
@@ -353,7 +351,8 @@ class PathHandler:
|
||||
# Create new dxvk.conf file (original behavior)
|
||||
dxvk_conf_content = required_line + '\n'
|
||||
|
||||
with open(dxvk_conf_path, 'w') as f:
|
||||
dxvk_conf_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
with open(dxvk_conf_path, 'w', encoding='utf-8') as f:
|
||||
f.write(dxvk_conf_content)
|
||||
|
||||
logger.info(f"dxvk.conf created successfully at {dxvk_conf_path}")
|
||||
@@ -362,6 +361,99 @@ class PathHandler:
|
||||
except Exception as e:
|
||||
logger.error(f"Error creating dxvk.conf: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def verify_dxvk_conf_exists(modlist_dir, steam_library, game_var_full, vanilla_game_dir=None, stock_game_path=None) -> bool:
|
||||
"""
|
||||
Verify that dxvk.conf exists in at least one of the candidate directories and contains the required setting.
|
||||
"""
|
||||
required_line = "dxvk.enableGraphicsPipelineLibrary = False"
|
||||
candidate_dirs = PathHandler._build_dxvk_candidate_dirs(
|
||||
modlist_dir=modlist_dir,
|
||||
stock_game_path=stock_game_path,
|
||||
steam_library=steam_library,
|
||||
game_var_full=game_var_full,
|
||||
vanilla_game_dir=vanilla_game_dir
|
||||
)
|
||||
|
||||
for directory in candidate_dirs:
|
||||
conf_path = directory / "dxvk.conf"
|
||||
if conf_path.is_file():
|
||||
try:
|
||||
with open(conf_path, 'r', encoding='utf-8') as f:
|
||||
content = f.read()
|
||||
if required_line not in content:
|
||||
logger.warning(f"dxvk.conf found at {conf_path} but missing required setting. Appending now.")
|
||||
with open(conf_path, 'a', encoding='utf-8') as f:
|
||||
if not content.endswith('\n'):
|
||||
f.write('\n')
|
||||
f.write(required_line + '\n')
|
||||
logger.info(f"Verified dxvk.conf at {conf_path}")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to verify dxvk.conf at {conf_path}: {e}")
|
||||
|
||||
logger.warning("dxvk.conf verification failed - file not found in any candidate directory.")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def _normalize_common_library_path(steam_library: Optional[str]) -> Optional[Path]:
|
||||
if not steam_library:
|
||||
return None
|
||||
path = Path(steam_library)
|
||||
parts_lower = [part.lower() for part in path.parts]
|
||||
if len(parts_lower) >= 2 and parts_lower[-2:] == ['steamapps', 'common']:
|
||||
return path
|
||||
if parts_lower and parts_lower[-1] == 'common':
|
||||
return path
|
||||
if 'steamapps' in parts_lower:
|
||||
idx = parts_lower.index('steamapps')
|
||||
truncated = Path(*path.parts[:idx + 1])
|
||||
return truncated / 'common'
|
||||
return path / 'steamapps' / 'common'
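To make the normalisation above concrete, a few example inputs and their expected results (paths are illustrative; the module path in the import is assumed from the surrounding diff):

```python
from jackify.backend.handlers.path_handler import PathHandler  # module path assumed

for p in [
    "/home/user/.local/share/Steam/steamapps/common",  # already a common dir -> returned unchanged
    "/home/user/.local/share/Steam/steamapps",         # truncated at 'steamapps', 'common' appended
    "/home/user/.local/share/Steam",                    # default: 'steamapps/common' appended
]:
    print(p, "->", PathHandler._normalize_common_library_path(p))
```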
|
||||
|
||||
@staticmethod
|
||||
def _build_dxvk_candidate_dirs(modlist_dir, stock_game_path, steam_library, game_var_full, vanilla_game_dir) -> List[Path]:
|
||||
candidates: List[Path] = []
|
||||
seen = set()
|
||||
|
||||
def add_candidate(path_obj: Optional[Path]):
|
||||
if not path_obj:
|
||||
return
|
||||
key = path_obj.resolve() if path_obj.exists() else path_obj
|
||||
if key in seen:
|
||||
return
|
||||
seen.add(key)
|
||||
candidates.append(path_obj)
|
||||
|
||||
if stock_game_path:
|
||||
add_candidate(Path(stock_game_path))
|
||||
|
||||
if modlist_dir:
|
||||
base_path = Path(modlist_dir)
|
||||
common_names = [
|
||||
"Stock Game",
|
||||
"Game Root",
|
||||
"STOCK GAME",
|
||||
"Stock Game Folder",
|
||||
"Stock Folder",
|
||||
"Skyrim Stock",
|
||||
os.path.join("root", "Skyrim Special Edition")
|
||||
]
|
||||
for name in common_names:
|
||||
add_candidate(base_path / name)
|
||||
|
||||
steam_common = PathHandler._normalize_common_library_path(steam_library)
|
||||
if steam_common and game_var_full:
|
||||
add_candidate(steam_common / game_var_full)
|
||||
|
||||
if vanilla_game_dir:
|
||||
add_candidate(Path(vanilla_game_dir))
|
||||
|
||||
if modlist_dir:
|
||||
add_candidate(Path(modlist_dir))
|
||||
|
||||
return candidates
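Taken together with `create_dxvk_conf` and `verify_dxvk_conf_exists` above, the candidate ordering is: the explicit stock game path, then the common stock-game folder names under the modlist directory, then the Steam library copy of the game, then the vanilla game directory, and finally the modlist directory itself. A small usage sketch (directory values are illustrative):

```python
from jackify.backend.handlers.path_handler import PathHandler  # module path assumed

created = PathHandler.create_dxvk_conf(
    modlist_dir="/home/user/Modlists/ExampleList",
    modlist_sdcard=False,
    steam_library="/home/user/.local/share/Steam/steamapps/common",
    basegame_sdcard=False,
    game_var_full="Skyrim Special Edition",
    vanilla_game_dir="/home/user/.local/share/Steam/steamapps/common/Skyrim Special Edition",
    stock_game_path="/home/user/Modlists/ExampleList/Stock Game",
)
verified = PathHandler.verify_dxvk_conf_exists(
    modlist_dir="/home/user/Modlists/ExampleList",
    steam_library="/home/user/.local/share/Steam/steamapps/common",
    game_var_full="Skyrim Special Edition",
    vanilla_game_dir="/home/user/.local/share/Steam/steamapps/common/Skyrim Special Edition",
    stock_game_path="/home/user/Modlists/ExampleList/Stock Game",
)
if not (created and verified):
    print("Warning: dxvk.conf missing or incomplete (required for AMD GPUs)")
```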
|
||||
|
||||
@staticmethod
|
||||
def find_steam_config_vdf() -> Optional[Path]:
|
||||
@@ -390,7 +482,7 @@ class PathHandler:
|
||||
libraryfolders_vdf_paths = [
|
||||
os.path.expanduser("~/.steam/steam/config/libraryfolders.vdf"),
|
||||
os.path.expanduser("~/.local/share/Steam/config/libraryfolders.vdf"),
|
||||
# Add other potential standard locations if necessary
|
||||
os.path.expanduser("~/.var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf"), # Flatpak
|
||||
]
|
||||
|
||||
# Simple backup mechanism (optional but good practice)
|
||||
@@ -491,40 +583,53 @@ class PathHandler:
|
||||
|
||||
logger.debug(f"Searching for compatdata directory for AppID: {appid}")
|
||||
|
||||
# Use libraryfolders.vdf to find all Steam library paths
|
||||
# Use libraryfolders.vdf to find all Steam library paths, when available
|
||||
library_paths = PathHandler.get_all_steam_library_paths()
|
||||
if not library_paths:
|
||||
logger.error("Could not find any Steam library paths from libraryfolders.vdf")
|
||||
return None
|
||||
if library_paths:
|
||||
logger.debug(f"Checking compatdata in {len(library_paths)} Steam libraries")
|
||||
|
||||
# Check each Steam library's compatdata directory
|
||||
for library_path in library_paths:
|
||||
compatdata_base = library_path / "steamapps" / "compatdata"
|
||||
if not compatdata_base.is_dir():
|
||||
logger.debug(f"Compatdata directory does not exist: {compatdata_base}")
|
||||
continue
|
||||
|
||||
potential_path = compatdata_base / appid
|
||||
if potential_path.is_dir():
|
||||
logger.info(f"Found compatdata directory: {potential_path}")
|
||||
return potential_path
|
||||
else:
|
||||
logger.debug(f"Compatdata for AppID {appid} not found in {compatdata_base}")
|
||||
|
||||
logger.debug(f"Checking compatdata in {len(library_paths)} Steam libraries")
|
||||
# Check fallback locations only if we didn't find valid libraries
|
||||
# If we have valid libraries from libraryfolders.vdf, we should NOT fall back to wrong locations
|
||||
is_flatpak_steam = any('.var/app/com.valvesoftware.Steam' in str(lib) for lib in library_paths) if library_paths else False
|
||||
|
||||
# Check each Steam library's compatdata directory
|
||||
for library_path in library_paths:
|
||||
compatdata_base = library_path / "steamapps" / "compatdata"
|
||||
if not compatdata_base.is_dir():
|
||||
logger.debug(f"Compatdata directory does not exist: {compatdata_base}")
|
||||
continue
|
||||
|
||||
potential_path = compatdata_base / appid
|
||||
if potential_path.is_dir():
|
||||
logger.info(f"Found compatdata directory: {potential_path}")
|
||||
return potential_path
|
||||
if not library_paths or is_flatpak_steam:
|
||||
# Only check Flatpak-specific fallbacks if we have Flatpak Steam
|
||||
logger.debug("Checking fallback compatdata locations...")
|
||||
if is_flatpak_steam:
|
||||
# For Flatpak Steam, only check Flatpak-specific locations
|
||||
fallback_locations = [
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/steamapps/compatdata",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/data/Steam/steamapps/compatdata",
|
||||
]
|
||||
else:
|
||||
logger.debug(f"Compatdata for AppID {appid} not found in {compatdata_base}")
|
||||
|
||||
# Fallback: Broad search (can be slow, consider if needed)
|
||||
# try:
|
||||
# logger.debug(f"Compatdata not found in standard locations, attempting wider search...")
|
||||
# # This can be very slow and resource-intensive
|
||||
# # find_output = subprocess.check_output(['find', '/', '-type', 'd', '-name', appid, '-path', '*/compatdata/*', '-print', '-quit', '2>/dev/null'], text=True).strip()
|
||||
# # if find_output:
|
||||
# # logger.info(f"Found compatdata via find command: {find_output}")
|
||||
# # return Path(find_output)
|
||||
# except Exception as e:
|
||||
# logger.warning(f"Error during 'find' command for compatdata: {e}")
|
||||
|
||||
logger.warning(f"Compatdata directory for AppID {appid} not found.")
|
||||
# For native Steam or unknown, check standard locations
|
||||
fallback_locations = [
|
||||
Path.home() / ".local/share/Steam/steamapps/compatdata",
|
||||
Path.home() / ".steam/steam/steamapps/compatdata",
|
||||
]
|
||||
|
||||
for compatdata_base in fallback_locations:
|
||||
if compatdata_base.is_dir():
|
||||
potential_path = compatdata_base / appid
|
||||
if potential_path.is_dir():
|
||||
logger.warning(f"Found compatdata directory in fallback location (may be from old incorrect creation): {potential_path}")
|
||||
return potential_path
|
||||
|
||||
logger.warning(f"Compatdata directory for AppID {appid} not found in any Steam library or fallback location.")
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
@@ -617,12 +722,22 @@ class PathHandler:
|
||||
if vdf_path.is_file():
|
||||
logger.info(f"[DEBUG] Parsing libraryfolders.vdf: {vdf_path}")
|
||||
try:
|
||||
with open(vdf_path) as f:
|
||||
for line in f:
|
||||
m = re.search(r'"path"\s*"([^"]+)"', line)
|
||||
if m:
|
||||
lib_path = Path(m.group(1))
|
||||
library_paths.add(lib_path)
|
||||
with open(vdf_path, 'r', encoding='utf-8') as f:
|
||||
data = vdf.load(f)
|
||||
# libraryfolders.vdf structure: libraryfolders -> "0", "1", etc. -> "path"
|
||||
libraryfolders = data.get('libraryfolders', {})
|
||||
for key, lib_data in libraryfolders.items():
|
||||
if isinstance(lib_data, dict) and 'path' in lib_data:
|
||||
lib_path = Path(lib_data['path'])
|
||||
# Resolve symlinks for consistency (mmcblk0p1 -> deck/UUID)
|
||||
try:
|
||||
resolved_path = lib_path.resolve()
|
||||
library_paths.add(resolved_path)
|
||||
logger.debug(f"[DEBUG] Found library path: {resolved_path}")
|
||||
except (OSError, RuntimeError) as resolve_err:
|
||||
# If resolve fails, use original path
|
||||
logger.warning(f"[DEBUG] Could not resolve {lib_path}, using as-is: {resolve_err}")
|
||||
library_paths.add(lib_path)
|
||||
except Exception as e:
|
||||
logger.error(f"[DEBUG] Failed to parse {vdf_path}: {e}")
|
||||
logger.info(f"[DEBUG] All detected Steam libraries: {library_paths}")
|
||||
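As a standalone illustration of the same parsing approach (the example file location is a typical default, not guaranteed on every system):

import vdf
from pathlib import Path

vdf_path = Path.home() / ".local/share/Steam/config/libraryfolders.vdf"  # typical, not guaranteed
libraries = set()
if vdf_path.is_file():
    with open(vdf_path, "r", encoding="utf-8") as f:
        data = vdf.load(f)
    # Structure: {"libraryfolders": {"0": {"path": "...", ...}, "1": {...}, ...}}
    for entry in data.get("libraryfolders", {}).values():
        if isinstance(entry, dict) and "path" in entry:
            libraries.add(Path(entry["path"]).resolve())
print(libraries)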
@@ -676,10 +791,10 @@ class PathHandler:
|
||||
|
||||
# For each library path, look for each target game
|
||||
for library_path in library_paths:
|
||||
# Check if the common directory exists
|
||||
common_dir = library_path / "common"
|
||||
# Check if the common directory exists (games are in steamapps/common)
|
||||
common_dir = library_path / "steamapps" / "common"
|
||||
if not common_dir.is_dir():
|
||||
logger.debug(f"No 'common' directory in library: {library_path}")
|
||||
logger.debug(f"No 'steamapps/common' directory in library: {library_path}")
|
||||
continue
|
||||
|
||||
# Get subdirectories in common dir
|
||||
@@ -694,8 +809,8 @@ class PathHandler:
|
||||
if game_name in results:
|
||||
continue # Already found this game
|
||||
|
||||
# Try to find by appmanifest
|
||||
appmanifest_path = library_path / f"appmanifest_{app_id}.acf"
|
||||
# Try to find by appmanifest (manifests are in steamapps subdirectory)
|
||||
appmanifest_path = library_path / "steamapps" / f"appmanifest_{app_id}.acf"
|
||||
if appmanifest_path.is_file():
|
||||
# Find the installdir value
|
||||
try:
|
||||
@@ -777,17 +892,36 @@ class PathHandler:
|
||||
|
||||
# Extract existing gamePath to use as source of truth for vanilla game location
|
||||
existing_game_path = None
|
||||
for line in lines:
|
||||
gamepath_line_index = -1
|
||||
for i, line in enumerate(lines):
|
||||
if re.match(r'^\s*gamepath\s*=.*@ByteArray\(([^)]+)\)', line, re.IGNORECASE):
|
||||
match = re.search(r'@ByteArray\(([^)]+)\)', line)
|
||||
if match:
|
||||
raw_path = match.group(1)
|
||||
gamepath_line_index = i
|
||||
# Convert Windows path back to Linux path
|
||||
if raw_path.startswith(('Z:', 'D:')):
|
||||
linux_path = raw_path[2:].replace('\\\\', '/').replace('\\', '/')
|
||||
existing_game_path = linux_path
|
||||
logger.debug(f"Extracted existing gamePath: {existing_game_path}")
|
||||
break
|
||||
|
||||
# Special handling for gamePath in three-true scenario (engine_installed + steamdeck + sdcard)
|
||||
if modlist_sdcard and existing_game_path and existing_game_path.startswith('/run/media') and gamepath_line_index != -1:
|
||||
# Simple manual stripping of /run/media/deck/UUID pattern for SD card paths
|
||||
# Match /run/media/deck/[UUID]/Games/... and extract just /Games/...
|
||||
sdcard_pattern = r'^/run/media/deck/[^/]+(/Games/.*)$'
|
||||
match = re.match(sdcard_pattern, existing_game_path)
|
||||
if match:
|
||||
stripped_path = match.group(1) # Just the /Games/... part
|
||||
windows_path = stripped_path.replace('/', '\\\\')
|
||||
new_gamepath_value = f"D:\\\\{windows_path}"
|
||||
new_gamepath_line = f"gamePath = @ByteArray({new_gamepath_value})\n"
|
||||
|
||||
logger.info(f"Updating gamePath for SD card: {lines[gamepath_line_index].strip()} -> {new_gamepath_line.strip()}")
|
||||
lines[gamepath_line_index] = new_gamepath_line
|
||||
else:
|
||||
logger.warning(f"SD card path doesn't match expected pattern: {existing_game_path}")
|
||||
|
||||
game_path_updated = False
|
||||
binary_paths_updated = 0
|
||||
@@ -852,10 +986,9 @@ class PathHandler:
|
||||
else:
|
||||
found_stock = None
|
||||
for folder in STOCK_GAME_FOLDERS:
|
||||
folder_pattern = f"/{folder.replace(' ', '')}".lower()
|
||||
value_part_lower = value_part.replace(' ', '').lower()
|
||||
if folder_pattern in value_part_lower:
|
||||
idx = value_part_lower.index(folder_pattern)
|
||||
folder_pattern = f"/{folder}"
|
||||
if folder_pattern in value_part:
|
||||
idx = value_part.index(folder_pattern)
|
||||
rel_path = value_part[idx:].lstrip('/')
|
||||
found_stock = folder
|
||||
break
|
||||
|
||||
1216	jackify/backend/handlers/progress_parser.py	Normal file
File diff suppressed because it is too large
51	jackify/backend/handlers/progress_parser_example.py	Normal file

@@ -0,0 +1,51 @@
|
||||
"""
|
||||
Example usage of ProgressParser
|
||||
|
||||
This file demonstrates how to use the progress parser to extract
|
||||
structured information from jackify-engine output.
|
||||
|
||||
R&D NOTE: This is experimental code for investigation purposes.
|
||||
"""
|
||||
|
||||
from jackify.backend.handlers.progress_parser import ProgressStateManager
|
||||
|
||||
|
||||
def example_usage():
|
||||
"""Example of how to use the progress parser."""
|
||||
|
||||
# Create state manager
|
||||
state_manager = ProgressStateManager()
|
||||
|
||||
# Simulate processing lines from jackify-engine output
|
||||
sample_lines = [
|
||||
"[00:00:00] === Installing files ===",
|
||||
"[00:00:05] [12/14] Installing files (1.1GB/56.3GB)",
|
||||
"[00:00:10] Installing: Enderal Remastered Armory.7z (42%)",
|
||||
"[00:00:15] Extracting: Mandragora Sprouts.7z (96%)",
|
||||
"[00:00:20] Downloading at 45.2MB/s",
|
||||
"[00:00:25] Extracting at 267.3MB/s",
|
||||
"[00:00:30] Progress: 85%",
|
||||
]
|
||||
|
||||
print("Processing sample output lines...\n")
|
||||
|
||||
for line in sample_lines:
|
||||
updated = state_manager.process_line(line)
|
||||
if updated:
|
||||
state = state_manager.get_state()
|
||||
print(f"Line: {line}")
|
||||
print(f" Phase: {state.phase.value} - {state.phase_name}")
|
||||
print(f" Progress: {state.overall_percent:.1f}%")
|
||||
print(f" Step: {state.phase_progress_text}")
|
||||
print(f" Data: {state.data_progress_text}")
|
||||
print(f" Active Files: {len(state.active_files)}")
|
||||
for file_prog in state.active_files:
|
||||
print(f" - {file_prog.filename}: {file_prog.percent:.1f}%")
|
||||
print(f" Speeds: {state.speeds}")
|
||||
print(f" Display: {state.display_text}")
|
||||
print()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
example_usage()
|
||||
|
||||
@@ -34,36 +34,134 @@ class ProtontricksHandler:
|
||||
self.steamdeck = steamdeck # Store steamdeck status
|
||||
self._native_steam_service = None
|
||||
self.use_native_operations = True # Enable native Steam operations by default
|
||||
|
||||
def _get_steam_dir_from_libraryfolders(self) -> Optional[Path]:
|
||||
"""
|
||||
Determine the Steam installation directory from libraryfolders.vdf location.
|
||||
This is the source of truth - we read libraryfolders.vdf to find where Steam is actually installed.
|
||||
|
||||
Returns:
|
||||
Path to Steam installation directory (the one with config/, steamapps/, etc.) or None
|
||||
"""
|
||||
from ..handlers.path_handler import PathHandler
|
||||
|
||||
# Check all possible libraryfolders.vdf locations
|
||||
vdf_paths = [
|
||||
Path.home() / ".steam/steam/config/libraryfolders.vdf",
|
||||
Path.home() / ".local/share/Steam/config/libraryfolders.vdf",
|
||||
Path.home() / ".steam/root/config/libraryfolders.vdf",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf", # Flatpak
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/data/Steam/config/libraryfolders.vdf", # Flatpak alternative
|
||||
]
|
||||
|
||||
for vdf_path in vdf_paths:
|
||||
if vdf_path.is_file():
|
||||
# The Steam installation directory is the parent of the config directory
|
||||
steam_dir = vdf_path.parent.parent
|
||||
# Verify it has steamapps directory (required by protontricks)
|
||||
if (steam_dir / "steamapps").exists():
|
||||
logger.debug(f"Determined STEAM_DIR from libraryfolders.vdf: {steam_dir}")
|
||||
return steam_dir
|
||||
|
||||
# Fallback: try to get from library paths
|
||||
library_paths = PathHandler.get_all_steam_library_paths()
|
||||
if library_paths:
|
||||
# For Flatpak Steam, library path is .local/share/Steam, but Steam installation might be data/Steam
|
||||
first_lib = library_paths[0]
|
||||
if '.var/app/com.valvesoftware.Steam' in str(first_lib):
|
||||
# Check if data/Steam exists (main Flatpak Steam installation)
|
||||
data_steam = Path.home() / ".var/app/com.valvesoftware.Steam/data/Steam"
|
||||
if (data_steam / "steamapps").exists():
|
||||
logger.debug(f"Determined STEAM_DIR from Flatpak data path: {data_steam}")
|
||||
return data_steam
|
||||
# Otherwise use the library path itself
|
||||
if (first_lib / "steamapps").exists():
|
||||
logger.debug(f"Determined STEAM_DIR from Flatpak library path: {first_lib}")
|
||||
return first_lib
|
||||
else:
|
||||
# Native Steam - library path should be the Steam installation
|
||||
if (first_lib / "steamapps").exists():
|
||||
logger.debug(f"Determined STEAM_DIR from native library path: {first_lib}")
|
||||
return first_lib
|
||||
|
||||
logger.warning("Could not determine STEAM_DIR from libraryfolders.vdf")
|
||||
return None
|
||||
|
||||
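A minimal sketch of how the resolved directory is used downstream - exporting it as STEAM_DIR so protontricks never has to ask which Steam install to use (constructor arguments and the import path are assumptions):

import os
import subprocess
from jackify.backend.handlers.protontricks_handler import ProtontricksHandler  # path assumed

handler = ProtontricksHandler(steamdeck=False)
steam_dir = handler._get_steam_dir_from_libraryfolders()

env = os.environ.copy()
if steam_dir:
    env["STEAM_DIR"] = str(steam_dir)  # mirrors what run_protontricks() does internally
subprocess.run(["protontricks", "-l"], env=env, check=False)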
def _get_bundled_winetricks_path(self) -> Optional[Path]:
|
||||
"""
|
||||
Get the path to the bundled winetricks script following AppImage best practices.
|
||||
Same logic as WinetricksHandler._get_bundled_winetricks_path()
|
||||
"""
|
||||
possible_paths = []
|
||||
|
||||
# AppImage environment - use APPDIR (standard AppImage best practice)
|
||||
if os.environ.get('APPDIR'):
|
||||
appdir_path = Path(os.environ['APPDIR']) / 'opt' / 'jackify' / 'tools' / 'winetricks'
|
||||
possible_paths.append(appdir_path)
|
||||
|
||||
# Development environment - relative to module location
|
||||
module_dir = Path(__file__).parent.parent.parent # Go from handlers/ up to jackify/
|
||||
dev_path = module_dir / 'tools' / 'winetricks'
|
||||
possible_paths.append(dev_path)
|
||||
|
||||
# Try each path until we find one that works
|
||||
for path in possible_paths:
|
||||
if path.exists() and os.access(path, os.X_OK):
|
||||
logger.debug(f"Found bundled winetricks at: {path}")
|
||||
return path
|
||||
|
||||
logger.warning(f"Bundled winetricks not found. Tried paths: {possible_paths}")
|
||||
return None
|
||||
|
||||
def _get_bundled_cabextract_path(self) -> Optional[Path]:
|
||||
"""
|
||||
Get the path to the bundled cabextract binary following AppImage best practices.
|
||||
Same logic as WinetricksHandler._get_bundled_cabextract()
|
||||
"""
|
||||
possible_paths = []
|
||||
|
||||
# AppImage environment - use APPDIR (standard AppImage best practice)
|
||||
if os.environ.get('APPDIR'):
|
||||
appdir_path = Path(os.environ['APPDIR']) / 'opt' / 'jackify' / 'tools' / 'cabextract'
|
||||
possible_paths.append(appdir_path)
|
||||
|
||||
# Development environment - relative to module location
|
||||
module_dir = Path(__file__).parent.parent.parent # Go from handlers/ up to jackify/
|
||||
dev_path = module_dir / 'tools' / 'cabextract'
|
||||
possible_paths.append(dev_path)
|
||||
|
||||
# Try each path until we find one that works
|
||||
for path in possible_paths:
|
||||
if path.exists() and os.access(path, os.X_OK):
|
||||
logger.debug(f"Found bundled cabextract at: {path}")
|
||||
return path
|
||||
|
||||
logger.warning(f"Bundled cabextract not found. Tried paths: {possible_paths}")
|
||||
return None
|
||||
|
||||
def _get_clean_subprocess_env(self):
|
||||
"""
|
||||
Create a clean environment for subprocess calls by removing PyInstaller-specific
|
||||
Create a clean environment for subprocess calls by removing bundle-specific
|
||||
environment variables that can interfere with external program execution.
|
||||
|
||||
Uses the centralized get_clean_subprocess_env() to ensure AppImage variables
|
||||
are removed to prevent subprocess spawning issues.
|
||||
|
||||
Returns:
|
||||
dict: Cleaned environment dictionary
|
||||
"""
|
||||
env = os.environ.copy()
|
||||
|
||||
# Remove PyInstaller-specific environment variables
|
||||
env.pop('_MEIPASS', None)
|
||||
env.pop('_MEIPASS2', None)
|
||||
|
||||
# Clean library path variables that PyInstaller modifies (Linux/Unix)
|
||||
# Use centralized function that removes AppImage variables
|
||||
from .subprocess_utils import get_clean_subprocess_env
|
||||
env = get_clean_subprocess_env()
|
||||
|
||||
# Clean library path variables that frozen bundles modify (Linux/Unix)
|
||||
if 'LD_LIBRARY_PATH_ORIG' in env:
|
||||
# Restore original LD_LIBRARY_PATH if it was backed up by PyInstaller
|
||||
# Restore original LD_LIBRARY_PATH if it was backed up by the bundler
|
||||
env['LD_LIBRARY_PATH'] = env['LD_LIBRARY_PATH_ORIG']
|
||||
else:
|
||||
# Remove PyInstaller-modified LD_LIBRARY_PATH
|
||||
# Remove bundle-modified LD_LIBRARY_PATH
|
||||
env.pop('LD_LIBRARY_PATH', None)
|
||||
|
||||
# Clean PATH of PyInstaller-specific entries
|
||||
if 'PATH' in env and hasattr(sys, '_MEIPASS'):
|
||||
path_entries = env['PATH'].split(os.pathsep)
|
||||
# Remove any PATH entries that point to PyInstaller temp directory
|
||||
cleaned_path = [p for p in path_entries if not p.startswith(sys._MEIPASS)]
|
||||
env['PATH'] = os.pathsep.join(cleaned_path)
|
||||
|
||||
# Clean macOS library path (if present)
|
||||
if 'DYLD_LIBRARY_PATH' in env and hasattr(sys, '_MEIPASS'):
|
||||
dyld_entries = env['DYLD_LIBRARY_PATH'].split(os.pathsep)
|
||||
@@ -84,17 +182,17 @@ class ProtontricksHandler:
|
||||
|
||||
def detect_protontricks(self):
|
||||
"""
|
||||
Detect if protontricks is installed and whether it's flatpak or native.
|
||||
If not found, prompts the user to install the Flatpak version.
|
||||
Detect if protontricks is installed (silent detection for GUI/automated use).
|
||||
|
||||
Returns True if protontricks is found or successfully installed, False otherwise
|
||||
Returns True if protontricks is found, False otherwise.
|
||||
Does NOT prompt user or attempt installation - that's handled by the GUI.
|
||||
"""
|
||||
logger.debug("Detecting if protontricks is installed...")
|
||||
|
||||
|
||||
# Check if protontricks exists as a command
|
||||
protontricks_path_which = shutil.which("protontricks")
|
||||
self.flatpak_path = shutil.which("flatpak") # Store for later use
|
||||
|
||||
self.flatpak_path = shutil.which("flatpak")
|
||||
|
||||
if protontricks_path_which:
|
||||
# Check if it's a flatpak wrapper
|
||||
try:
|
||||
@@ -103,7 +201,6 @@ class ProtontricksHandler:
|
||||
if "flatpak run" in content:
|
||||
logger.debug(f"Detected Protontricks is a Flatpak wrapper at {protontricks_path_which}")
|
||||
self.which_protontricks = 'flatpak'
|
||||
# Continue to check flatpak list just to be sure
|
||||
else:
|
||||
logger.info(f"Native Protontricks found at {protontricks_path_which}")
|
||||
self.which_protontricks = 'native'
|
||||
@@ -111,85 +208,27 @@ class ProtontricksHandler:
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Error reading protontricks executable: {e}")
|
||||
|
||||
# Check if flatpak protontricks is installed (or if wrapper check indicated flatpak)
|
||||
flatpak_installed = False
|
||||
|
||||
# Check if flatpak protontricks is installed
|
||||
try:
|
||||
# PyInstaller fix: Comprehensive environment cleaning for subprocess calls
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
result = subprocess.run(
|
||||
["flatpak", "list"],
|
||||
capture_output=True,
|
||||
["flatpak", "list"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True,
|
||||
env=env # Use comprehensively cleaned environment
|
||||
env=env
|
||||
)
|
||||
if "com.github.Matoking.protontricks" in result.stdout:
|
||||
if result.returncode == 0 and "com.github.Matoking.protontricks" in result.stdout:
|
||||
logger.info("Flatpak Protontricks is installed")
|
||||
self.which_protontricks = 'flatpak'
|
||||
flatpak_installed = True
|
||||
return True
|
||||
except FileNotFoundError:
|
||||
logger.warning("'flatpak' command not found. Cannot check for Flatpak Protontricks.")
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Error checking flatpak list: {e}")
|
||||
logger.warning("'flatpak' command not found. Cannot check for Flatpak Protontricks.")
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error checking flatpak: {e}")
|
||||
|
||||
# If neither native nor flatpak found, prompt for installation
|
||||
if not self.which_protontricks:
|
||||
logger.warning("Protontricks not found (native or flatpak).")
|
||||
|
||||
should_install = False
|
||||
if self.steamdeck:
|
||||
logger.info("Running on Steam Deck, attempting automatic Flatpak installation.")
|
||||
# Maybe add a brief pause or message?
|
||||
print("Protontricks not found. Attempting automatic installation via Flatpak...")
|
||||
should_install = True
|
||||
else:
|
||||
try:
|
||||
response = input("Protontricks not found. Install the Flatpak version? (Y/n): ").lower()
|
||||
if response == 'y' or response == '':
|
||||
should_install = True
|
||||
except KeyboardInterrupt:
|
||||
print("\nInstallation cancelled.")
|
||||
return False
|
||||
|
||||
if should_install:
|
||||
try:
|
||||
logger.info("Attempting to install Flatpak Protontricks...")
|
||||
# Use --noninteractive for automatic install where applicable
|
||||
install_cmd = ["flatpak", "install", "-u", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]
|
||||
|
||||
# PyInstaller fix: Comprehensive environment cleaning for subprocess calls
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
# Run with output visible to user
|
||||
process = subprocess.run(install_cmd, check=True, text=True, env=env)
|
||||
logger.info("Flatpak Protontricks installation successful.")
|
||||
print("Flatpak Protontricks installed successfully.")
|
||||
self.which_protontricks = 'flatpak'
|
||||
return True
|
||||
except FileNotFoundError:
|
||||
logger.error("'flatpak' command not found. Cannot install.")
|
||||
print("Error: 'flatpak' command not found. Please install Flatpak first.")
|
||||
return False
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.error(f"Flatpak installation failed: {e}")
|
||||
print(f"Error: Flatpak installation failed (Command: {' '.join(e.cmd)}). Please try installing manually.")
|
||||
return False
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error during Flatpak installation: {e}")
|
||||
print("An unexpected error occurred during installation.")
|
||||
return False
|
||||
else:
|
||||
logger.error("User chose not to install Protontricks or installation skipped.")
|
||||
print("Protontricks installation skipped. Cannot continue without Protontricks.")
|
||||
return False
|
||||
|
||||
# Should not reach here if logic is correct, but acts as a fallback
|
||||
logger.error("Protontricks detection failed unexpectedly.")
|
||||
# Not found
|
||||
logger.warning("Protontricks not found (native or flatpak).")
|
||||
return False
|
||||
|
||||
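To make the new contract concrete, a hypothetical caller now looks roughly like this - detection only reports, and any install prompt lives in the GUI layer (names below are illustrative):

handler = ProtontricksHandler(steamdeck=False)   # constructor args assumed
if handler.detect_protontricks():
    print(f"Found protontricks ({handler.which_protontricks})")  # 'native' or 'flatpak'
else:
    # No prompting here any more - the GUI decides whether to offer a Flatpak install.
    offer_flatpak_install_dialog()               # hypothetical GUI hook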
def check_protontricks_version(self):
|
||||
@@ -239,13 +278,30 @@ class ProtontricksHandler:
|
||||
logger.error("Could not detect protontricks installation")
|
||||
return None
|
||||
|
||||
if self.which_protontricks == 'flatpak':
|
||||
# Build command based on detected protontricks type
|
||||
if self.which_protontricks == 'bundled':
|
||||
# CRITICAL: Use safe Python executable to prevent AppImage recursive spawning
|
||||
from .subprocess_utils import get_safe_python_executable
|
||||
python_exe = get_safe_python_executable()
|
||||
|
||||
# Use bundled wrapper script for reliable invocation
|
||||
# The wrapper script imports cli and calls it with sys.argv
|
||||
wrapper_script = self._get_bundled_protontricks_wrapper_path()
|
||||
if wrapper_script and Path(wrapper_script).exists():
|
||||
cmd = [python_exe, str(wrapper_script)]
|
||||
cmd.extend([str(a) for a in args])
|
||||
else:
|
||||
# Fallback: use python -m to run protontricks CLI directly
|
||||
# This avoids importing protontricks.__init__ which imports gui.py which needs Pillow
|
||||
cmd = [python_exe, "-m", "protontricks.cli.main"]
|
||||
cmd.extend([str(a) for a in args])
|
||||
elif self.which_protontricks == 'flatpak':
|
||||
cmd = ["flatpak", "run", "com.github.Matoking.protontricks"]
|
||||
else:
|
||||
cmd.extend(args)
|
||||
else: # native
|
||||
cmd = ["protontricks"]
|
||||
|
||||
cmd.extend(args)
|
||||
|
||||
cmd.extend(args)
|
||||
|
||||
# Default to capturing stdout/stderr unless specified otherwise in kwargs
|
||||
run_kwargs = {
|
||||
'stdout': subprocess.PIPE,
|
||||
@@ -253,18 +309,73 @@ class ProtontricksHandler:
|
||||
'text': True,
|
||||
**kwargs # Allow overriding defaults (like stderr=DEVNULL)
|
||||
}
|
||||
# PyInstaller fix: Use cleaned environment for all protontricks calls
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
# Handle environment: if env was passed in kwargs, merge it with our clean env
|
||||
# Otherwise create a clean env from scratch
|
||||
if 'env' in kwargs and kwargs['env']:
|
||||
# Merge passed env with our clean env (our values take precedence)
|
||||
env = self._get_clean_subprocess_env()
|
||||
env.update(kwargs['env']) # Merge passed env, but our clean env is base
|
||||
# Re-apply our critical settings after merge to ensure they're set
|
||||
else:
|
||||
# Bundled-runtime fix: Use cleaned environment for all protontricks calls
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
# Suppress Wine debug output
|
||||
env['WINEDEBUG'] = '-all'
|
||||
|
||||
# CRITICAL: Set STEAM_DIR based on libraryfolders.vdf to prevent user prompts
|
||||
steam_dir = self._get_steam_dir_from_libraryfolders()
|
||||
if steam_dir:
|
||||
env['STEAM_DIR'] = str(steam_dir)
|
||||
logger.debug(f"Set STEAM_DIR for protontricks: {steam_dir}")
|
||||
else:
|
||||
logger.warning("Could not determine STEAM_DIR from libraryfolders.vdf - protontricks may prompt user")
|
||||
|
||||
# CRITICAL: Only set bundled winetricks for NATIVE protontricks
|
||||
# Flatpak protontricks runs in a sandbox and CANNOT access AppImage FUSE mounts (/tmp/.mount_*)
|
||||
# Flatpak protontricks has its own winetricks bundled inside the flatpak
|
||||
if self.which_protontricks == 'native':
|
||||
winetricks_path = self._get_bundled_winetricks_path()
|
||||
if winetricks_path:
|
||||
env['WINETRICKS'] = str(winetricks_path)
|
||||
logger.debug(f"Set WINETRICKS for native protontricks: {winetricks_path}")
|
||||
else:
|
||||
logger.warning("Bundled winetricks not found - native protontricks will use system winetricks")
|
||||
|
||||
cabextract_path = self._get_bundled_cabextract_path()
|
||||
if cabextract_path:
|
||||
cabextract_dir = str(cabextract_path.parent)
|
||||
current_path = env.get('PATH', '')
|
||||
env['PATH'] = f"{cabextract_dir}{os.pathsep}{current_path}" if current_path else cabextract_dir
|
||||
logger.debug(f"Added bundled cabextract to PATH for native protontricks: {cabextract_dir}")
|
||||
else:
|
||||
logger.warning("Bundled cabextract not found - native protontricks will use system cabextract")
|
||||
else:
|
||||
# Flatpak protontricks - DO NOT set bundled paths
|
||||
logger.debug(f"Using {self.which_protontricks} protontricks - it has its own winetricks (cannot access AppImage mounts)")
|
||||
|
||||
# CRITICAL: Suppress winetricks verbose output when not in debug mode
|
||||
# WINETRICKS_SUPER_QUIET suppresses "Executing..." messages from winetricks
|
||||
from ..handlers.config_handler import ConfigHandler
|
||||
config_handler = ConfigHandler()
|
||||
debug_mode = config_handler.get('debug_mode', False)
|
||||
if not debug_mode:
|
||||
env['WINETRICKS_SUPER_QUIET'] = '1'
|
||||
logger.debug("Set WINETRICKS_SUPER_QUIET=1 to suppress winetricks verbose output")
|
||||
else:
|
||||
logger.debug("Debug mode enabled - winetricks verbose output will be shown")
|
||||
|
||||
# Note: No need to modify LD_LIBRARY_PATH for Wine/Proton as it's a system dependency
|
||||
# Wine/Proton finds its own libraries through the system's library search paths
|
||||
|
||||
run_kwargs['env'] = env
|
||||
try:
|
||||
return subprocess.run(cmd, **run_kwargs)
|
||||
except Exception as e:
|
||||
logger.error(f"Error running protontricks: {e}")
|
||||
# Consider returning a mock CompletedProcess with an error code?
|
||||
return None
|
||||
|
||||
|
||||
def set_protontricks_permissions(self, modlist_dir, steamdeck=False):
|
||||
"""
|
||||
Set permissions for Steam operations to access the modlist directory.
|
||||
@@ -286,7 +397,7 @@ class ProtontricksHandler:
|
||||
|
||||
logger.info("Setting Protontricks permissions...")
|
||||
try:
|
||||
# PyInstaller fix: Use cleaned environment
|
||||
# Bundled-runtime fix: Use cleaned environment
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
subprocess.run(["flatpak", "override", "--user", "com.github.Matoking.protontricks",
|
||||
@@ -396,7 +507,7 @@ class ProtontricksHandler:
|
||||
logger.error("Protontricks path not determined, cannot list shortcuts.")
|
||||
return {}
|
||||
self.logger.debug(f"Running command: {' '.join(cmd)}")
|
||||
# PyInstaller fix: Use cleaned environment
|
||||
# Bundled-runtime fix: Use cleaned environment
|
||||
env = self._get_clean_subprocess_env()
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, check=True, encoding='utf-8', errors='ignore', env=env)
|
||||
# Regex to capture name and AppID
|
||||
@@ -548,7 +659,7 @@ class ProtontricksHandler:
|
||||
Returns True on success, False on failure
|
||||
"""
|
||||
try:
|
||||
# PyInstaller fix: Use cleaned environment
|
||||
# Bundled-runtime fix: Use cleaned environment
|
||||
env = self._get_clean_subprocess_env()
|
||||
env["WINEDEBUG"] = "-all"
|
||||
|
||||
@@ -634,16 +745,22 @@ class ProtontricksHandler:
|
||||
|
||||
def run_protontricks_launch(self, appid, installer_path, *extra_args):
|
||||
"""
|
||||
Run protontricks-launch (for WebView or similar installers) using the correct method for flatpak or native.
|
||||
Run protontricks-launch (for WebView or similar installers) using the correct method for bundled, flatpak, or native.
|
||||
Returns subprocess.CompletedProcess object.
|
||||
"""
|
||||
if self.which_protontricks is None:
|
||||
if not self.detect_protontricks():
|
||||
self.logger.error("Could not detect protontricks installation")
|
||||
return None
|
||||
if self.which_protontricks == 'flatpak':
|
||||
if self.which_protontricks == 'bundled':
|
||||
# CRITICAL: Use safe Python executable to prevent AppImage recursive spawning
|
||||
from .subprocess_utils import get_safe_python_executable
|
||||
python_exe = get_safe_python_executable()
|
||||
# Use bundled Python module
|
||||
cmd = [python_exe, "-m", "protontricks.cli.launch", "--appid", appid, str(installer_path)]
|
||||
elif self.which_protontricks == 'flatpak':
|
||||
cmd = ["flatpak", "run", "--command=protontricks-launch", "com.github.Matoking.protontricks", "--appid", appid, str(installer_path)]
|
||||
else:
|
||||
else: # native
|
||||
launch_path = shutil.which("protontricks-launch")
|
||||
if not launch_path:
|
||||
self.logger.error("protontricks-launch command not found in PATH.")
|
||||
@@ -653,13 +770,63 @@ class ProtontricksHandler:
|
||||
cmd.extend(extra_args)
|
||||
self.logger.debug(f"Running protontricks-launch: {' '.join(map(str, cmd))}")
|
||||
try:
|
||||
# PyInstaller fix: Use cleaned environment
|
||||
# Bundled-runtime fix: Use cleaned environment
|
||||
env = self._get_clean_subprocess_env()
|
||||
return subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True, env=env)
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error running protontricks-launch: {e}")
|
||||
return None
|
||||
|
||||
def _ensure_flatpak_cache_access(self, cache_path: Path) -> bool:
|
||||
"""
|
||||
Ensure flatpak protontricks has filesystem access to the winetricks cache.
|
||||
|
||||
Args:
|
||||
cache_path: Path to winetricks cache directory
|
||||
|
||||
Returns:
|
||||
True if access granted or already exists, False on failure
|
||||
"""
|
||||
if self.which_protontricks != 'flatpak':
|
||||
return True # Not flatpak, no action needed
|
||||
|
||||
try:
|
||||
# Check if flatpak already has access to this path
|
||||
result = subprocess.run(
|
||||
['flatpak', 'override', '--user', '--show', 'com.github.Matoking.protontricks'],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=10
|
||||
)
|
||||
|
||||
if result.returncode == 0:
|
||||
# Check if cache path is already in filesystem overrides
|
||||
cache_str = str(cache_path.resolve())
|
||||
if 'filesystems=' in result.stdout and cache_str in result.stdout:
|
||||
self.logger.debug(f"Flatpak protontricks already has access to cache: {cache_str}")
|
||||
return True
|
||||
|
||||
# Grant access to cache directory
|
||||
self.logger.info(f"Granting flatpak protontricks access to winetricks cache: {cache_path}")
|
||||
result = subprocess.run(
|
||||
['flatpak', 'override', '--user', 'com.github.Matoking.protontricks',
|
||||
f'--filesystem={cache_path.resolve()}'],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=10
|
||||
)
|
||||
|
||||
if result.returncode == 0:
|
||||
self.logger.info("Successfully granted flatpak protontricks cache access")
|
||||
return True
|
||||
else:
|
||||
self.logger.warning(f"Failed to grant flatpak cache access: {result.stderr}")
|
||||
return False
|
||||
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Could not configure flatpak cache access: {e}")
|
||||
return False
|
||||
|
||||
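The effect of this helper is equivalent to the following one-off override, shown here as a minimal sketch with an assumed cache location:

import subprocess
from pathlib import Path

cache_dir = Path.home() / "Jackify" / "winetricks_cache"  # assumed default Jackify data dir
subprocess.run(
    ["flatpak", "override", "--user", "com.github.Matoking.protontricks",
     f"--filesystem={cache_dir.resolve()}"],
    check=False,
)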
def install_wine_components(self, appid, game_var, specific_components: Optional[List[str]] = None):
|
||||
"""
|
||||
Install the specified Wine components into the given prefix using protontricks.
|
||||
@@ -667,6 +834,48 @@ class ProtontricksHandler:
|
||||
"""
|
||||
env = self._get_clean_subprocess_env()
|
||||
env["WINEDEBUG"] = "-all"
|
||||
|
||||
# CRITICAL: Only set bundled winetricks for NATIVE protontricks
|
||||
# Flatpak protontricks runs in a sandbox and CANNOT access AppImage FUSE mounts (/tmp/.mount_*)
|
||||
# Flatpak protontricks has its own winetricks bundled inside the flatpak
|
||||
if self.which_protontricks == 'native':
|
||||
winetricks_path = self._get_bundled_winetricks_path()
|
||||
if winetricks_path:
|
||||
env['WINETRICKS'] = str(winetricks_path)
|
||||
self.logger.debug(f"Set WINETRICKS for native protontricks: {winetricks_path}")
|
||||
else:
|
||||
self.logger.warning("Bundled winetricks not found - native protontricks will use system winetricks")
|
||||
|
||||
cabextract_path = self._get_bundled_cabextract_path()
|
||||
if cabextract_path:
|
||||
cabextract_dir = str(cabextract_path.parent)
|
||||
current_path = env.get('PATH', '')
|
||||
env['PATH'] = f"{cabextract_dir}{os.pathsep}{current_path}" if current_path else cabextract_dir
|
||||
self.logger.debug(f"Added bundled cabextract to PATH for native protontricks: {cabextract_dir}")
|
||||
else:
|
||||
self.logger.warning("Bundled cabextract not found - native protontricks will use system cabextract")
|
||||
else:
|
||||
# Flatpak protontricks - DO NOT set bundled paths
|
||||
self.logger.info(f"Using {self.which_protontricks} protontricks - it has its own winetricks (cannot access AppImage mounts)")
|
||||
|
||||
# CRITICAL: Suppress winetricks verbose output when not in debug mode
|
||||
from ..handlers.config_handler import ConfigHandler
|
||||
config_handler = ConfigHandler()
|
||||
debug_mode = config_handler.get('debug_mode', False)
|
||||
if not debug_mode:
|
||||
env['WINETRICKS_SUPER_QUIET'] = '1'
|
||||
self.logger.debug("Set WINETRICKS_SUPER_QUIET=1 in install_wine_components to suppress winetricks verbose output")
|
||||
|
||||
# Set up winetricks cache (shared with winetricks_handler for efficiency)
|
||||
from jackify.shared.paths import get_jackify_data_dir
|
||||
jackify_cache_dir = get_jackify_data_dir() / 'winetricks_cache'
|
||||
jackify_cache_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Ensure flatpak protontricks has access to cache (no-op for native)
|
||||
self._ensure_flatpak_cache_access(jackify_cache_dir)
|
||||
|
||||
env['WINETRICKS_CACHE'] = str(jackify_cache_dir)
|
||||
self.logger.info(f"Using winetricks cache: {jackify_cache_dir}")
|
||||
if specific_components is not None:
|
||||
components_to_install = specific_components
|
||||
self.logger.info(f"Installing specific components: {components_to_install}")
|
||||
@@ -687,25 +896,108 @@ class ProtontricksHandler:
|
||||
result = self.run_protontricks("--no-bwrap", appid, "-q", *components_to_install, env=env, timeout=600)
|
||||
self.logger.debug(f"Protontricks output: {result.stdout if result else ''}")
|
||||
if result and result.returncode == 0:
|
||||
self.logger.info("Wine Component installation command completed successfully.")
|
||||
return True
|
||||
self.logger.info("Wine Component installation command completed.")
|
||||
|
||||
# Verify components were actually installed
|
||||
if self._verify_components_installed(appid, components_to_install):
|
||||
self.logger.info("Component verification successful - all components installed correctly.")
|
||||
return True
|
||||
else:
|
||||
self.logger.error(f"Component verification failed (Attempt {attempt}/{max_attempts})")
|
||||
# Continue to retry
|
||||
else:
|
||||
self.logger.error(f"Protontricks command failed (Attempt {attempt}/{max_attempts}). Return Code: {result.returncode if result else 'N/A'}")
|
||||
self.logger.error(f"Stdout: {result.stdout.strip() if result else ''}")
|
||||
self.logger.error(f"Stderr: {result.stderr.strip() if result else ''}")
|
||||
# Only show stdout/stderr in debug mode to avoid verbose output
|
||||
from ..handlers.config_handler import ConfigHandler
|
||||
config_handler = ConfigHandler()
|
||||
debug_mode = config_handler.get('debug_mode', False)
|
||||
if debug_mode:
|
||||
self.logger.error(f"Stdout: {result.stdout.strip() if result else ''}")
|
||||
self.logger.error(f"Stderr: {result.stderr.strip() if result else ''}")
|
||||
else:
|
||||
# In non-debug mode, only show stderr if it contains actual errors (not verbose winetricks output)
|
||||
if result and result.stderr:
|
||||
stderr_lower = result.stderr.lower()
|
||||
# Filter out verbose winetricks messages
|
||||
if any(keyword in stderr_lower for keyword in ['error', 'failed', 'cannot', 'warning: cannot find']):
|
||||
# Only show actual errors, not "Executing..." messages
|
||||
error_lines = [line for line in result.stderr.strip().split('\n')
|
||||
if any(keyword in line.lower() for keyword in ['error', 'failed', 'cannot', 'warning: cannot find'])
|
||||
and 'executing' not in line.lower()]
|
||||
if error_lines:
|
||||
self.logger.error(f"Stderr (errors only): {' '.join(error_lines)}")
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error during protontricks run (Attempt {attempt}/{max_attempts}): {e}", exc_info=True)
|
||||
self.logger.error(f"Failed to install Wine components after {max_attempts} attempts.")
|
||||
return False
|
||||
|
||||
def _verify_components_installed(self, appid: str, components: List[str]) -> bool:
|
||||
"""
|
||||
Verify that Wine components were actually installed by querying protontricks.
|
||||
|
||||
Args:
|
||||
appid: Steam AppID
|
||||
components: List of components that should be installed
|
||||
|
||||
Returns:
|
||||
bool: True if all critical components are verified, False otherwise
|
||||
"""
|
||||
try:
|
||||
self.logger.info("Verifying installed components...")
|
||||
|
||||
# Run protontricks list-installed to get actual installed components
|
||||
result = self.run_protontricks("--no-bwrap", appid, "list-installed", timeout=30)
|
||||
|
||||
if not result or result.returncode != 0:
|
||||
self.logger.error("Failed to query installed components")
|
||||
self.logger.debug(f"list-installed stderr: {result.stderr if result else 'N/A'}")
|
||||
return False
|
||||
|
||||
installed_output = result.stdout.lower()
|
||||
self.logger.debug(f"Installed components output: {installed_output}")
|
||||
|
||||
# Define critical components that MUST be installed
|
||||
# These are the core components that determine success
|
||||
critical_components = ["vcrun2022", "xact"]
|
||||
|
||||
# Check for critical components
|
||||
missing_critical = []
|
||||
for component in critical_components:
|
||||
if component.lower() not in installed_output:
|
||||
missing_critical.append(component)
|
||||
|
||||
if missing_critical:
|
||||
self.logger.error(f"CRITICAL: Missing essential components: {missing_critical}")
|
||||
self.logger.error("Installation reported success but components are NOT installed")
|
||||
return False
|
||||
|
||||
# Check for requested components (warn but don't fail)
|
||||
missing_requested = []
|
||||
for component in components:
|
||||
# Handle settings like fontsmooth=rgb (just check the base component name)
|
||||
base_component = component.split('=')[0].lower()
|
||||
if base_component not in installed_output and component.lower() not in installed_output:
|
||||
missing_requested.append(component)
|
||||
|
||||
if missing_requested:
|
||||
self.logger.warning(f"Some requested components may not be installed: {missing_requested}")
|
||||
self.logger.warning("This may cause issues, but critical components are present")
|
||||
|
||||
self.logger.info(f"Verification passed - critical components confirmed: {critical_components}")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error verifying components: {e}", exc_info=True)
|
||||
return False
|
||||
|
||||
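Putting the install and verification steps together, a hypothetical call looks like this (the AppID and component list are examples only):

handler = ProtontricksHandler(steamdeck=False)   # constructor args assumed
ok = handler.install_wine_components(
    appid="2147483649",                          # example shortcut AppID
    game_var="Skyrim Special Edition",
    specific_components=["vcrun2022", "xact", "fontsmooth=rgb"],
)
print("Components installed and verified" if ok else "Install or verification failed")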
def _cleanup_wine_processes(self):
|
||||
"""
|
||||
Internal method to clean up wine processes during component installation
|
||||
"""
|
||||
try:
|
||||
subprocess.run("pgrep -f 'win7|win10|ShowDotFiles|protontricks' | xargs -r kill -9",
|
||||
subprocess.run("pgrep -f 'win7|win10|ShowDotFiles|protontricks' | xargs -r kill -9",
|
||||
shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
|
||||
subprocess.run("pkill -9 winetricks",
|
||||
subprocess.run("pkill -9 winetricks",
|
||||
shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
|
||||
except Exception as e:
|
||||
logger.error(f"Error cleaning up wine processes: {e}")
|
||||
|
||||
@@ -198,12 +198,15 @@ class ShortcutHandler:
|
||||
if steam_vdf_spec is None:
|
||||
# Try to install steam-vdf using pip
|
||||
print("Installing required dependency (steam-vdf)...")
|
||||
subprocess.check_call([sys.executable, "-m", "pip", "install", "steam-vdf", "--user"])
|
||||
# CRITICAL: Use safe Python executable to prevent AppImage recursive spawning
|
||||
from jackify.backend.handlers.subprocess_utils import get_safe_python_executable
|
||||
python_exe = get_safe_python_executable()
|
||||
subprocess.check_call([python_exe, "-m", "pip", "install", "steam-vdf", "--user"])
|
||||
time.sleep(1) # Give some time for the install to complete
|
||||
|
||||
# Now import it
|
||||
import steam_vdf
|
||||
|
||||
import vdf as steam_vdf
|
||||
|
||||
with open(shortcuts_file, 'rb') as f:
|
||||
shortcuts_data = steam_vdf.load(f)
|
||||
|
||||
@@ -952,7 +955,10 @@ class ShortcutHandler:
|
||||
|
||||
def get_appid_for_shortcut(self, shortcut_name: str, exe_path: Optional[str] = None) -> Optional[str]:
|
||||
"""
|
||||
Find the current AppID for a given shortcut name and (optionally) executable path using protontricks.
|
||||
Find the current AppID for a given shortcut name and (optionally) executable path.
|
||||
|
||||
Primary method: Read directly from shortcuts.vdf (reliable, no external dependencies)
|
||||
Fallback method: Use protontricks (if available)
|
||||
|
||||
Args:
|
||||
shortcut_name (str): The name of the Steam shortcut.
|
||||
@@ -962,15 +968,22 @@ class ShortcutHandler:
|
||||
Optional[str]: The found AppID string, or None if not found or error occurs.
|
||||
"""
|
||||
self.logger.info(f"Attempting to find current AppID for shortcut: '{shortcut_name}' (exe_path: '{exe_path}')")
|
||||
|
||||
try:
|
||||
from .protontricks_handler import ProtontricksHandler # Local import
|
||||
appid = self.get_appid_from_vdf(shortcut_name, exe_path)
|
||||
if appid:
|
||||
self.logger.info(f"Successfully found AppID {appid} from shortcuts.vdf")
|
||||
return appid
|
||||
|
||||
self.logger.info("AppID not found in shortcuts.vdf, trying protontricks as fallback...")
|
||||
from .protontricks_handler import ProtontricksHandler
|
||||
pt_handler = ProtontricksHandler(self.steamdeck)
|
||||
if not pt_handler.detect_protontricks():
|
||||
self.logger.error("Protontricks not detected")
|
||||
self.logger.warning("Protontricks not detected - cannot use as fallback")
|
||||
return None
|
||||
result = pt_handler.run_protontricks("-l")
|
||||
if not result or result.returncode != 0:
|
||||
self.logger.error(f"Protontricks failed to list applications: {result.stderr if result else 'No result'}")
|
||||
self.logger.warning(f"Protontricks fallback failed: {result.stderr if result else 'No result'}")
|
||||
return None
|
||||
# Build a list of all shortcuts
|
||||
found_shortcuts = []
|
||||
@@ -1019,8 +1032,64 @@ class ShortcutHandler:
|
||||
self.logger.exception("Traceback:")
|
||||
return None
|
||||
|
||||
def get_appid_from_vdf(self, shortcut_name: str, exe_path: Optional[str] = None) -> Optional[str]:
|
||||
"""
|
||||
Get AppID directly from shortcuts.vdf by reading the file and matching shortcut name/exe.
|
||||
This is more reliable than using protontricks since it doesn't depend on external tools.
|
||||
|
||||
Args:
|
||||
shortcut_name (str): The name of the Steam shortcut.
|
||||
exe_path (Optional[str]): The path to the executable for additional validation.
|
||||
|
||||
Returns:
|
||||
Optional[str]: The AppID as a string, or None if not found.
|
||||
"""
|
||||
self.logger.info(f"Looking up AppID from shortcuts.vdf for shortcut: '{shortcut_name}' (exe: '{exe_path}')")
|
||||
|
||||
if not self.shortcuts_path or not os.path.isfile(self.shortcuts_path):
|
||||
self.logger.warning(f"Shortcuts.vdf not found at {self.shortcuts_path}")
|
||||
return None
|
||||
|
||||
try:
|
||||
shortcuts_data = VDFHandler.load(self.shortcuts_path, binary=True)
|
||||
if not shortcuts_data or 'shortcuts' not in shortcuts_data:
|
||||
self.logger.warning("No shortcuts found in shortcuts.vdf")
|
||||
return None
|
||||
|
||||
shortcut_name_clean = shortcut_name.strip().lower()
|
||||
|
||||
for idx, shortcut in shortcuts_data['shortcuts'].items():
|
||||
name = shortcut.get('AppName', shortcut.get('appname', '')).strip()
|
||||
|
||||
if name.lower() == shortcut_name_clean:
|
||||
appid = shortcut.get('appid')
|
||||
|
||||
if appid:
|
||||
if exe_path:
|
||||
vdf_exe = shortcut.get('Exe', shortcut.get('exe', '')).strip('"').strip()
|
||||
exe_path_norm = os.path.abspath(os.path.expanduser(exe_path)).lower()
|
||||
vdf_exe_norm = os.path.abspath(os.path.expanduser(vdf_exe)).lower()
|
||||
|
||||
if vdf_exe_norm == exe_path_norm:
|
||||
self.logger.info(f"Found AppID {appid} for shortcut '{name}' with matching exe '{vdf_exe}'")
|
||||
return str(int(appid) & 0xFFFFFFFF)
|
||||
else:
|
||||
self.logger.debug(f"Found shortcut '{name}' but exe doesn't match: '{vdf_exe}' vs '{exe_path}'")
|
||||
continue
|
||||
else:
|
||||
self.logger.info(f"Found AppID {appid} for shortcut '{name}' (no exe validation)")
|
||||
return str(int(appid) & 0xFFFFFFFF)
|
||||
|
||||
self.logger.warning(f"No matching shortcut found in shortcuts.vdf for '{shortcut_name}'")
|
||||
return None
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error reading shortcuts.vdf: {e}")
|
||||
self.logger.exception("Traceback:")
|
||||
return None
|
||||
|
||||
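A short note on the "& 0xFFFFFFFF" mask above: the binary shortcuts.vdf stores the appid as a 32-bit field that a VDF parser can hand back as a signed integer, so masking recovers the unsigned AppID Steam actually uses. Illustrative values:

signed_appid = -132905678                      # hypothetical value read from shortcuts.vdf
unsigned_appid = signed_appid & 0xFFFFFFFF
print(unsigned_appid)                          # 4162061618 - the string used everywhere else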
# --- Discovery Methods Moved from ModlistHandler ---
|
||||
|
||||
|
||||
def _scan_shortcuts_for_executable(self, executable_name: str) -> List[Dict[str, str]]:
|
||||
"""
|
||||
Scans the user's shortcuts.vdf file for entries pointing to a specific executable.
|
||||
@@ -1036,7 +1105,7 @@ class ShortcutHandler:
|
||||
matched_shortcuts = []
|
||||
|
||||
if not self.shortcuts_path or not os.path.isfile(self.shortcuts_path):
|
||||
self.logger.error(f"shortcuts.vdf path not found or invalid: {self.shortcuts_path}")
|
||||
self.logger.info(f"No shortcuts.vdf file found at {self.shortcuts_path} - this is normal for new Steam installations")
|
||||
return []
|
||||
|
||||
# Directly process the single shortcuts.vdf file found during init
|
||||
@@ -1159,7 +1228,7 @@ class ShortcutHandler:
|
||||
|
||||
# --- Use the single shortcuts.vdf path found during init ---
|
||||
if not self.shortcuts_path or not os.path.isfile(self.shortcuts_path):
|
||||
self.logger.error(f"shortcuts.vdf path not found or invalid: {self.shortcuts_path}")
|
||||
self.logger.info(f"No shortcuts.vdf file found at {self.shortcuts_path} - this is normal for new Steam installations")
|
||||
return []
|
||||
|
||||
vdf_path = self.shortcuts_path
|
||||
|
||||
@@ -3,17 +3,122 @@ import signal
|
||||
import subprocess
|
||||
import time
|
||||
import resource
|
||||
import sys
|
||||
import shutil
|
||||
|
||||
def get_safe_python_executable():
|
||||
"""
|
||||
Get a safe Python executable for subprocess calls.
|
||||
When running as AppImage, returns system Python instead of AppImage path
|
||||
to prevent recursive AppImage spawning.
|
||||
|
||||
Returns:
|
||||
str: Path to Python executable safe for subprocess calls
|
||||
"""
|
||||
# Check if we're running as AppImage
|
||||
is_appimage = (
|
||||
'APPIMAGE' in os.environ or
|
||||
'APPDIR' in os.environ or
|
||||
(hasattr(sys, 'frozen') and sys.frozen) or
|
||||
(sys.argv[0] and sys.argv[0].endswith('.AppImage'))
|
||||
)
|
||||
|
||||
if is_appimage:
|
||||
# Running as AppImage - use system Python to avoid recursive spawning
|
||||
# Try to find system Python (same logic as AppRun)
|
||||
for cmd in ['python3', 'python3.13', 'python3.12', 'python3.11', 'python3.10', 'python3.9', 'python3.8']:
|
||||
python_path = shutil.which(cmd)
|
||||
if python_path:
|
||||
return python_path
|
||||
# Fallback: if we can't find system Python, this is a problem
|
||||
# But we'll still return sys.executable as last resort
|
||||
return sys.executable
|
||||
else:
|
||||
# Not AppImage - sys.executable is safe
|
||||
return sys.executable
|
||||
|
||||
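Typical use is any place the code would otherwise pass sys.executable to a subprocess - for example the pip bootstrap in ShortcutHandler. A minimal sketch:

import subprocess
from jackify.backend.handlers.subprocess_utils import get_safe_python_executable

python_exe = get_safe_python_executable()      # system python3 when running inside an AppImage
subprocess.run([python_exe, "-m", "pip", "--version"], check=False)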
def get_clean_subprocess_env(extra_env=None):
|
||||
"""
|
||||
Returns a copy of os.environ with PyInstaller and other problematic variables removed.
|
||||
Returns a copy of os.environ with bundled-runtime variables and other problematic entries removed.
|
||||
Optionally merges in extra_env dict.
|
||||
Also ensures bundled tools (lz4, unzip, etc.) are in PATH when running as AppImage.
|
||||
CRITICAL: Preserves system PATH to ensure system tools (like lz4) are available.
|
||||
"""
|
||||
from pathlib import Path
|
||||
|
||||
env = os.environ.copy()
|
||||
# Remove PyInstaller-specific variables
|
||||
|
||||
# Save APPDIR before removing it (we need it to find bundled tools)
|
||||
appdir = env.get('APPDIR')
|
||||
|
||||
# Remove AppImage-specific variables that can confuse subprocess calls
|
||||
# These variables cause subprocesses to be interpreted as new AppImage launches
|
||||
for key in ['APPIMAGE', 'APPDIR', 'ARGV0', 'OWD']:
|
||||
env.pop(key, None)
|
||||
|
||||
# Remove bundle-specific variables
|
||||
for k in list(env):
|
||||
if k.startswith('_MEIPASS'):
|
||||
del env[k]
|
||||
|
||||
# Get current PATH - ensure we preserve system paths
|
||||
current_path = env.get('PATH', '')
|
||||
|
||||
# Ensure common system directories are in PATH if not already present
|
||||
# This is critical for tools like lz4 that might be in /usr/bin, /usr/local/bin, etc.
|
||||
system_paths = ['/usr/bin', '/usr/local/bin', '/bin', '/sbin', '/usr/sbin']
|
||||
path_parts = current_path.split(':') if current_path else []
|
||||
for sys_path in system_paths:
|
||||
if sys_path not in path_parts and os.path.isdir(sys_path):
|
||||
path_parts.append(sys_path)
|
||||
|
||||
# Add bundled tools directory to PATH if running as AppImage
|
||||
# This ensures lz4, unzip, xz, etc. are available to subprocesses
|
||||
# Note: appdir was saved before env cleanup above
|
||||
tools_dir = None
|
||||
|
||||
if appdir:
|
||||
# Running as AppImage - use APPDIR
|
||||
tools_dir = os.path.join(appdir, 'opt', 'jackify', 'tools')
|
||||
# Verify the tools directory exists and contains lz4
|
||||
if not os.path.isdir(tools_dir):
|
||||
tools_dir = None
|
||||
elif not os.path.exists(os.path.join(tools_dir, 'lz4')):
|
||||
# Tools dir exists but lz4 not there - might be a different layout
|
||||
tools_dir = None
|
||||
elif getattr(sys, 'frozen', False):
|
||||
# PyInstaller frozen - try to find tools relative to executable
|
||||
exe_path = Path(sys.executable)
|
||||
# In PyInstaller, sys.executable is the bundled executable
|
||||
# Tools should be in the same directory or a tools subdirectory
|
||||
possible_tools_dirs = [
|
||||
exe_path.parent / 'tools',
|
||||
exe_path.parent / 'opt' / 'jackify' / 'tools',
|
||||
]
|
||||
for possible_dir in possible_tools_dirs:
|
||||
if possible_dir.is_dir() and (possible_dir / 'lz4').exists():
|
||||
tools_dir = str(possible_dir)
|
||||
break
|
||||
|
||||
# Build final PATH: bundled tools first (if any), then original PATH with system paths
|
||||
final_path_parts = []
|
||||
if tools_dir and os.path.isdir(tools_dir):
|
||||
# Prepend tools directory so bundled tools take precedence
|
||||
# This is critical - bundled lz4 must come before system lz4
|
||||
final_path_parts.append(tools_dir)
|
||||
|
||||
# Add all other paths (preserving order, removing duplicates)
|
||||
# Note: AppRun already sets PATH with tools directory, but we ensure it's first
|
||||
seen = set()
|
||||
if tools_dir:
|
||||
seen.add(tools_dir) # Already added, don't add again
|
||||
for path_part in path_parts:
|
||||
if path_part and path_part not in seen:
|
||||
final_path_parts.append(path_part)
|
||||
seen.add(path_part)
|
||||
|
||||
env['PATH'] = ':'.join(final_path_parts)
|
||||
|
||||
# Optionally restore LD_LIBRARY_PATH to system default if needed
|
||||
# (You can add more logic here if you know your system's default)
|
||||
if extra_env:
|
||||
@@ -59,7 +164,11 @@ class ProcessManager:
|
||||
"""
|
||||
def __init__(self, cmd, env=None, cwd=None, text=False, bufsize=0):
|
||||
self.cmd = cmd
|
||||
self.env = env
|
||||
# Default to cleaned environment if None to prevent AppImage variable inheritance
|
||||
if env is None:
|
||||
self.env = get_clean_subprocess_env()
|
||||
else:
|
||||
self.env = env
|
||||
self.cwd = cwd
|
||||
self.text = text
|
||||
self.bufsize = bufsize
|
||||
|
||||
743	jackify/backend/handlers/ttw_installer_handler.py	Normal file
@@ -0,0 +1,743 @@
|
||||
"""
|
||||
TTW_Linux_Installer Handler
|
||||
|
||||
Handles downloading, installation, and execution of TTW_Linux_Installer for TTW installations.
|
||||
Replaces hoolamike for TTW-specific functionality.
|
||||
"""
|
||||
|
||||
import logging
|
||||
import os
|
||||
import subprocess
|
||||
import tarfile
|
||||
import zipfile
|
||||
from pathlib import Path
|
||||
from typing import Optional, Tuple
|
||||
import requests
|
||||
|
||||
from .path_handler import PathHandler
|
||||
from .filesystem_handler import FileSystemHandler
|
||||
from .config_handler import ConfigHandler
|
||||
from .logging_handler import LoggingHandler
|
||||
from .subprocess_utils import get_clean_subprocess_env
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
# Define default TTW_Linux_Installer paths
|
||||
from jackify.shared.paths import get_jackify_data_dir
|
||||
JACKIFY_BASE_DIR = get_jackify_data_dir()
|
||||
DEFAULT_TTW_INSTALLER_DIR = JACKIFY_BASE_DIR / "TTW_Linux_Installer"
|
||||
TTW_INSTALLER_EXECUTABLE_NAME = "ttw_linux_gui" # Same executable, runs in CLI mode with args
|
||||
|
||||
# GitHub release info
|
||||
TTW_INSTALLER_REPO = "SulfurNitride/TTW_Linux_Installer"
|
||||
TTW_INSTALLER_RELEASE_URL = f"https://api.github.com/repos/{TTW_INSTALLER_REPO}/releases/latest"
|
||||
|
||||
|
||||
class TTWInstallerHandler:
|
||||
"""Handles TTW installation using TTW_Linux_Installer (replaces hoolamike for TTW)."""
|
||||
|
||||
def __init__(self, steamdeck: bool, verbose: bool, filesystem_handler: FileSystemHandler,
|
||||
config_handler: ConfigHandler, menu_handler=None):
|
||||
"""Initialize the handler."""
|
||||
self.steamdeck = steamdeck
|
||||
self.verbose = verbose
|
||||
self.path_handler = PathHandler()
|
||||
self.filesystem_handler = filesystem_handler
|
||||
self.config_handler = config_handler
|
||||
self.menu_handler = menu_handler
|
||||
|
||||
# Set up logging
|
||||
logging_handler = LoggingHandler()
|
||||
logging_handler.rotate_log_for_logger('ttw-install', 'TTW_Install_workflow.log')
|
||||
self.logger = logging_handler.setup_logger('ttw-install', 'TTW_Install_workflow.log')
|
||||
|
||||
# Installation paths
|
||||
self.ttw_installer_dir: Path = DEFAULT_TTW_INSTALLER_DIR
|
||||
self.ttw_installer_executable_path: Optional[Path] = None
|
||||
self.ttw_installer_installed: bool = False
|
||||
|
||||
# Load saved install path from config
|
||||
saved_path_str = self.config_handler.get('ttw_installer_install_path')
|
||||
if saved_path_str and Path(saved_path_str).is_dir():
|
||||
self.ttw_installer_dir = Path(saved_path_str)
|
||||
self.logger.info(f"Loaded TTW_Linux_Installer path from config: {self.ttw_installer_dir}")
|
||||
|
||||
# Check if already installed
|
||||
self._check_installation()
|
||||
|
||||
def _ensure_dirs_exist(self):
|
||||
"""Ensure base directories exist."""
|
||||
self.ttw_installer_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
def _check_installation(self):
|
||||
"""Check if TTW_Linux_Installer is installed at expected location."""
|
||||
self._ensure_dirs_exist()
|
||||
|
||||
potential_exe_path = self.ttw_installer_dir / TTW_INSTALLER_EXECUTABLE_NAME
|
||||
if potential_exe_path.is_file() and os.access(potential_exe_path, os.X_OK):
|
||||
self.ttw_installer_executable_path = potential_exe_path
|
||||
self.ttw_installer_installed = True
|
||||
self.logger.info(f"Found TTW_Linux_Installer at: {self.ttw_installer_executable_path}")
|
||||
else:
|
||||
self.ttw_installer_installed = False
|
||||
self.ttw_installer_executable_path = None
|
||||
self.logger.info(f"TTW_Linux_Installer not found at {potential_exe_path}")
|
||||
|
||||
def install_ttw_installer(self, install_dir: Optional[Path] = None) -> Tuple[bool, str]:
|
||||
"""Download and install TTW_Linux_Installer from GitHub releases.
|
||||
|
||||
Args:
|
||||
install_dir: Optional directory to install to (defaults to ~/Jackify/TTW_Linux_Installer)
|
||||
|
||||
Returns:
|
||||
(success: bool, message: str)
|
||||
"""
|
||||
try:
|
||||
self._ensure_dirs_exist()
|
||||
target_dir = Path(install_dir) if install_dir else self.ttw_installer_dir
|
||||
target_dir.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Fetch latest release info
|
||||
self.logger.info(f"Fetching latest TTW_Linux_Installer release from {TTW_INSTALLER_RELEASE_URL}")
|
||||
resp = requests.get(TTW_INSTALLER_RELEASE_URL, timeout=15, verify=True)
|
||||
resp.raise_for_status()
|
||||
data = resp.json()
|
||||
release_tag = data.get("tag_name") or data.get("name")
|
||||
|
||||
# Find Linux asset - universal-mpi-installer pattern (can be .zip or .tar.gz)
|
||||
linux_asset = None
|
||||
asset_names = [asset.get("name", "") for asset in data.get("assets", [])]
|
||||
self.logger.info(f"Available release assets: {asset_names}")
|
||||
|
||||
for asset in data.get("assets", []):
|
||||
name = asset.get("name", "").lower()
|
||||
# Look for universal-mpi-installer pattern
|
||||
if "universal-mpi-installer" in name and name.endswith((".zip", ".tar.gz")):
|
||||
linux_asset = asset
|
||||
self.logger.info(f"Found Linux asset: {asset.get('name')}")
|
||||
break
|
||||
|
||||
if not linux_asset:
|
||||
# Log all available assets for debugging
|
||||
all_assets = [asset.get("name", "") for asset in data.get("assets", [])]
|
||||
self.logger.error(f"No suitable Linux asset found. Available assets: {all_assets}")
|
||||
return False, f"No suitable Linux TTW_Linux_Installer asset found in latest release. Available assets: {', '.join(all_assets)}"
|
||||
|
||||
download_url = linux_asset.get("browser_download_url")
|
||||
asset_name = linux_asset.get("name")
|
||||
if not download_url or not asset_name:
|
||||
return False, "Latest release is missing required asset metadata"
|
||||
|
||||
# Download to target directory
|
||||
temp_path = target_dir / asset_name
|
||||
self.logger.info(f"Downloading {asset_name} from {download_url}")
|
||||
if not self.filesystem_handler.download_file(download_url, temp_path, overwrite=True, quiet=True):
|
||||
return False, "Failed to download TTW_Linux_Installer asset"
|
||||
|
||||
# Extract archive (zip or tar.gz)
|
||||
try:
|
||||
self.logger.info(f"Extracting {asset_name} to {target_dir}")
|
||||
if asset_name.lower().endswith('.tar.gz'):
|
||||
with tarfile.open(temp_path, "r:gz") as tf:
|
||||
tf.extractall(path=target_dir)
|
||||
elif asset_name.lower().endswith('.zip'):
|
||||
with zipfile.ZipFile(temp_path, "r") as zf:
|
||||
zf.extractall(path=target_dir)
|
||||
else:
|
||||
return False, f"Unsupported archive format: {asset_name}"
|
||||
finally:
|
||||
try:
|
||||
temp_path.unlink(missing_ok=True) # cleanup
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Find executable (may be in subdirectory or root)
|
||||
exe_path = target_dir / TTW_INSTALLER_EXECUTABLE_NAME
|
||||
if not exe_path.is_file():
|
||||
# Search for it
|
||||
for p in target_dir.rglob(TTW_INSTALLER_EXECUTABLE_NAME):
|
||||
if p.is_file():
|
||||
exe_path = p
|
||||
break
|
||||
|
||||
if not exe_path.is_file():
|
||||
return False, "TTW_Linux_Installer executable not found after extraction"
|
||||
|
||||
# Set executable permissions
|
||||
try:
|
||||
os.chmod(exe_path, 0o755)
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Failed to chmod +x on {exe_path}: {e}")
|
||||
|
||||
# Update state
|
||||
self.ttw_installer_dir = target_dir
|
||||
self.ttw_installer_executable_path = exe_path
|
||||
self.ttw_installer_installed = True
|
||||
self.config_handler.set('ttw_installer_install_path', str(target_dir))
|
||||
if release_tag:
|
||||
self.config_handler.set('ttw_installer_version', release_tag)
|
||||
|
||||
self.logger.info(f"TTW_Linux_Installer installed successfully at {exe_path}")
|
||||
return True, f"TTW_Linux_Installer installed at {target_dir}"
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error installing TTW_Linux_Installer: {e}", exc_info=True)
|
||||
return False, f"Error installing TTW_Linux_Installer: {e}"
|
||||
|
||||
def get_installed_ttw_installer_version(self) -> Optional[str]:
|
||||
"""Return the installed TTW_Linux_Installer version stored in Jackify config, if any."""
|
||||
try:
|
||||
v = self.config_handler.get('ttw_installer_version')
|
||||
return str(v) if v else None
|
||||
except Exception:
|
||||
return None
|
||||
|
||||
def is_ttw_installer_update_available(self) -> Tuple[bool, Optional[str], Optional[str]]:
|
||||
"""
|
||||
Check GitHub for the latest TTW_Linux_Installer release and compare with installed version.
|
||||
Returns (update_available, installed_version, latest_version).
|
||||
"""
|
||||
installed = self.get_installed_ttw_installer_version()
|
||||
|
||||
# If executable exists but no version is recorded, don't show as "out of date"
|
||||
# This can happen if the executable was installed before version tracking was added
|
||||
if not installed and self.ttw_installer_installed:
|
||||
self.logger.info("TTW_Linux_Installer executable found but no version recorded in config")
|
||||
# Don't treat as update available - just show as "Ready" (unknown version)
|
||||
return (False, None, None)
|
||||
|
||||
try:
|
||||
resp = requests.get(TTW_INSTALLER_RELEASE_URL, timeout=10, verify=True)
|
||||
resp.raise_for_status()
|
||||
latest = resp.json().get('tag_name') or resp.json().get('name')
|
||||
if not latest:
|
||||
return (False, installed, None)
|
||||
if not installed:
|
||||
# No version recorded and executable doesn't exist; treat as not installed
|
||||
return (False, None, str(latest))
|
||||
return (installed != str(latest), installed, str(latest))
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Error checking for TTW_Linux_Installer updates: {e}")
|
||||
return (False, installed, None)
|
||||
|
||||
def install_ttw_backend(self, ttw_mpi_path: Path, ttw_output_path: Path) -> Tuple[bool, str]:
|
||||
"""Install TTW using TTW_Linux_Installer.
|
||||
|
||||
Args:
|
||||
ttw_mpi_path: Path to TTW .mpi file
|
||||
ttw_output_path: Target installation directory
|
||||
|
||||
Returns:
|
||||
(success: bool, message: str)
|
||||
"""
|
||||
self.logger.info("Starting Tale of Two Wastelands installation via TTW_Linux_Installer")
|
||||
|
||||
# Validate parameters
|
||||
if not ttw_mpi_path or not ttw_output_path:
|
||||
return False, "Missing required parameters: ttw_mpi_path and ttw_output_path are required"
|
||||
|
||||
ttw_mpi_path = Path(ttw_mpi_path)
|
||||
ttw_output_path = Path(ttw_output_path)
|
||||
|
||||
# Validate paths
|
||||
if not ttw_mpi_path.exists():
|
||||
return False, f"TTW .mpi file not found: {ttw_mpi_path}"
|
||||
|
||||
if not ttw_mpi_path.is_file():
|
||||
return False, f"TTW .mpi path is not a file: {ttw_mpi_path}"
|
||||
|
||||
if ttw_mpi_path.suffix.lower() != '.mpi':
|
||||
return False, f"TTW path does not have .mpi extension: {ttw_mpi_path}"
|
||||
|
||||
if not ttw_output_path.exists():
|
||||
try:
|
||||
ttw_output_path.mkdir(parents=True, exist_ok=True)
|
||||
except Exception as e:
|
||||
return False, f"Failed to create output directory: {e}"
|
||||
|
||||
# Check installation
|
||||
if not self.ttw_installer_installed:
|
||||
# Try to install automatically
|
||||
self.logger.info("TTW_Linux_Installer not found, attempting to install...")
|
||||
success, message = self.install_ttw_installer()
|
||||
if not success:
|
||||
return False, f"TTW_Linux_Installer not installed and auto-install failed: {message}"
|
||||
|
||||
if not self.ttw_installer_executable_path or not self.ttw_installer_executable_path.is_file():
|
||||
return False, "TTW_Linux_Installer executable not found"
|
||||
|
||||
# Detect game paths
|
||||
required_games = ['Fallout 3', 'Fallout New Vegas']
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
missing_games = [game for game in required_games if game not in detected_games]
|
||||
if missing_games:
|
||||
return False, f"Missing required games: {', '.join(missing_games)}. TTW requires both Fallout 3 and Fallout New Vegas."
|
||||
|
||||
fallout3_path = detected_games.get('Fallout 3')
|
||||
falloutnv_path = detected_games.get('Fallout New Vegas')
|
||||
|
||||
if not fallout3_path or not falloutnv_path:
|
||||
return False, "Could not detect Fallout 3 or Fallout New Vegas installation paths"
|
||||
|
||||
# Construct command - run in CLI mode with arguments
|
||||
cmd = [
|
||||
str(self.ttw_installer_executable_path),
|
||||
"--fo3", str(fallout3_path),
|
||||
"--fnv", str(falloutnv_path),
|
||||
"--mpi", str(ttw_mpi_path),
|
||||
"--output", str(ttw_output_path),
|
||||
"--start"
|
||||
]
|
||||
|
||||
self.logger.info(f"Executing TTW_Linux_Installer: {' '.join(cmd)}")
|
||||
|
||||
try:
|
||||
env = get_clean_subprocess_env()
|
||||
process = subprocess.Popen(
|
||||
cmd,
|
||||
cwd=str(self.ttw_installer_dir),
|
||||
env=env,
|
||||
stdout=subprocess.PIPE,
|
||||
stderr=subprocess.STDOUT,
|
||||
text=True,
|
||||
bufsize=1,
|
||||
universal_newlines=True
|
||||
)
|
||||
|
||||
# Stream output to logger
|
||||
if process.stdout:
|
||||
for line in process.stdout:
|
||||
line = line.rstrip()
|
||||
if line:
|
||||
self.logger.info(f"TTW_Linux_Installer: {line}")
|
||||
|
||||
process.wait()
|
||||
ret = process.returncode
|
||||
|
||||
if ret == 0:
|
||||
self.logger.info("TTW installation completed successfully.")
|
||||
return True, "TTW installation completed successfully!"
|
||||
else:
|
||||
self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
|
||||
return False, f"TTW installation failed with exit code {ret}"
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error executing TTW_Linux_Installer: {e}", exc_info=True)
|
||||
return False, f"Error executing TTW_Linux_Installer: {e}"
|
||||
|
||||
def start_ttw_installation(self, ttw_mpi_path: Path, ttw_output_path: Path, output_file: Path):
|
||||
"""Start TTW installation process (non-blocking).
|
||||
|
||||
Starts the TTW_Linux_Installer subprocess with output redirected to a file.
|
||||
Returns immediately with process handle. Caller should poll process and read output file.
|
||||
|
||||
Args:
|
||||
ttw_mpi_path: Path to TTW .mpi file
|
||||
ttw_output_path: Target installation directory
|
||||
output_file: Path to file where stdout/stderr will be written
|
||||
|
||||
Returns:
|
||||
(process: subprocess.Popen, error_message: str) - process is None if failed
|
||||
"""
|
||||
self.logger.info("Starting TTW installation (non-blocking mode)")
|
||||
|
||||
# Validate parameters
|
||||
if not ttw_mpi_path or not ttw_output_path:
|
||||
return None, "Missing required parameters: ttw_mpi_path and ttw_output_path are required"
|
||||
|
||||
ttw_mpi_path = Path(ttw_mpi_path)
|
||||
ttw_output_path = Path(ttw_output_path)
|
||||
|
||||
# Validate paths
|
||||
if not ttw_mpi_path.exists():
|
||||
return None, f"TTW .mpi file not found: {ttw_mpi_path}"
|
||||
|
||||
if not ttw_mpi_path.is_file():
|
||||
return None, f"TTW .mpi path is not a file: {ttw_mpi_path}"
|
||||
|
||||
if ttw_mpi_path.suffix.lower() != '.mpi':
|
||||
return None, f"TTW path does not have .mpi extension: {ttw_mpi_path}"
|
||||
|
||||
if not ttw_output_path.exists():
|
||||
try:
|
||||
ttw_output_path.mkdir(parents=True, exist_ok=True)
|
||||
except Exception as e:
|
||||
return None, f"Failed to create output directory: {e}"
|
||||
|
||||
# Check installation
|
||||
if not self.ttw_installer_installed:
|
||||
self.logger.info("TTW_Linux_Installer not found, attempting to install...")
|
||||
success, message = self.install_ttw_installer()
|
||||
if not success:
|
||||
return None, f"TTW_Linux_Installer not installed and auto-install failed: {message}"
|
||||
|
||||
if not self.ttw_installer_executable_path or not self.ttw_installer_executable_path.is_file():
|
||||
return None, "TTW_Linux_Installer executable not found"
|
||||
|
||||
# Detect game paths
|
||||
required_games = ['Fallout 3', 'Fallout New Vegas']
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
missing_games = [game for game in required_games if game not in detected_games]
|
||||
if missing_games:
|
||||
return None, f"Missing required games: {', '.join(missing_games)}. TTW requires both Fallout 3 and Fallout New Vegas."
|
||||
|
||||
fallout3_path = detected_games.get('Fallout 3')
|
||||
falloutnv_path = detected_games.get('Fallout New Vegas')
|
||||
|
||||
if not fallout3_path or not falloutnv_path:
|
||||
return None, "Could not detect Fallout 3 or Fallout New Vegas installation paths"
|
||||
|
||||
# Construct command
|
||||
cmd = [
|
||||
str(self.ttw_installer_executable_path),
|
||||
"--fo3", str(fallout3_path),
|
||||
"--fnv", str(falloutnv_path),
|
||||
"--mpi", str(ttw_mpi_path),
|
||||
"--output", str(ttw_output_path),
|
||||
"--start"
|
||||
]
|
||||
|
||||
self.logger.info(f"Executing TTW_Linux_Installer: {' '.join(cmd)}")
|
||||
|
||||
try:
|
||||
env = get_clean_subprocess_env()
|
||||
|
||||
# Ensure lz4 is in PATH (critical for TTW_Linux_Installer)
|
||||
import shutil
|
||||
appdir = env.get('APPDIR')
|
||||
if appdir:
|
||||
tools_dir = os.path.join(appdir, 'opt', 'jackify', 'tools')
|
||||
bundled_lz4 = os.path.join(tools_dir, 'lz4')
|
||||
if os.path.exists(bundled_lz4) and os.access(bundled_lz4, os.X_OK):
|
||||
current_path = env.get('PATH', '')
|
||||
path_parts = [p for p in current_path.split(':') if p and p != tools_dir]
|
||||
env['PATH'] = f"{tools_dir}:{':'.join(path_parts)}"
|
||||
self.logger.info(f"Added bundled lz4 to PATH: {tools_dir}")
|
||||
|
||||
# Verify lz4 is available
|
||||
lz4_path = shutil.which('lz4', path=env.get('PATH', ''))
|
||||
if not lz4_path:
|
||||
system_lz4 = shutil.which('lz4')
|
||||
if system_lz4:
|
||||
lz4_dir = os.path.dirname(system_lz4)
|
||||
env['PATH'] = f"{lz4_dir}:{env.get('PATH', '')}"
|
||||
self.logger.info(f"Added system lz4 to PATH: {lz4_dir}")
|
||||
else:
|
||||
return None, "lz4 is required but not found in PATH"
|
||||
|
||||
# Open output file for writing
|
||||
output_fh = open(output_file, 'w', encoding='utf-8', buffering=1)
|
||||
|
||||
# Start process with output redirected to file
|
||||
process = subprocess.Popen(
|
||||
cmd,
|
||||
cwd=str(self.ttw_installer_dir),
|
||||
env=env,
|
||||
stdout=output_fh,
|
||||
stderr=subprocess.STDOUT,
|
||||
bufsize=1
|
||||
)
|
||||
|
||||
self.logger.info(f"TTW_Linux_Installer process started (PID: {process.pid}), output to {output_file}")
|
||||
|
||||
# Store file handle so it can be closed later
|
||||
process._output_fh = output_fh
|
||||
|
||||
return process, None
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error starting TTW_Linux_Installer: {e}", exc_info=True)
|
||||
return None, f"Error starting TTW_Linux_Installer: {e}"
|
||||
|
||||
@staticmethod
|
||||
def cleanup_ttw_process(process):
|
||||
"""Clean up after TTW installation process.
|
||||
|
||||
Closes file handles and ensures process is terminated properly.
|
||||
|
||||
Args:
|
||||
process: subprocess.Popen object from start_ttw_installation()
|
||||
"""
|
||||
if process:
|
||||
# Close output file handle if attached
|
||||
if hasattr(process, '_output_fh'):
|
||||
try:
|
||||
process._output_fh.close()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Terminate if still running
|
||||
if process.poll() is None:
|
||||
try:
|
||||
process.terminate()
|
||||
process.wait(timeout=5)
|
||||
except Exception:
|
||||
try:
|
||||
process.kill()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def install_ttw_backend_with_output_stream(self, ttw_mpi_path: Path, ttw_output_path: Path, output_callback=None):
|
||||
"""Install TTW with streaming output for GUI (DEPRECATED - use start_ttw_installation instead).
|
||||
|
||||
Args:
|
||||
ttw_mpi_path: Path to TTW .mpi file
|
||||
ttw_output_path: Target installation directory
|
||||
output_callback: Optional callback function(line: str) for real-time output
|
||||
|
||||
Returns:
|
||||
(success: bool, message: str)
|
||||
"""
|
||||
self.logger.info("Starting Tale of Two Wastelands installation via TTW_Linux_Installer (with output stream)")
|
||||
|
||||
# Validate parameters (same as install_ttw_backend)
|
||||
if not ttw_mpi_path or not ttw_output_path:
|
||||
return False, "Missing required parameters: ttw_mpi_path and ttw_output_path are required"
|
||||
|
||||
ttw_mpi_path = Path(ttw_mpi_path)
|
||||
ttw_output_path = Path(ttw_output_path)
|
||||
|
||||
# Validate paths
|
||||
if not ttw_mpi_path.exists():
|
||||
return False, f"TTW .mpi file not found: {ttw_mpi_path}"
|
||||
|
||||
if not ttw_mpi_path.is_file():
|
||||
return False, f"TTW .mpi path is not a file: {ttw_mpi_path}"
|
||||
|
||||
if ttw_mpi_path.suffix.lower() != '.mpi':
|
||||
return False, f"TTW path does not have .mpi extension: {ttw_mpi_path}"
|
||||
|
||||
if not ttw_output_path.exists():
|
||||
try:
|
||||
ttw_output_path.mkdir(parents=True, exist_ok=True)
|
||||
except Exception as e:
|
||||
return False, f"Failed to create output directory: {e}"
|
||||
|
||||
# Check installation
|
||||
if not self.ttw_installer_installed:
|
||||
if output_callback:
|
||||
output_callback("TTW_Linux_Installer not found, installing...")
|
||||
self.logger.info("TTW_Linux_Installer not found, attempting to install...")
|
||||
success, message = self.install_ttw_installer()
|
||||
if not success:
|
||||
return False, f"TTW_Linux_Installer not installed and auto-install failed: {message}"
|
||||
|
||||
if not self.ttw_installer_executable_path or not self.ttw_installer_executable_path.is_file():
|
||||
return False, "TTW_Linux_Installer executable not found"
|
||||
|
||||
# Detect game paths
|
||||
required_games = ['Fallout 3', 'Fallout New Vegas']
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
missing_games = [game for game in required_games if game not in detected_games]
|
||||
if missing_games:
|
||||
return False, f"Missing required games: {', '.join(missing_games)}. TTW requires both Fallout 3 and Fallout New Vegas."
|
||||
|
||||
fallout3_path = detected_games.get('Fallout 3')
|
||||
falloutnv_path = detected_games.get('Fallout New Vegas')
|
||||
|
||||
if not fallout3_path or not falloutnv_path:
|
||||
return False, "Could not detect Fallout 3 or Fallout New Vegas installation paths"
|
||||
|
||||
# Construct command
|
||||
cmd = [
|
||||
str(self.ttw_installer_executable_path),
|
||||
"--fo3", str(fallout3_path),
|
||||
"--fnv", str(falloutnv_path),
|
||||
"--mpi", str(ttw_mpi_path),
|
||||
"--output", str(ttw_output_path),
|
||||
"--start"
|
||||
]
|
||||
|
||||
self.logger.info(f"Executing TTW_Linux_Installer: {' '.join(cmd)}")
|
||||
|
||||
try:
|
||||
env = get_clean_subprocess_env()
|
||||
process = subprocess.Popen(
|
||||
cmd,
|
||||
cwd=str(self.ttw_installer_dir),
|
||||
env=env,
|
||||
stdout=subprocess.PIPE,
|
||||
stderr=subprocess.STDOUT,
|
||||
text=True,
|
||||
bufsize=1,
|
||||
universal_newlines=True
|
||||
)
|
||||
|
||||
# Stream output to both logger and callback
|
||||
if process.stdout:
|
||||
for line in process.stdout:
|
||||
line = line.rstrip()
|
||||
if line:
|
||||
self.logger.info(f"TTW_Linux_Installer: {line}")
|
||||
if output_callback:
|
||||
output_callback(line)
|
||||
|
||||
process.wait()
|
||||
ret = process.returncode
|
||||
|
||||
if ret == 0:
|
||||
self.logger.info("TTW installation completed successfully.")
|
||||
return True, "TTW installation completed successfully!"
|
||||
else:
|
||||
self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
|
||||
return False, f"TTW installation failed with exit code {ret}"
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error executing TTW_Linux_Installer: {e}", exc_info=True)
|
||||
return False, f"Error executing TTW_Linux_Installer: {e}"
|
||||
|
||||
@staticmethod
|
||||
def integrate_ttw_into_modlist(ttw_output_path: Path, modlist_install_dir: Path, ttw_version: str) -> bool:
|
||||
"""Integrate TTW output into a modlist's MO2 structure
|
||||
|
||||
This method:
|
||||
1. Copies TTW output to the modlist's mods folder
|
||||
2. Updates modlist.txt for all profiles
|
||||
3. Updates plugins.txt with TTW ESMs in correct order
|
||||
|
||||
Args:
|
||||
ttw_output_path: Path to TTW output directory
|
||||
modlist_install_dir: Path to modlist installation directory
|
||||
ttw_version: TTW version string (e.g., "3.4")
|
||||
|
||||
Returns:
|
||||
bool: True if integration successful, False otherwise
|
||||
"""
|
||||
logging_handler = LoggingHandler()
|
||||
logging_handler.rotate_log_for_logger('ttw-install', 'TTW_Install_workflow.log')
|
||||
logger = logging_handler.setup_logger('ttw-install', 'TTW_Install_workflow.log')
|
||||
|
||||
try:
|
||||
import shutil
|
||||
|
||||
# Validate paths
|
||||
if not ttw_output_path.exists():
|
||||
logger.error(f"TTW output path does not exist: {ttw_output_path}")
|
||||
return False
|
||||
|
||||
mods_dir = modlist_install_dir / "mods"
|
||||
profiles_dir = modlist_install_dir / "profiles"
|
||||
|
||||
if not mods_dir.exists() or not profiles_dir.exists():
|
||||
logger.error(f"Invalid modlist directory structure: {modlist_install_dir}")
|
||||
return False
|
||||
|
||||
# Create mod folder name with version
|
||||
mod_folder_name = f"[NoDelete] Tale of Two Wastelands {ttw_version}" if ttw_version else "[NoDelete] Tale of Two Wastelands"
|
||||
target_mod_dir = mods_dir / mod_folder_name
|
||||
|
||||
# Copy TTW output to mods directory
|
||||
logger.info(f"Copying TTW output to {target_mod_dir}")
|
||||
if target_mod_dir.exists():
|
||||
logger.info(f"Removing existing TTW mod at {target_mod_dir}")
|
||||
shutil.rmtree(target_mod_dir)
|
||||
|
||||
shutil.copytree(ttw_output_path, target_mod_dir)
|
||||
logger.info("TTW output copied successfully")
|
||||
|
||||
# TTW ESMs in correct load order
|
||||
ttw_esms = [
|
||||
"Fallout3.esm",
|
||||
"Anchorage.esm",
|
||||
"ThePitt.esm",
|
||||
"BrokenSteel.esm",
|
||||
"PointLookout.esm",
|
||||
"Zeta.esm",
|
||||
"TaleOfTwoWastelands.esm",
|
||||
"YUPTTW.esm"
|
||||
]
|
||||
|
||||
# Process each profile
|
||||
for profile_dir in profiles_dir.iterdir():
|
||||
if not profile_dir.is_dir():
|
||||
continue
|
||||
|
||||
profile_name = profile_dir.name
|
||||
logger.info(f"Processing profile: {profile_name}")
|
||||
|
||||
# Update modlist.txt
|
||||
modlist_file = profile_dir / "modlist.txt"
|
||||
if modlist_file.exists():
|
||||
# Read existing modlist
|
||||
with open(modlist_file, 'r', encoding='utf-8') as f:
|
||||
lines = f.readlines()
|
||||
|
||||
# Find the TTW placeholder separator and insert BEFORE it
|
||||
separator_found = False
|
||||
ttw_mod_line = f"+{mod_folder_name}\n"
|
||||
new_lines = []
|
||||
|
||||
for line in lines:
|
||||
# Skip existing TTW mod entries (but keep separators and other TTW-related mods)
|
||||
# Match patterns: "+[NoDelete] Tale of Two Wastelands", "+[NoDelete] TTW", etc.
|
||||
stripped = line.strip()
|
||||
if stripped.startswith('+') and '[nodelete]' in stripped.lower():
|
||||
# Check if it's the main TTW mod (not other TTW-related mods like "TTW Quick Start")
|
||||
if ('tale of two wastelands' in stripped.lower() and 'quick start' not in stripped.lower() and
|
||||
'loading wheel' not in stripped.lower()) or stripped.lower().startswith('+[nodelete] ttw '):
|
||||
logger.info(f"Removing existing TTW mod entry: {stripped}")
|
||||
continue
|
||||
|
||||
# Insert TTW mod BEFORE the placeholder separator (MO2 order is bottom-up)
|
||||
# Check BEFORE appending so TTW mod appears before separator in file
|
||||
if "put tale of two wastelands mod here" in line.lower() and "_separator" in line.lower():
|
||||
new_lines.append(ttw_mod_line)
|
||||
separator_found = True
|
||||
logger.info(f"Inserted TTW mod before separator: {line.strip()}")
|
||||
|
||||
new_lines.append(line)
|
||||
|
||||
# If no separator found, append at the end
|
||||
if not separator_found:
|
||||
new_lines.append(ttw_mod_line)
|
||||
logger.warning(f"No TTW separator found in {profile_name}, appended to end")
|
||||
|
||||
# Write back
|
||||
with open(modlist_file, 'w', encoding='utf-8') as f:
|
||||
f.writelines(new_lines)
|
||||
|
||||
logger.info(f"Updated modlist.txt for {profile_name}")
|
||||
else:
|
||||
logger.warning(f"modlist.txt not found for profile {profile_name}")
|
||||
|
||||
# Update plugins.txt
|
||||
plugins_file = profile_dir / "plugins.txt"
|
||||
if plugins_file.exists():
|
||||
# Read existing plugins
|
||||
with open(plugins_file, 'r', encoding='utf-8') as f:
|
||||
lines = f.readlines()
|
||||
|
||||
# Remove any existing TTW ESMs
|
||||
ttw_esm_set = set(esm.lower() for esm in ttw_esms)
|
||||
lines = [line for line in lines if line.strip().lower() not in ttw_esm_set]
|
||||
|
||||
# Find CaravanPack.esm and insert TTW ESMs after it
|
||||
insert_index = None
|
||||
for i, line in enumerate(lines):
|
||||
if line.strip().lower() == "caravanpack.esm":
|
||||
insert_index = i + 1
|
||||
break
|
||||
|
||||
if insert_index is not None:
|
||||
# Insert TTW ESMs in correct order
|
||||
for esm in reversed(ttw_esms):
|
||||
lines.insert(insert_index, f"{esm}\n")
|
||||
else:
|
||||
logger.warning(f"CaravanPack.esm not found in {profile_name}, appending TTW ESMs to end")
|
||||
for esm in ttw_esms:
|
||||
lines.append(f"{esm}\n")
|
||||
|
||||
# Write back
|
||||
with open(plugins_file, 'w', encoding='utf-8') as f:
|
||||
f.writelines(lines)
|
||||
|
||||
logger.info(f"Updated plugins.txt for {profile_name}")
|
||||
else:
|
||||
logger.warning(f"plugins.txt not found for profile {profile_name}")
|
||||
|
||||
logger.info("TTW integration completed successfully")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error integrating TTW into modlist: {e}", exc_info=True)
|
||||
return False
|
||||
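Sketch of the non-blocking usage pattern the docstrings above describe (start, poll, clean up). The paths are made up, and the no-argument `FileSystemHandler()` construction is an assumption rather than something shown in this diff.

```python
import time
from pathlib import Path

from jackify.backend.handlers.config_handler import ConfigHandler
from jackify.backend.handlers.filesystem_handler import FileSystemHandler
from jackify.backend.handlers.ttw_installer_handler import TTWInstallerHandler

# Hypothetical paths - substitute real ones.
mpi = Path.home() / "Downloads" / "TTW_3.4.mpi"
output = Path.home() / "Jackify" / "TTW_Output"
log_file = Path.home() / "Jackify" / "logs" / "ttw_run.log"

handler = TTWInstallerHandler(
    steamdeck=False,
    verbose=True,
    filesystem_handler=FileSystemHandler(),  # constructor args assumed
    config_handler=ConfigHandler(),
)

process, error = handler.start_ttw_installation(mpi, output, log_file)
if process is None:
    raise RuntimeError(error)

try:
    # Poll the process (and tail log_file) as the docstring suggests.
    while process.poll() is None:
        time.sleep(5)
finally:
    TTWInstallerHandler.cleanup_ttw_process(process)

print("TTW exit code:", process.returncode)
```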
@@ -300,7 +300,7 @@ class ValidationHandler:
     def looks_like_modlist_dir(self, path: Path) -> bool:
         """Return True if the directory contains files/folders typical of a modlist install."""
         expected = [
-            'ModOrganizer.exe', 'profiles', 'mods', 'downloads', '.wabbajack', '.jackify_modlist_marker', 'ModOrganizer.ini'
+            'ModOrganizer.exe', 'profiles', 'mods', '.wabbajack', '.jackify_modlist_marker', 'ModOrganizer.ini'
         ]
         for item in expected:
             if (path / item).exists():
|
||||
|
||||
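Rough standalone sketch of the check above; the hunk is truncated before the return statement, so treating a single matching entry as sufficient is an assumption.

```python
from pathlib import Path

EXPECTED = ['ModOrganizer.exe', 'profiles', 'mods', '.wabbajack',
            '.jackify_modlist_marker', 'ModOrganizer.ini']

def looks_like_modlist_dir(path: Path) -> bool:
    # Mirrors the hunk above: any one expected entry marks the directory
    # as an existing modlist install (assumed early-return behaviour).
    return any((path / item).exists() for item in EXPECTED)

print(looks_like_modlist_dir(Path.home() / "Games" / "MyModlist"))
```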
@@ -1196,7 +1196,8 @@ class InstallWabbajackHandler:
         """Displays the final success message and next steps."""
         # Basic log file path (assuming standard location)
         # TODO: Get log file path more reliably if needed
-        log_path = Path.home() / "Jackify" / "logs" / "jackify-cli.log"
+        from jackify.shared.paths import get_jackify_logs_dir
+        log_path = get_jackify_logs_dir() / "jackify-cli.log"
 
         print("\n───────────────────────────────────────────────────────────────────")
         print(f"{COLOR_INFO}Wabbajack Installation Completed Successfully!{COLOR_RESET}")
|
||||
|
||||
@@ -200,40 +200,55 @@ class WineUtils:
|
||||
@staticmethod
|
||||
def _get_sd_card_mounts():
|
||||
"""
|
||||
Dynamically detect all current SD card mount points
|
||||
Returns list of mount point paths
|
||||
Detect SD card mount points using df.
|
||||
Returns list of actual mount paths from /run/media (e.g., /run/media/deck/MicroSD).
|
||||
"""
|
||||
try:
|
||||
import subprocess
|
||||
result = subprocess.run(['df', '-h'], capture_output=True, text=True, timeout=5)
|
||||
sd_mounts = []
|
||||
for line in result.stdout.split('\n'):
|
||||
# Look for common SD card mount patterns
|
||||
if '/run/media' in line or ('/mnt' in line and 'sdcard' in line.lower()):
|
||||
parts = line.split()
|
||||
if len(parts) >= 6: # df output has 6+ columns
|
||||
mount_point = parts[-1] # Last column is mount point
|
||||
if mount_point.startswith(('/run/media', '/mnt')):
|
||||
sd_mounts.append(mount_point)
|
||||
return sd_mounts
|
||||
except Exception:
|
||||
# Fallback to common patterns if df fails
|
||||
return ['/run/media/mmcblk0p1', '/run/media/deck']
|
||||
import subprocess
|
||||
import re
|
||||
|
||||
result = subprocess.run(['df', '-h'], capture_output=True, text=True, timeout=5)
|
||||
sd_mounts = []
|
||||
|
||||
for line in result.stdout.split('\n'):
|
||||
if '/run/media' in line:
|
||||
parts = line.split()
|
||||
if len(parts) >= 6:
|
||||
mount_point = parts[-1] # Last column is the mount point
|
||||
if mount_point.startswith('/run/media/'):
|
||||
sd_mounts.append(mount_point)
|
||||
|
||||
# Sort by length (longest first) to match most specific paths first
|
||||
sd_mounts.sort(key=len, reverse=True)
|
||||
logger.debug(f"Detected SD card mounts from df: {sd_mounts}")
|
||||
return sd_mounts
|
||||
|
||||
@staticmethod
|
||||
def _strip_sdcard_path(path):
|
||||
"""
|
||||
Strip any detected SD card mount prefix from paths
|
||||
Handles both /run/media/mmcblk0p1 and /run/media/deck/UUID patterns
|
||||
Strip SD card mount prefix from path.
|
||||
Handles both /run/media/mmcblk0p1 and /run/media/deck/UUID patterns.
|
||||
Pattern: /run/media/deck/UUID/Games/... becomes /Games/...
|
||||
Pattern: /run/media/mmcblk0p1/Games/... becomes /Games/...
|
||||
"""
|
||||
sd_mounts = WineUtils._get_sd_card_mounts()
|
||||
import re
|
||||
|
||||
for mount in sd_mounts:
|
||||
if path.startswith(mount):
|
||||
# Strip the mount prefix and ensure proper leading slash
|
||||
relative_path = path[len(mount):].lstrip('/')
|
||||
return "/" + relative_path if relative_path else "/"
|
||||
# Pattern 1: /run/media/deck/UUID/... strip everything up to and including UUID
|
||||
# This matches the bash: "${path#*/run/media/deck/*/*}"
|
||||
deck_pattern = r'^/run/media/deck/[^/]+(/.*)?$'
|
||||
match = re.match(deck_pattern, path)
|
||||
if match:
|
||||
stripped = match.group(1) if match.group(1) else "/"
|
||||
logger.debug(f"Stripped SD card path (deck pattern): {path} -> {stripped}")
|
||||
return stripped
|
||||
|
||||
# Pattern 2: /run/media/mmcblk0p1/... strip /run/media/mmcblk0p1
|
||||
# This matches the bash: "${path#*mmcblk0p1}"
|
||||
if path.startswith('/run/media/mmcblk0p1/'):
|
||||
stripped = path.replace('/run/media/mmcblk0p1', '', 1)
|
||||
logger.debug(f"Stripped SD card path (mmcblk pattern): {path} -> {stripped}")
|
||||
return stripped
|
||||
|
||||
# No SD card pattern matched
|
||||
return path
|
||||
|
||||
@staticmethod
|
||||
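A small illustration of the two stripping patterns the docstring above describes; the mount and modlist names are invented.

```python
import re

# Deck/UUID pattern described in the docstring: /run/media/deck/UUID/... -> /...
deck_pattern = r'^/run/media/deck/[^/]+(/.*)?$'

samples = [
    "/run/media/deck/MicroSD/Games/ModlistA",   # deck/UUID pattern
    "/run/media/mmcblk0p1/Games/ModlistB",      # mmcblk0p1 pattern
    "/home/user/Games/ModlistC",                # not on an SD card
]

for path in samples:
    match = re.match(deck_pattern, path)
    if match:
        stripped = match.group(1) or "/"
    elif path.startswith("/run/media/mmcblk0p1/"):
        stripped = path.replace("/run/media/mmcblk0p1", "", 1)
    else:
        stripped = path
    print(f"{path} -> {stripped}")
```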
@@ -668,7 +683,10 @@ class WineUtils:
         # Add standard compatibility tool locations (covers edge cases like Flatpak)
         compatibility_paths.extend([
             Path.home() / ".steam/root/compatibilitytools.d",
-            Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d"
+            Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d",
+            # Flatpak GE-Proton extension paths
+            Path.home() / ".var/app/com.valvesoftware.Steam.CompatibilityTool.Proton-GE/.local/share/Steam/compatibilitytools.d",
+            Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d/GE-Proton"
         ])
         # Special handling for Proton 9: try all possible directory names
         if proton_version.strip().startswith("Proton 9"):
|
||||
@@ -822,7 +840,12 @@ class WineUtils:
         """
         compat_paths = [
             Path.home() / ".steam/steam/compatibilitytools.d",
-            Path.home() / ".local/share/Steam/compatibilitytools.d"
+            Path.home() / ".local/share/Steam/compatibilitytools.d",
+            Path.home() / ".steam/root/compatibilitytools.d",
+            Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d",
+            # Flatpak GE-Proton extension paths
+            Path.home() / ".var/app/com.valvesoftware.Steam.CompatibilityTool.Proton-GE/.local/share/Steam/compatibilitytools.d",
+            Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d/GE-Proton"
         ]
 
         # Return only existing paths
|
||||
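The trailing comment says only existing paths are returned; a minimal sketch of that filtering step (the hunk cuts off before the real implementation, so this is an assumption about its shape).

```python
from pathlib import Path

compat_paths = [
    Path.home() / ".steam/steam/compatibilitytools.d",
    Path.home() / ".local/share/Steam/compatibilitytools.d",
    Path.home() / ".steam/root/compatibilitytools.d",
    Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/compatibilitytools.d",
]

# Keep only directories that actually exist on this machine.
existing = [p for p in compat_paths if p.is_dir()]
print(existing)
```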
@@ -993,8 +1016,8 @@ class WineUtils:
             seen_names.add(version['name'])
 
         if unique_versions:
-            logger.info(f"Found {len(unique_versions)} total Proton version(s)")
-            logger.info(f"Best available: {unique_versions[0]['name']} ({unique_versions[0]['type']})")
+            logger.debug(f"Found {len(unique_versions)} total Proton version(s)")
+            logger.debug(f"Best available: {unique_versions[0]['name']} ({unique_versions[0]['type']})")
         else:
             logger.warning("No Proton versions found")
|
||||
|
||||
|
||||
@@ -9,7 +9,7 @@ import os
 import subprocess
 import logging
 from pathlib import Path
-from typing import Optional, List
+from typing import Optional, List, Callable
 
 logger = logging.getLogger(__name__)
|
||||
|
||||
@@ -110,10 +110,16 @@ class WinetricksHandler:
             self.logger.error(f"Error testing winetricks: {e}")
             return False
 
-    def install_wine_components(self, wineprefix: str, game_var: str, specific_components: Optional[List[str]] = None) -> bool:
+    def install_wine_components(self, wineprefix: str, game_var: str, specific_components: Optional[List[str]] = None, status_callback: Optional[Callable[[str], None]] = None) -> bool:
         """
         Install the specified Wine components into the given prefix using winetricks.
         If specific_components is None, use the default set (fontsmooth=rgb, xact, xact_x64, vcrun2022).
+
+        Args:
+            wineprefix: Path to Wine prefix
+            game_var: Game name for logging
+            specific_components: Optional list of specific components to install
+            status_callback: Optional callback function(status_message: str) for progress updates
         """
         if not self.is_available():
             self.logger.error("Winetricks is not available")
|
||||
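Illustrative call shape for the new `status_callback` parameter; the module path and the no-argument `WinetricksHandler()` constructor are assumptions, and the prefix path is made up.

```python
from jackify.backend.handlers.winetricks_handler import WinetricksHandler

# Assumed construction; the real handler may take configuration arguments.
handler = WinetricksHandler()

ok = handler.install_wine_components(
    wineprefix="/home/user/.steam/steam/steamapps/compatdata/123456/pfx",
    game_var="Skyrim Special Edition",
    # Default component set named in the docstring above.
    specific_components=["fontsmooth=rgb", "xact", "xact_x64", "vcrun2022"],
    status_callback=lambda msg: print(f"[winetricks] {msg}"),
)
print("components installed:", ok)
```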
@@ -137,6 +143,8 @@ class WinetricksHandler:
         from ..handlers.wine_utils import WineUtils
 
         config = ConfigHandler()
+        # Use Install Proton for component installation/texture processing
+        # get_proton_path() returns the Install Proton path
         user_proton_path = config.get_proton_path()
 
         # If user selected a specific Proton, try that first
|
||||
@@ -162,21 +170,27 @@ class WinetricksHandler:
|
||||
else:
|
||||
self.logger.warning(f"User-selected Proton no longer exists: {user_proton_path}")
|
||||
|
||||
# Fall back to auto-detection if user selection failed or is 'auto'
|
||||
# Only auto-detect if user explicitly chose 'auto'
|
||||
if not wine_binary:
|
||||
self.logger.info("Falling back to automatic Proton detection")
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
|
||||
self.logger.info(f"Auto-selected Proton: {best_proton['name']} at {best_proton['path']}")
|
||||
else:
|
||||
# Enhanced debugging for Proton detection failure
|
||||
self.logger.error("Auto-detection failed - no Proton versions found")
|
||||
available_versions = WineUtils.scan_all_proton_versions()
|
||||
if available_versions:
|
||||
self.logger.error(f"Available Proton versions: {[v['name'] for v in available_versions]}")
|
||||
if user_proton_path == 'auto':
|
||||
self.logger.info("Auto-detecting Proton (user selected 'auto')")
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
|
||||
self.logger.info(f"Auto-selected Proton: {best_proton['name']} at {best_proton['path']}")
|
||||
else:
|
||||
self.logger.error("No Proton versions detected in standard Steam locations")
|
||||
# Enhanced debugging for Proton detection failure
|
||||
self.logger.error("Auto-detection failed - no Proton versions found")
|
||||
available_versions = WineUtils.scan_all_proton_versions()
|
||||
if available_versions:
|
||||
self.logger.error(f"Available Proton versions: {[v['name'] for v in available_versions]}")
|
||||
else:
|
||||
self.logger.error("No Proton versions detected in standard Steam locations")
|
||||
else:
|
||||
# User selected a specific Proton but validation failed - this is an ERROR
|
||||
self.logger.error(f"Cannot use configured Proton: {user_proton_path}")
|
||||
self.logger.error("Please check Settings and ensure the Proton version still exists")
|
||||
return False
|
||||
|
||||
if not wine_binary:
|
||||
self.logger.error("Cannot run winetricks: No compatible Proton version found")
|
||||
@@ -260,37 +274,43 @@ class WinetricksHandler:
|
||||
|
||||
if not all_components:
|
||||
self.logger.info("No Wine components to install.")
|
||||
if status_callback:
|
||||
status_callback("No Wine components to install")
|
||||
return True
|
||||
|
||||
# Reorder components for proper installation sequence
|
||||
components_to_install = self._reorder_components_for_installation(all_components)
|
||||
self.logger.info(f"WINEPREFIX: {wineprefix}, Game: {game_var}, Ordered Components: {components_to_install}")
|
||||
|
||||
# Show status with component list
|
||||
if status_callback:
|
||||
components_list = ', '.join(components_to_install)
|
||||
status_callback(f"Installing Wine components: {components_list}")
|
||||
|
||||
# Check user preference for component installation method
|
||||
from ..handlers.config_handler import ConfigHandler
|
||||
config_handler = ConfigHandler()
|
||||
use_winetricks = config_handler.get('use_winetricks_for_components', True)
|
||||
|
||||
# Get component installation method with migration
|
||||
method = config_handler.get('component_installation_method', 'winetricks')
|
||||
|
||||
# Legacy .NET Framework versions that are problematic in Wine/Proton
|
||||
# DISABLED in v0.1.6.2: Universal registry fixes replace dotnet4.x installation
|
||||
# legacy_dotnet_versions = ['dotnet40', 'dotnet472', 'dotnet48']
|
||||
legacy_dotnet_versions = [] # ALL dotnet4.x versions disabled - universal registry fixes handle compatibility
|
||||
# Migrate bundled_protontricks to system_protontricks (no longer supported)
|
||||
if method == 'bundled_protontricks':
|
||||
self.logger.warning("Bundled protontricks no longer supported, migrating to system_protontricks")
|
||||
method = 'system_protontricks'
|
||||
config_handler.set('component_installation_method', 'system_protontricks')
|
||||
|
||||
# Check if any legacy .NET Framework versions are present
|
||||
has_legacy_dotnet = any(comp in components_to_install for comp in legacy_dotnet_versions)
|
||||
# Choose installation method based on user preference
|
||||
if method == 'system_protontricks':
|
||||
self.logger.info("Using system protontricks for all components")
|
||||
return self._install_components_protontricks_only(components_to_install, wineprefix, game_var, status_callback)
|
||||
# else: method == 'winetricks' (default behavior continues below)
|
||||
|
||||
# Choose installation method based on user preference and components
|
||||
# HYBRID APPROACH MOSTLY DISABLED: dotnet40/dotnet472 replaced with universal registry fixes
|
||||
if has_legacy_dotnet:
|
||||
legacy_found = [comp for comp in legacy_dotnet_versions if comp in components_to_install]
|
||||
self.logger.info(f"Using hybrid approach: protontricks for legacy .NET versions {legacy_found} (reliable), {'winetricks' if use_winetricks else 'protontricks'} for other components")
|
||||
return self._install_components_hybrid_approach(components_to_install, wineprefix, game_var, use_winetricks)
|
||||
elif not use_winetricks:
|
||||
self.logger.info("Using legacy approach: protontricks for all components")
|
||||
return self._install_components_protontricks_only(components_to_install, wineprefix, game_var)
|
||||
|
||||
# For non-dotnet40 installations, install all components together (faster)
|
||||
# Install all components together with winetricks (faster)
|
||||
max_attempts = 3
|
||||
winetricks_failed = False
|
||||
last_error_details = None
|
||||
|
||||
for attempt in range(1, max_attempts + 1):
|
||||
if attempt > 1:
|
||||
self.logger.warning(f"Retrying component installation (attempt {attempt}/{max_attempts})...")
|
||||
@@ -301,9 +321,40 @@ class WinetricksHandler:
|
||||
cmd = [self.winetricks_path, '--unattended'] + components_to_install
|
||||
|
||||
self.logger.debug(f"Running: {' '.join(cmd)}")
|
||||
self.logger.debug(f"Environment WINE={env.get('WINE', 'NOT SET')}")
|
||||
self.logger.debug(f"Environment DISPLAY={env.get('DISPLAY', 'NOT SET')}")
|
||||
self.logger.debug(f"Environment WINEPREFIX={env.get('WINEPREFIX', 'NOT SET')}")
|
||||
|
||||
# Enhanced diagnostics for bundled winetricks
|
||||
self.logger.debug("=== Winetricks Environment Diagnostics ===")
|
||||
self.logger.debug(f"Bundled winetricks path: {self.winetricks_path}")
|
||||
self.logger.debug(f"Winetricks exists: {os.path.exists(self.winetricks_path)}")
|
||||
self.logger.debug(f"Winetricks executable: {os.access(self.winetricks_path, os.X_OK)}")
|
||||
if os.path.exists(self.winetricks_path):
|
||||
try:
|
||||
winetricks_stat = os.stat(self.winetricks_path)
|
||||
self.logger.debug(f"Winetricks permissions: {oct(winetricks_stat.st_mode)}")
|
||||
self.logger.debug(f"Winetricks size: {winetricks_stat.st_size} bytes")
|
||||
except Exception as stat_err:
|
||||
self.logger.debug(f"Could not stat winetricks: {stat_err}")
|
||||
|
||||
self.logger.debug(f"WINE binary: {env.get('WINE', 'NOT SET')}")
|
||||
wine_binary = env.get('WINE', '')
|
||||
if wine_binary and os.path.exists(wine_binary):
|
||||
self.logger.debug(f"WINE binary exists: True")
|
||||
else:
|
||||
self.logger.debug(f"WINE binary exists: False")
|
||||
|
||||
self.logger.debug(f"WINEPREFIX: {env.get('WINEPREFIX', 'NOT SET')}")
|
||||
wineprefix = env.get('WINEPREFIX', '')
|
||||
if wineprefix and os.path.exists(wineprefix):
|
||||
self.logger.debug(f"WINEPREFIX exists: True")
|
||||
self.logger.debug(f"WINEPREFIX/pfx exists: {os.path.exists(os.path.join(wineprefix, 'pfx'))}")
|
||||
else:
|
||||
self.logger.debug(f"WINEPREFIX exists: False")
|
||||
|
||||
self.logger.debug(f"DISPLAY: {env.get('DISPLAY', 'NOT SET')}")
|
||||
self.logger.debug(f"WINETRICKS_CACHE: {env.get('WINETRICKS_CACHE', 'NOT SET')}")
|
||||
self.logger.debug(f"Components to install: {components_to_install}")
|
||||
self.logger.debug("==========================================")
|
||||
|
||||
result = subprocess.run(
|
||||
cmd,
|
||||
env=env,
|
||||
@@ -315,137 +366,155 @@ class WinetricksHandler:
|
||||
|
||||
self.logger.debug(f"Winetricks output: {result.stdout}")
|
||||
if result.returncode == 0:
|
||||
self.logger.info("Wine Component installation command completed successfully.")
|
||||
# Set Windows 10 mode after component installation (matches legacy script timing)
|
||||
self._set_windows_10_mode(wineprefix, env.get('WINE', ''))
|
||||
return True
|
||||
else:
|
||||
# Special handling for dotnet40 verification issue (mimics protontricks behavior)
|
||||
if "dotnet40" in components_to_install and "ngen.exe not found" in result.stderr:
|
||||
self.logger.warning("dotnet40 verification warning (common in Steam Proton prefixes)")
|
||||
self.logger.info("Checking if dotnet40 was actually installed...")
|
||||
self.logger.info("Wine Component installation command completed.")
|
||||
|
||||
# Check if dotnet40 appears in winetricks.log (indicates successful installation)
|
||||
log_path = os.path.join(wineprefix, 'winetricks.log')
|
||||
if os.path.exists(log_path):
|
||||
try:
|
||||
with open(log_path, 'r') as f:
|
||||
log_content = f.read()
|
||||
if 'dotnet40' in log_content:
|
||||
self.logger.info("dotnet40 found in winetricks.log - installation succeeded despite verification warning")
|
||||
return True
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Could not read winetricks.log: {e}")
|
||||
# Verify components were actually installed
|
||||
if self._verify_components_installed(wineprefix, components_to_install, env):
|
||||
self.logger.info("Component verification successful - all components installed correctly.")
|
||||
components_list = ', '.join(components_to_install)
|
||||
if status_callback:
|
||||
status_callback(f"Wine components installed and verified: {components_list}")
|
||||
# Set Windows 10 mode after component installation (matches legacy script timing)
|
||||
self._set_windows_10_mode(wineprefix, env.get('WINE', ''))
|
||||
return True
|
||||
else:
|
||||
self.logger.error(f"Component verification failed (Attempt {attempt}/{max_attempts})")
|
||||
# Continue to retry
|
||||
else:
|
||||
# Store detailed error information for fallback diagnostics
|
||||
last_error_details = {
|
||||
'returncode': result.returncode,
|
||||
'stdout': result.stdout.strip(),
|
||||
'stderr': result.stderr.strip(),
|
||||
'attempt': attempt
|
||||
}
|
||||
|
||||
self.logger.error(f"Winetricks command failed (Attempt {attempt}/{max_attempts}). Return Code: {result.returncode}")
|
||||
self.logger.error(f"Stdout: {result.stdout.strip()}")
|
||||
self.logger.error(f"Stderr: {result.stderr.strip()}")
|
||||
|
||||
# Enhanced error diagnostics with actionable information
|
||||
stderr_lower = result.stderr.lower()
|
||||
stdout_lower = result.stdout.lower()
|
||||
|
||||
if "command not found" in stderr_lower or "no such file" in stderr_lower:
|
||||
self.logger.error("DIAGNOSTIC: Winetricks or dependency binary not found")
|
||||
self.logger.error(" - Bundled winetricks may be missing dependencies")
|
||||
self.logger.error(" - Will attempt protontricks fallback if all attempts fail")
|
||||
elif "permission denied" in stderr_lower:
|
||||
self.logger.error("DIAGNOSTIC: Permission issue detected")
|
||||
self.logger.error(f" - Check permissions on: {self.winetricks_path}")
|
||||
self.logger.error(f" - Check permissions on WINEPREFIX: {env.get('WINEPREFIX', 'N/A')}")
|
||||
elif "timeout" in stderr_lower:
|
||||
self.logger.error("DIAGNOSTIC: Timeout issue detected during component download/install")
|
||||
elif "sha256sum mismatch" in stderr_lower or "sha256sum" in stdout_lower:
|
||||
self.logger.error("DIAGNOSTIC: Checksum verification failed")
|
||||
self.logger.error(" - Component download may be corrupted")
|
||||
self.logger.error(" - Network issue or upstream file change")
|
||||
elif "curl" in stderr_lower or "wget" in stderr_lower:
|
||||
self.logger.error("DIAGNOSTIC: Download tool (curl/wget) issue")
|
||||
self.logger.error(" - Network connectivity problem or missing download tool")
|
||||
elif "cabextract" in stderr_lower:
|
||||
self.logger.error("DIAGNOSTIC: cabextract missing or failed")
|
||||
self.logger.error(" - Required for extracting Windows cabinet files")
|
||||
elif "unzip" in stderr_lower:
|
||||
self.logger.error("DIAGNOSTIC: unzip missing or failed")
|
||||
self.logger.error(" - Required for extracting zip archives")
|
||||
else:
|
||||
self.logger.error("DIAGNOSTIC: Unknown winetricks failure")
|
||||
self.logger.error(" - Check full logs for details")
|
||||
self.logger.error(" - Will attempt protontricks fallback if all attempts fail")
|
||||
|
||||
winetricks_failed = True
|
||||
|
||||
except subprocess.TimeoutExpired as e:
|
||||
self.logger.error(f"Winetricks timed out (Attempt {attempt}/{max_attempts}): {e}")
|
||||
last_error_details = {'error': 'timeout', 'attempt': attempt}
|
||||
winetricks_failed = True
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error during winetricks run (Attempt {attempt}/{max_attempts}): {e}", exc_info=True)
|
||||
last_error_details = {'error': str(e), 'attempt': attempt}
|
||||
winetricks_failed = True
|
||||
|
||||
# All winetricks attempts failed - try automatic fallback to protontricks
|
||||
if winetricks_failed:
|
||||
self.logger.error(f"Winetricks failed after {max_attempts} attempts.")
|
||||
|
||||
# Network diagnostics before fallback (non-fatal)
|
||||
self.logger.warning("=" * 80)
|
||||
self.logger.warning("NETWORK DIAGNOSTICS: Testing connectivity to component download sources...")
|
||||
try:
|
||||
# Check if curl is available
|
||||
curl_check = subprocess.run(['which', 'curl'], capture_output=True, timeout=5)
|
||||
if curl_check.returncode == 0:
|
||||
# Test Microsoft download servers (used by winetricks for .NET, VC runtimes, DirectX)
|
||||
test_result = subprocess.run(['curl', '-I', '--max-time', '10', 'https://download.microsoft.com'],
|
||||
capture_output=True, text=True, timeout=15)
|
||||
if test_result.returncode == 0:
|
||||
self.logger.warning("Can reach download.microsoft.com")
|
||||
else:
|
||||
self.logger.error("Cannot reach download.microsoft.com - network/DNS issue likely")
|
||||
self.logger.error(f" Curl exit code: {test_result.returncode}")
|
||||
if test_result.stderr:
|
||||
self.logger.error(f" Curl error: {test_result.stderr.strip()}")
|
||||
else:
|
||||
self.logger.warning("curl not available, skipping network diagnostic test")
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Network diagnostic test skipped: {e}")
|
||||
self.logger.warning("=" * 80)
|
||||
|
||||
# Check if protontricks is available for fallback using centralized handler
|
||||
try:
|
||||
from .protontricks_handler import ProtontricksHandler
|
||||
steamdeck = os.path.exists('/home/deck')
|
||||
protontricks_handler = ProtontricksHandler(steamdeck)
|
||||
protontricks_available = protontricks_handler.detect_protontricks()
|
||||
|
||||
if protontricks_available:
|
||||
self.logger.warning("=" * 80)
|
||||
self.logger.warning("AUTOMATIC FALLBACK: Winetricks failed, attempting protontricks fallback...")
|
||||
self.logger.warning(f"Last winetricks error: {last_error_details}")
|
||||
self.logger.warning("=" * 80)
|
||||
|
||||
# Attempt fallback to protontricks
|
||||
fallback_success = self._install_components_protontricks_only(components_to_install, wineprefix, game_var, status_callback)
|
||||
|
||||
if fallback_success:
|
||||
self.logger.info("SUCCESS: Protontricks fallback succeeded where winetricks failed")
|
||||
return True
|
||||
else:
|
||||
self.logger.error("FAILURE: Both winetricks and protontricks fallback failed")
|
||||
return False
|
||||
else:
|
||||
self.logger.error("Protontricks not available for fallback")
|
||||
self.logger.error(f"Final winetricks error details: {last_error_details}")
|
||||
return False
|
||||
except Exception as e:
|
||||
self.logger.error(f"Could not check for protontricks fallback: {e}")
|
||||
return False
|
||||
|
||||
self.logger.error(f"Failed to install Wine components after {max_attempts} attempts.")
|
||||
return False
|
||||
|
||||
def _reorder_components_for_installation(self, components: list) -> list:
|
||||
"""
|
||||
Reorder components for proper installation sequence.
|
||||
Critical: dotnet40 must be installed before dotnet6/dotnet7 to avoid conflicts.
|
||||
Reorder components for proper installation sequence if needed.
|
||||
Currently returns components in original order.
|
||||
"""
|
||||
# Simple reordering: dotnet40 first, then everything else
|
||||
reordered = []
|
||||
|
||||
# Add dotnet40 first if it exists
|
||||
if "dotnet40" in components:
|
||||
reordered.append("dotnet40")
|
||||
|
||||
# Add all other components in original order
|
||||
for component in components:
|
||||
if component != "dotnet40":
|
||||
reordered.append(component)
|
||||
|
||||
if reordered != components:
|
||||
self.logger.info(f"Reordered for dotnet40 compatibility: {reordered}")
|
||||
|
||||
return reordered
|
||||
|
||||
def _prepare_prefix_for_dotnet(self, wineprefix: str, wine_binary: str) -> bool:
|
||||
"""
|
||||
Prepare the Wine prefix for .NET installation by mimicking protontricks preprocessing.
|
||||
This removes mono components and specific symlinks that interfere with .NET installation.
|
||||
"""
|
||||
try:
|
||||
env = os.environ.copy()
|
||||
env['WINEDEBUG'] = '-all'
|
||||
env['WINEPREFIX'] = wineprefix
|
||||
|
||||
# Step 1: Remove mono components (mimics protontricks behavior)
|
||||
self.logger.info("Preparing prefix for .NET installation: removing mono")
|
||||
mono_result = subprocess.run([
|
||||
self.winetricks_path,
|
||||
'-q',
|
||||
'remove_mono'
|
||||
], env=env, capture_output=True, text=True, timeout=300)
|
||||
|
||||
if mono_result.returncode != 0:
|
||||
self.logger.warning(f"Mono removal warning (non-critical): {mono_result.stderr}")
|
||||
|
||||
# Step 2: Set Windows version to XP (protontricks uses winxp for dotnet40)
|
||||
self.logger.info("Setting Windows version to XP for .NET compatibility")
|
||||
winxp_result = subprocess.run([
|
||||
self.winetricks_path,
|
||||
'-q',
|
||||
'winxp'
|
||||
], env=env, capture_output=True, text=True, timeout=300)
|
||||
|
||||
if winxp_result.returncode != 0:
|
||||
self.logger.warning(f"Windows XP setting warning: {winxp_result.stderr}")
|
||||
|
||||
# Step 3: Remove mscoree.dll symlinks (critical for .NET installation)
|
||||
self.logger.info("Removing problematic mscoree.dll symlinks")
|
||||
dosdevices_path = os.path.join(wineprefix, 'dosdevices', 'c:')
|
||||
mscoree_paths = [
|
||||
os.path.join(dosdevices_path, 'windows', 'syswow64', 'mscoree.dll'),
|
||||
os.path.join(dosdevices_path, 'windows', 'system32', 'mscoree.dll')
|
||||
]
|
||||
|
||||
for dll_path in mscoree_paths:
|
||||
if os.path.exists(dll_path) or os.path.islink(dll_path):
|
||||
try:
|
||||
os.remove(dll_path)
|
||||
self.logger.debug(f"Removed symlink: {dll_path}")
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Could not remove {dll_path}: {e}")
|
||||
|
||||
self.logger.info("Prefix preparation complete for .NET installation")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error preparing prefix for .NET: {e}")
|
||||
return False
|
||||
return components
|
||||
|
||||
def _install_components_separately(self, components: list, wineprefix: str, wine_binary: str, base_env: dict) -> bool:
|
||||
"""
|
||||
Install components separately like protontricks does.
|
||||
This is necessary when dotnet40 is present to avoid component conflicts.
|
||||
Install components separately for maximum compatibility.
|
||||
"""
|
||||
self.logger.info(f"Installing {len(components)} components separately (protontricks style)")
|
||||
self.logger.info(f"Installing {len(components)} components separately")
|
||||
|
||||
for i, component in enumerate(components, 1):
|
||||
self.logger.info(f"Installing component {i}/{len(components)}: {component}")
|
||||
|
||||
# Prepare environment for this component
|
||||
env = base_env.copy()
|
||||
|
||||
# Special preprocessing for dotnet40 only
|
||||
if component == "dotnet40":
|
||||
self.logger.info("Applying dotnet40 preprocessing")
|
||||
if not self._prepare_prefix_for_dotnet(wineprefix, wine_binary):
|
||||
self.logger.error("Failed to prepare prefix for dotnet40")
|
||||
return False
|
||||
else:
|
||||
# For non-dotnet40 components, install in standard mode (Windows 10 will be set after all components)
|
||||
self.logger.debug(f"Installing {component} in standard mode")
|
||||
env['WINEPREFIX'] = wineprefix
|
||||
env['WINE'] = wine_binary
|
||||
|
||||
# Install this component
|
||||
max_attempts = 3
|
||||
@@ -458,9 +527,6 @@ class WinetricksHandler:
|
||||
|
||||
try:
|
||||
cmd = [self.winetricks_path, '--unattended', component]
|
||||
env['WINEPREFIX'] = wineprefix
|
||||
env['WINE'] = wine_binary
|
||||
|
||||
self.logger.debug(f"Running: {' '.join(cmd)}")
|
||||
|
||||
result = subprocess.run(
|
||||
@@ -476,22 +542,6 @@ class WinetricksHandler:
|
||||
component_success = True
|
||||
break
|
||||
else:
|
||||
# Special handling for dotnet40 verification issue
|
||||
if component == "dotnet40" and "ngen.exe not found" in result.stderr:
|
||||
self.logger.warning("dotnet40 verification warning (expected in Steam Proton)")
|
||||
|
||||
# Check winetricks.log for actual success
|
||||
log_path = os.path.join(wineprefix, 'winetricks.log')
|
||||
if os.path.exists(log_path):
|
||||
try:
|
||||
with open(log_path, 'r') as f:
|
||||
if 'dotnet40' in f.read():
|
||||
self.logger.info("dotnet40 confirmed in winetricks.log")
|
||||
component_success = True
|
||||
break
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Could not read winetricks.log: {e}")
|
||||
|
||||
self.logger.error(f"{component} failed (attempt {attempt}): {result.stderr.strip()}")
|
||||
self.logger.debug(f"Full stdout for {component}: {result.stdout.strip()}")
|
||||
|
||||
@@ -503,128 +553,10 @@ class WinetricksHandler:
|
||||
return False
|
||||
|
||||
self.logger.info("All components installed successfully using separate sessions")
|
||||
# Set Windows 10 mode after all component installation (matches legacy script timing)
|
||||
# Set Windows 10 mode after all component installation
|
||||
self._set_windows_10_mode(wineprefix, env.get('WINE', ''))
|
||||
return True
|
||||
|
||||
def _install_components_hybrid_approach(self, components: list, wineprefix: str, game_var: str, use_winetricks: bool = True) -> bool:
|
||||
"""
|
||||
Hybrid approach: Install legacy .NET Framework versions with protontricks (reliable),
|
||||
then install remaining components with winetricks OR protontricks based on user preference.
|
||||
|
||||
Args:
|
||||
components: List of all components to install
|
||||
wineprefix: Wine prefix path
|
||||
game_var: Game variable for AppID detection
|
||||
use_winetricks: Whether to use winetricks for non-legacy components
|
||||
|
||||
Returns:
|
||||
bool: True if all installations succeeded, False otherwise
|
||||
"""
|
||||
self.logger.info("Starting hybrid installation approach")
|
||||
|
||||
# Legacy .NET Framework versions that need protontricks
|
||||
legacy_dotnet_versions = ['dotnet40', 'dotnet472', 'dotnet48']
|
||||
|
||||
# Separate legacy .NET (protontricks) from other components (winetricks)
|
||||
protontricks_components = [comp for comp in components if comp in legacy_dotnet_versions]
|
||||
other_components = [comp for comp in components if comp not in legacy_dotnet_versions]
|
||||
|
||||
self.logger.info(f"Protontricks components: {protontricks_components}")
|
||||
self.logger.info(f"Other components: {other_components}")
|
||||
|
||||
# Step 1: Install legacy .NET Framework versions with protontricks if present
|
||||
if protontricks_components:
|
||||
self.logger.info(f"Installing legacy .NET versions {protontricks_components} using protontricks...")
|
||||
if not self._install_legacy_dotnet_with_protontricks(protontricks_components, wineprefix, game_var):
|
||||
self.logger.error(f"Failed to install {protontricks_components} with protontricks")
|
||||
return False
|
||||
self.logger.info(f"{protontricks_components} installation completed successfully with protontricks")
|
||||
|
||||
# Step 2: Install remaining components if any
|
||||
if other_components:
|
||||
if use_winetricks:
|
||||
self.logger.info(f"Installing remaining components with winetricks: {other_components}")
|
||||
# Use existing winetricks logic for other components
|
||||
env = self._prepare_winetricks_environment(wineprefix)
|
||||
if not env:
|
||||
return False
|
||||
return self._install_components_with_winetricks(other_components, wineprefix, env)
|
||||
else:
|
||||
self.logger.info(f"Installing remaining components with protontricks: {other_components}")
|
||||
return self._install_components_protontricks_only(other_components, wineprefix, game_var)
|
||||
|
||||
self.logger.info("Hybrid component installation completed successfully")
|
||||
# Set Windows 10 mode after all component installation (matches legacy script timing)
|
||||
wine_binary = self._get_wine_binary_for_prefix(wineprefix)
|
||||
self._set_windows_10_mode(wineprefix, wine_binary)
|
||||
return True
|
||||
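# Illustrative sketch (not part of the commit): how the legacy/other split in the
# hybrid approach behaves for a hypothetical component list.
LEGACY_DOTNET = ['dotnet40', 'dotnet472', 'dotnet48']
components = ['dotnet40', 'vcrun2022', 'xact', 'fontsmooth=rgb']
protontricks_components = [c for c in components if c in LEGACY_DOTNET]
other_components = [c for c in components if c not in LEGACY_DOTNET]
assert protontricks_components == ['dotnet40']
assert other_components == ['vcrun2022', 'xact', 'fontsmooth=rgb']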
|
||||
def _install_legacy_dotnet_with_protontricks(self, legacy_components: list, wineprefix: str, game_var: str) -> bool:
|
||||
"""
|
||||
Install legacy .NET Framework versions using protontricks (known to work more reliably).
|
||||
|
||||
Args:
|
||||
legacy_components: List of legacy .NET components to install (dotnet40, dotnet472, dotnet48)
|
||||
wineprefix: Wine prefix path
|
||||
game_var: Game variable for AppID detection
|
||||
|
||||
Returns:
|
||||
bool: True if installation succeeded, False otherwise
|
||||
"""
|
||||
try:
|
||||
# Extract AppID from wineprefix path (e.g., /path/to/compatdata/123456789/pfx -> 123456789)
|
||||
appid = None
|
||||
if 'compatdata' in wineprefix:
|
||||
# Standard Steam compatdata structure
|
||||
path_parts = Path(wineprefix).parts
|
||||
for i, part in enumerate(path_parts):
|
||||
if part == 'compatdata' and i + 1 < len(path_parts):
|
||||
potential_appid = path_parts[i + 1]
|
||||
if potential_appid.isdigit():
|
||||
appid = potential_appid
|
||||
break
|
||||
|
||||
if not appid:
|
||||
self.logger.error(f"Could not extract AppID from wineprefix path: {wineprefix}")
|
||||
return False
|
||||
|
||||
self.logger.info(f"Using AppID {appid} for protontricks dotnet40 installation")
|
||||
|
||||
# Import and use protontricks handler
|
||||
from .protontricks_handler import ProtontricksHandler
|
||||
|
||||
# Determine if we're on Steam Deck (for protontricks handler)
|
||||
steamdeck = os.path.exists('/home/deck')
|
||||
|
||||
protontricks_handler = ProtontricksHandler(steamdeck, logger=self.logger)
|
||||
|
||||
# Detect protontricks availability
|
||||
if not protontricks_handler.detect_protontricks():
|
||||
self.logger.error(f"Protontricks not available for legacy .NET installation: {legacy_components}")
|
||||
return False
|
||||
|
||||
# Install legacy .NET components using protontricks
|
||||
success = protontricks_handler.install_wine_components(appid, game_var, legacy_components)
|
||||
|
||||
if success:
|
||||
self.logger.info(f"Legacy .NET components {legacy_components} installed successfully with protontricks")
|
||||
|
||||
# Enable dotfiles and symlinks for the prefix
|
||||
if protontricks_handler.enable_dotfiles(appid):
|
||||
self.logger.info("Enabled dotfiles and symlinks support")
|
||||
else:
|
||||
self.logger.warning("Failed to enable dotfiles/symlinks (non-critical)")
|
||||
|
||||
return True
|
||||
else:
|
||||
self.logger.error(f"Legacy .NET components {legacy_components} installation failed with protontricks")
|
||||
return False
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error installing legacy .NET components {legacy_components} with protontricks: {e}", exc_info=True)
|
||||
return False
|
||||
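# Illustrative sketch (not part of the commit): the compatdata AppID extraction used
# above, factored into a standalone helper. The example path is hypothetical.
from pathlib import Path
from typing import Optional

def appid_from_prefix(wineprefix: str) -> Optional[str]:
    parts = Path(wineprefix).parts
    for i, part in enumerate(parts):
        if part == 'compatdata' and i + 1 < len(parts) and parts[i + 1].isdigit():
            return parts[i + 1]
    return None

assert appid_from_prefix('/home/user/.steam/steam/steamapps/compatdata/3106560878/pfx') == '3106560878'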
|
||||
def _prepare_winetricks_environment(self, wineprefix: str) -> Optional[dict]:
|
||||
"""
|
||||
Prepare the environment for winetricks installation.
|
||||
@@ -662,9 +594,15 @@ class WinetricksHandler:
|
||||
wine_binary = ge_proton_wine
|
||||
|
||||
if not wine_binary:
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
|
||||
if user_proton_path == 'auto':
|
||||
self.logger.info("Auto-detecting Proton (user selected 'auto')")
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
|
||||
else:
|
||||
# User selected a specific Proton but validation failed
|
||||
self.logger.error(f"Cannot prepare winetricks environment: configured Proton not found: {user_proton_path}")
|
||||
return None
|
||||
|
||||
if not wine_binary or not (os.path.exists(wine_binary) and os.access(wine_binary, os.X_OK)):
|
||||
self.logger.error(f"Cannot prepare winetricks environment: No compatible Proton found")
|
||||
@@ -732,11 +670,18 @@ class WinetricksHandler:
|
||||
)
|
||||
|
||||
if result.returncode == 0:
|
||||
self.logger.info(f"Winetricks components installed successfully: {components}")
|
||||
# Set Windows 10 mode after component installation (matches legacy script timing)
|
||||
wine_binary = env.get('WINE', '')
|
||||
self._set_windows_10_mode(env.get('WINEPREFIX', ''), wine_binary)
|
||||
return True
|
||||
self.logger.info(f"Winetricks components installation command completed.")
|
||||
|
||||
# Verify components were actually installed
|
||||
if self._verify_components_installed(wineprefix, components, env):
|
||||
self.logger.info("Component verification successful - all components installed correctly.")
|
||||
# Set Windows 10 mode after component installation (matches legacy script timing)
|
||||
wine_binary = env.get('WINE', '')
|
||||
self._set_windows_10_mode(env.get('WINEPREFIX', ''), wine_binary)
|
||||
return True
|
||||
else:
|
||||
self.logger.error(f"Component verification failed (attempt {attempt})")
|
||||
# Continue to retry
|
||||
else:
|
||||
self.logger.error(f"Winetricks failed (attempt {attempt}): {result.stderr.strip()}")
|
||||
|
||||
@@ -769,13 +714,18 @@ class WinetricksHandler:
|
||||
except Exception as e:
|
||||
self.logger.warning(f"Error setting Windows 10 mode: {e}")
|
||||
|
||||
def _install_components_protontricks_only(self, components: list, wineprefix: str, game_var: str) -> bool:
|
||||
def _install_components_protontricks_only(self, components: list, wineprefix: str, game_var: str, status_callback: Optional[Callable[[str], None]] = None) -> bool:
|
||||
"""
|
||||
Legacy approach: Install all components using protontricks only.
|
||||
Install all components using protontricks only.
|
||||
This matches the behavior of the original bash script.
|
||||
|
||||
Args:
|
||||
components: List of components to install
|
||||
wineprefix: Path to wine prefix
|
||||
game_var: Game variable name
|
||||
"""
|
||||
try:
|
||||
self.logger.info(f"Installing all components with protontricks (legacy method): {components}")
|
||||
self.logger.info(f"Installing all components with system protontricks: {components}")
|
||||
|
||||
# Import protontricks handler
|
||||
from ..handlers.protontricks_handler import ProtontricksHandler
|
||||
@@ -798,6 +748,9 @@ class WinetricksHandler:
|
||||
return False
|
||||
|
||||
# Install all components using protontricks
|
||||
components_list = ', '.join(components)
|
||||
if status_callback:
|
||||
status_callback(f"Installing Wine components via protontricks: {components_list}")
|
||||
success = protontricks_handler.install_wine_components(appid, game_var, components)
|
||||
|
||||
if success:
|
||||
@@ -869,17 +822,87 @@ class WinetricksHandler:
|
||||
elif os.path.exists(ge_proton_wine):
|
||||
wine_binary = ge_proton_wine
|
||||
|
||||
# Fall back to auto-detection if user selection failed or is 'auto'
|
||||
# Only auto-detect if user explicitly chose 'auto'
|
||||
if not wine_binary:
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
|
||||
if user_proton_path == 'auto':
|
||||
self.logger.info("Auto-detecting Proton (user selected 'auto')")
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
wine_binary = WineUtils.find_proton_binary(best_proton['name'])
|
||||
else:
|
||||
# User selected a specific Proton but validation failed
|
||||
self.logger.error(f"Configured Proton not found: {user_proton_path}")
|
||||
return ""
|
||||
|
||||
return wine_binary if wine_binary else ""
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error getting wine binary for prefix: {e}")
|
||||
return ""
|
||||
|
||||
def _verify_components_installed(self, wineprefix: str, components: List[str], env: dict) -> bool:
|
||||
"""
|
||||
Verify that Wine components were actually installed by checking winetricks.log.
|
||||
|
||||
Args:
|
||||
wineprefix: Wine prefix path
|
||||
components: List of components that should be installed
|
||||
env: Environment variables (includes WINE path)
|
||||
|
||||
Returns:
|
||||
bool: True if all critical components are verified, False otherwise
|
||||
"""
|
||||
try:
|
||||
self.logger.info("Verifying installed components...")
|
||||
|
||||
# Check winetricks.log file for installed components
|
||||
winetricks_log = os.path.join(wineprefix, 'winetricks.log')
|
||||
|
||||
if not os.path.exists(winetricks_log):
|
||||
self.logger.error(f"winetricks.log not found at {winetricks_log}")
|
||||
return False
|
||||
|
||||
try:
|
||||
with open(winetricks_log, 'r', encoding='utf-8', errors='ignore') as f:
|
||||
log_content = f.read().lower()
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to read winetricks.log: {e}")
|
||||
return False
|
||||
|
||||
self.logger.debug(f"winetricks.log length: {len(log_content)} bytes")
|
||||
|
||||
# Define critical components that MUST be installed
|
||||
critical_components = ["vcrun2022", "xact"]
|
||||
|
||||
# Check for critical components
|
||||
missing_critical = []
|
||||
for component in critical_components:
|
||||
if component.lower() not in log_content:
|
||||
missing_critical.append(component)
|
||||
|
||||
if missing_critical:
|
||||
self.logger.error(f"CRITICAL: Missing essential components: {missing_critical}")
|
||||
self.logger.error("Installation reported success but components are NOT in winetricks.log")
|
||||
return False
|
||||
|
||||
# Check for requested components (warn but don't fail)
|
||||
missing_requested = []
|
||||
for component in components:
|
||||
# Handle settings like fontsmooth=rgb (just check the base component name)
|
||||
base_component = component.split('=')[0].lower()
|
||||
if base_component not in log_content and component.lower() not in log_content:
|
||||
missing_requested.append(component)
|
||||
|
||||
if missing_requested:
|
||||
self.logger.warning(f"Some requested components may not be installed: {missing_requested}")
|
||||
self.logger.warning("This may cause issues, but critical components are present")
|
||||
|
||||
self.logger.info(f"Verification passed - critical components confirmed: {critical_components}")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
self.logger.error(f"Error verifying components: {e}", exc_info=True)
|
||||
return False
|
||||
|
||||
def _cleanup_wine_processes(self):
|
||||
"""
|
||||
Internal method to clean up wine processes during component installation
|
||||
|
||||
@@ -68,7 +68,9 @@ class SystemInfo:
    steam_root: Optional[Path] = None
    steam_user_id: Optional[str] = None
    proton_version: Optional[str] = None

    is_flatpak_steam: bool = False
    is_native_steam: bool = False

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary."""
        return {
@@ -76,4 +78,6 @@
            'steam_root': str(self.steam_root) if self.steam_root else None,
            'steam_user_id': self.steam_user_id,
            'proton_version': self.proton_version,
            'is_flatpak_steam': self.is_flatpak_steam,
            'is_native_steam': self.is_native_steam,
        }
|
||||
216  jackify/backend/models/modlist_metadata.py  (new file)
@@ -0,0 +1,216 @@
|
||||
"""
|
||||
Data models for modlist metadata from jackify-engine JSON output.
|
||||
|
||||
These models match the JSON schema documented in MODLIST_METADATA_IMPLEMENTATION.md
|
||||
"""
|
||||
from dataclasses import dataclass, field
|
||||
from typing import List, Optional
|
||||
from datetime import datetime
|
||||
|
||||
|
||||
@dataclass
|
||||
class ModlistImages:
|
||||
"""Image URLs for modlist (small thumbnail and large banner)"""
|
||||
small: str
|
||||
large: str
|
||||
|
||||
|
||||
@dataclass
|
||||
class ModlistLinks:
|
||||
"""External links associated with the modlist"""
|
||||
image: Optional[str] = None
|
||||
readme: Optional[str] = None
|
||||
download: Optional[str] = None
|
||||
discordURL: Optional[str] = None
|
||||
websiteURL: Optional[str] = None
|
||||
|
||||
|
||||
@dataclass
|
||||
class ModlistSizes:
|
||||
"""Size information for modlist downloads and installation"""
|
||||
downloadSize: int
|
||||
downloadSizeFormatted: str
|
||||
installSize: int
|
||||
installSizeFormatted: str
|
||||
totalSize: int
|
||||
totalSizeFormatted: str
|
||||
numberOfArchives: int
|
||||
numberOfInstalledFiles: int
|
||||
|
||||
|
||||
@dataclass
|
||||
class ModlistValidation:
|
||||
"""Validation status from Wabbajack build server (optional)"""
|
||||
failed: int = 0
|
||||
passed: int = 0
|
||||
updating: int = 0
|
||||
mirrored: int = 0
|
||||
modListIsMissing: bool = False
|
||||
hasFailures: bool = False
|
||||
|
||||
|
||||
@dataclass
|
||||
class ModlistMetadata:
|
||||
"""Complete modlist metadata from jackify-engine"""
|
||||
# Basic information
|
||||
title: str
|
||||
description: str
|
||||
author: str
|
||||
maintainers: List[str]
|
||||
namespacedName: str
|
||||
repositoryName: str
|
||||
machineURL: str
|
||||
|
||||
# Game information
|
||||
game: str
|
||||
gameHumanFriendly: str
|
||||
|
||||
# Status flags
|
||||
official: bool
|
||||
nsfw: bool
|
||||
utilityList: bool
|
||||
forceDown: bool
|
||||
imageContainsTitle: bool
|
||||
|
||||
# Version information
|
||||
version: Optional[str] = None
|
||||
displayVersionOnlyInInstallerView: bool = False
|
||||
|
||||
# Dates
|
||||
dateCreated: Optional[str] = None # ISO8601 format
|
||||
dateUpdated: Optional[str] = None # ISO8601 format
|
||||
|
||||
# Categorization
|
||||
tags: List[str] = field(default_factory=list)
|
||||
|
||||
# Nested objects
|
||||
links: Optional[ModlistLinks] = None
|
||||
sizes: Optional[ModlistSizes] = None
|
||||
images: Optional[ModlistImages] = None
|
||||
|
||||
# Optional data (only if flags specified)
|
||||
validation: Optional[ModlistValidation] = None
|
||||
mods: List[str] = field(default_factory=list)
|
||||
|
||||
def is_available(self) -> bool:
|
||||
"""Check if modlist is available for installation"""
|
||||
if self.forceDown:
|
||||
return False
|
||||
if self.validation and self.validation.hasFailures:
|
||||
return False
|
||||
return True
|
||||
|
||||
def is_broken(self) -> bool:
|
||||
"""Check if modlist has validation failures"""
|
||||
return self.validation.hasFailures if self.validation else False
|
||||
|
||||
def get_date_updated_datetime(self) -> Optional[datetime]:
|
||||
"""Parse dateUpdated string to datetime object"""
|
||||
if not self.dateUpdated:
|
||||
return None
|
||||
try:
|
||||
return datetime.fromisoformat(self.dateUpdated.replace('Z', '+00:00'))
|
||||
except (ValueError, AttributeError):
|
||||
return None
|
||||
|
||||
def get_date_created_datetime(self) -> Optional[datetime]:
|
||||
"""Parse dateCreated string to datetime object"""
|
||||
if not self.dateCreated:
|
||||
return None
|
||||
try:
|
||||
return datetime.fromisoformat(self.dateCreated.replace('Z', '+00:00'))
|
||||
except (ValueError, AttributeError):
|
||||
return None
|
||||
|
||||
|
||||
@dataclass
|
||||
class ModlistMetadataResponse:
|
||||
"""Root response object from jackify-engine list-modlists --json"""
|
||||
metadataVersion: str
|
||||
timestamp: str # ISO8601 format
|
||||
count: int
|
||||
modlists: List[ModlistMetadata]
|
||||
|
||||
def get_timestamp_datetime(self) -> Optional[datetime]:
|
||||
"""Parse timestamp string to datetime object"""
|
||||
try:
|
||||
return datetime.fromisoformat(self.timestamp.replace('Z', '+00:00'))
|
||||
except (ValueError, AttributeError):
|
||||
return None
|
||||
|
||||
def filter_by_game(self, game: str) -> List[ModlistMetadata]:
|
||||
"""Filter modlists by game name"""
|
||||
return [m for m in self.modlists if m.game.lower() == game.lower()]
|
||||
|
||||
def filter_available_only(self) -> List[ModlistMetadata]:
|
||||
"""Filter to only available (non-broken, non-forced-down) modlists"""
|
||||
return [m for m in self.modlists if m.is_available()]
|
||||
|
||||
def filter_by_tag(self, tag: str) -> List[ModlistMetadata]:
|
||||
"""Filter modlists by tag"""
|
||||
return [m for m in self.modlists if tag.lower() in [t.lower() for t in m.tags]]
|
||||
|
||||
def filter_official_only(self) -> List[ModlistMetadata]:
|
||||
"""Filter to only official modlists"""
|
||||
return [m for m in self.modlists if m.official]
|
||||
|
||||
def search(self, query: str) -> List[ModlistMetadata]:
|
||||
"""Search modlists by title, description, or author"""
|
||||
query_lower = query.lower()
|
||||
return [
|
||||
m for m in self.modlists
|
||||
if query_lower in m.title.lower()
|
||||
or query_lower in m.description.lower()
|
||||
or query_lower in m.author.lower()
|
||||
]
|
||||
|
||||
|
||||
def parse_modlist_metadata_from_dict(data: dict) -> ModlistMetadata:
|
||||
"""Parse a modlist metadata dictionary into ModlistMetadata object"""
|
||||
# Parse nested objects
|
||||
images = ModlistImages(**data['images']) if 'images' in data and data['images'] else None
|
||||
links = ModlistLinks(**data['links']) if 'links' in data and data['links'] else None
|
||||
sizes = ModlistSizes(**data['sizes']) if 'sizes' in data and data['sizes'] else None
|
||||
validation = ModlistValidation(**data['validation']) if 'validation' in data and data['validation'] else None
|
||||
|
||||
# Create ModlistMetadata with nested objects
|
||||
metadata = ModlistMetadata(
|
||||
title=data['title'],
|
||||
description=data['description'],
|
||||
author=data['author'],
|
||||
maintainers=data.get('maintainers', []),
|
||||
namespacedName=data['namespacedName'],
|
||||
repositoryName=data['repositoryName'],
|
||||
machineURL=data['machineURL'],
|
||||
game=data['game'],
|
||||
gameHumanFriendly=data['gameHumanFriendly'],
|
||||
official=data['official'],
|
||||
nsfw=data['nsfw'],
|
||||
utilityList=data['utilityList'],
|
||||
forceDown=data['forceDown'],
|
||||
imageContainsTitle=data['imageContainsTitle'],
|
||||
version=data.get('version'),
|
||||
displayVersionOnlyInInstallerView=data.get('displayVersionOnlyInInstallerView', False),
|
||||
dateCreated=data.get('dateCreated'),
|
||||
dateUpdated=data.get('dateUpdated'),
|
||||
tags=data.get('tags', []),
|
||||
links=links,
|
||||
sizes=sizes,
|
||||
images=images,
|
||||
validation=validation,
|
||||
mods=data.get('mods', [])
|
||||
)
|
||||
|
||||
return metadata
|
||||
|
||||
|
||||
def parse_modlist_metadata_response(data: dict) -> ModlistMetadataResponse:
|
||||
"""Parse the full JSON response from jackify-engine into ModlistMetadataResponse"""
|
||||
modlists = [parse_modlist_metadata_from_dict(m) for m in data.get('modlists', [])]
|
||||
|
||||
return ModlistMetadataResponse(
|
||||
metadataVersion=data['metadataVersion'],
|
||||
timestamp=data['timestamp'],
|
||||
count=data['count'],
|
||||
modlists=modlists
|
||||
)
|
||||
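# Illustrative usage sketch (not part of the commit): parsing engine output with the
# helpers defined above and filtering it. The sample payload is hypothetical.
sample = {
    "metadataVersion": "1.0",
    "timestamp": "2025-12-29T12:00:00Z",
    "count": 1,
    "modlists": [{
        "title": "Example List", "description": "Demo", "author": "Someone",
        "maintainers": [], "namespacedName": "someone/example",
        "repositoryName": "wj-featured", "machineURL": "example",
        "game": "skyrimspecialedition", "gameHumanFriendly": "Skyrim Special Edition",
        "official": True, "nsfw": False, "utilityList": False,
        "forceDown": False, "imageContainsTitle": False,
        "tags": ["Graphics"],
    }],
}
response = parse_modlist_metadata_response(sample)
assert response.filter_by_game("skyrimspecialedition")[0].is_available()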
@@ -29,9 +29,11 @@ class AutomatedPrefixService:
|
||||
and direct Proton wrapper integration.
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
self.scripts_dir = Path.home() / "Jackify/scripts"
|
||||
def __init__(self, system_info=None):
|
||||
from jackify.shared.paths import get_jackify_data_dir
|
||||
self.scripts_dir = get_jackify_data_dir() / "scripts"
|
||||
self.scripts_dir.mkdir(parents=True, exist_ok=True)
|
||||
self.system_info = system_info
|
||||
# Use shared timing for consistency across services
|
||||
|
||||
def _get_progress_timestamp(self):
|
||||
@@ -491,54 +493,54 @@ exit"""
|
||||
def detect_actual_prefix_appid(self, initial_appid: int, shortcut_name: str) -> Optional[int]:
|
||||
"""
|
||||
After Steam restart, detect the actual prefix AppID that was created.
|
||||
Use protontricks -l to find the actual positive AppID.
|
||||
|
||||
Uses direct VDF file reading to find the actual AppID.
|
||||
|
||||
Args:
|
||||
initial_appid: The initial (negative) AppID from shortcuts.vdf
|
||||
shortcut_name: Name of the shortcut for logging
|
||||
|
||||
|
||||
Returns:
|
||||
The actual (positive) AppID of the created prefix, or None if not found
|
||||
"""
|
||||
try:
|
||||
logger.info(f"Using protontricks -l to detect actual AppID for shortcut: {shortcut_name}")
|
||||
|
||||
# Wait up to 30 seconds for the shortcut to appear in protontricks
|
||||
logger.info(f"Using VDF to detect actual AppID for shortcut: {shortcut_name}")
|
||||
|
||||
# Wait up to 30 seconds for Steam to process the shortcut
|
||||
for i in range(30):
|
||||
try:
|
||||
# Use the existing protontricks handler
|
||||
from jackify.backend.handlers.protontricks_handler import ProtontricksHandler
|
||||
protontricks_handler = ProtontricksHandler(steamdeck or False)
|
||||
result = protontricks_handler.run_protontricks('-l')
|
||||
|
||||
if result.returncode == 0:
|
||||
lines = result.stdout.strip().split('\n')
|
||||
|
||||
# Look for our shortcut name in the protontricks output
|
||||
for line in lines:
|
||||
if shortcut_name in line and 'Non-Steam shortcut:' in line:
|
||||
# Extract AppID from line like "Non-Steam shortcut: Tuxborn (3106560878)"
|
||||
if '(' in line and ')' in line:
|
||||
appid_str = line.split('(')[1].split(')')[0]
|
||||
actual_appid = int(appid_str)
|
||||
logger.info(f" Found shortcut in protontricks: {line.strip()}")
|
||||
logger.info(f" Initial AppID: {initial_appid}")
|
||||
logger.info(f" Actual AppID: {actual_appid}")
|
||||
return actual_appid
|
||||
|
||||
logger.debug(f"Shortcut '{shortcut_name}' not found in protontricks yet (attempt {i+1}/30)")
|
||||
time.sleep(1)
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
logger.warning(f"protontricks -l timed out on attempt {i+1}")
|
||||
from ..handlers.shortcut_handler import ShortcutHandler
|
||||
from ..handlers.path_handler import PathHandler
|
||||
|
||||
path_handler = PathHandler()
|
||||
shortcuts_path = path_handler._find_shortcuts_vdf()
|
||||
|
||||
if shortcuts_path:
|
||||
from ..handlers.vdf_handler import VDFHandler
|
||||
shortcuts_data = VDFHandler.load(shortcuts_path, binary=True)
|
||||
|
||||
if shortcuts_data and 'shortcuts' in shortcuts_data:
|
||||
for idx, shortcut in shortcuts_data['shortcuts'].items():
|
||||
app_name = shortcut.get('AppName', shortcut.get('appname', '')).strip()
|
||||
|
||||
if app_name.lower() == shortcut_name.lower():
|
||||
appid = shortcut.get('appid')
|
||||
if appid:
|
||||
actual_appid = int(appid) & 0xFFFFFFFF
|
||||
logger.info(f"Found shortcut '{app_name}' in shortcuts.vdf")
|
||||
logger.info(f" Initial AppID (signed): {initial_appid}")
|
||||
logger.info(f" Actual AppID (unsigned): {actual_appid}")
|
||||
return actual_appid
|
||||
|
||||
logger.debug(f"Shortcut '{shortcut_name}' not found in VDF yet (attempt {i+1}/30)")
|
||||
time.sleep(1)
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Error running protontricks -l on attempt {i+1}: {e}")
|
||||
logger.warning(f"Error reading shortcuts.vdf on attempt {i+1}: {e}")
|
||||
time.sleep(1)
|
||||
|
||||
logger.error(f"Shortcut '{shortcut_name}' not found in protontricks after 30 seconds")
|
||||
|
||||
logger.error(f"Shortcut '{shortcut_name}' not found in shortcuts.vdf after 30 seconds")
|
||||
return None
|
||||
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error detecting actual prefix AppID: {e}")
|
||||
return None
|
||||
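# Illustrative sketch (not part of the commit): the signed-to-unsigned AppID
# conversion used above. shortcuts.vdf stores the shortcut AppID as a signed 32-bit
# integer, while compatdata folders use the unsigned form. Values are hypothetical.
signed_appid = -1188406418
unsigned_appid = signed_appid & 0xFFFFFFFF
assert unsigned_appid == 3106560878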
@@ -546,13 +548,15 @@ exit"""
|
||||
def restart_steam(self) -> bool:
|
||||
"""
|
||||
Restart Steam using the robust service method.
|
||||
|
||||
|
||||
Returns:
|
||||
True if successful, False otherwise
|
||||
"""
|
||||
try:
|
||||
from .steam_restart_service import robust_steam_restart
|
||||
return robust_steam_restart(progress_callback=None, timeout=60)
|
||||
# Use system_info if available (backward compatibility)
|
||||
system_info = getattr(self, 'system_info', None)
|
||||
return robust_steam_restart(progress_callback=None, timeout=60, system_info=system_info)
|
||||
except Exception as e:
|
||||
logger.error(f"Error restarting Steam: {e}")
|
||||
return False
|
||||
@@ -746,7 +750,8 @@ echo Creating Proton prefix...
|
||||
timeout /t 3 /nobreak >nul
|
||||
echo Prefix creation complete.
|
||||
"""
|
||||
batch_path = Path.home() / "Jackify/temp_prefix_creation.bat"
|
||||
from jackify.shared.paths import get_jackify_data_dir
|
||||
batch_path = get_jackify_data_dir() / "temp_prefix_creation.bat"
|
||||
batch_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
with open(batch_path, 'w') as f:
|
||||
@@ -929,22 +934,35 @@ echo Prefix creation complete.
|
||||
# Get or create CompatToolMapping
|
||||
if 'CompatToolMapping' not in config_data['Software']['Valve']['Steam']:
|
||||
config_data['Software']['Valve']['Steam']['CompatToolMapping'] = {}
|
||||
|
||||
# Set the Proton version for this AppID
|
||||
config_data['Software']['Valve']['Steam']['CompatToolMapping'][str(appid)] = proton_version
|
||||
|
||||
# Set the Proton version for this AppID using Steam's expected format
|
||||
# Steam requires a dict with 'name', 'config', and 'priority' keys
|
||||
config_data['Software']['Valve']['Steam']['CompatToolMapping'][str(appid)] = {
|
||||
'name': proton_version,
|
||||
'config': '',
|
||||
'priority': '250'
|
||||
}
|
||||
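# Illustrative note (not part of the commit): with the ValvePython 'vdf' package,
# the mapping written above serializes to roughly this block inside config.vdf
# (the AppID and Proton name here are hypothetical):
#
#     "CompatToolMapping"
#     {
#         "3106560878"
#         {
#             "name"        "GE-Proton9-20"
#             "config"      ""
#             "priority"    "250"
#         }
#     }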
|
||||
# Write back to file (text format)
|
||||
import os
with open(config_path, 'w') as f:
    vdf.dump(config_data, f)
    # Ensure file is fully written to disk before Steam restart
    # (flush/fsync must run while the file handle is still open)
    f.flush()
    os.fsync(f.fileno())
|
||||
|
||||
logger.info(f"Set Proton version {proton_version} for AppID {appid}")
|
||||
debug_print(f"[DEBUG] Set Proton version {proton_version} for AppID {appid} in config.vdf")
|
||||
|
||||
|
||||
# Small delay to ensure filesystem write completes
|
||||
import time
|
||||
time.sleep(0.5)
|
||||
|
||||
# Verify it was set correctly
|
||||
with open(config_path, 'r') as f:
|
||||
verify_data = vdf.load(f)
|
||||
actual_value = verify_data.get('Software', {}).get('Valve', {}).get('Steam', {}).get('CompatToolMapping', {}).get(str(appid))
|
||||
debug_print(f"[DEBUG] Verification: AppID {appid} -> {actual_value}")
|
||||
compat_mapping = verify_data.get('Software', {}).get('Valve', {}).get('Steam', {}).get('CompatToolMapping', {}).get(str(appid))
|
||||
debug_print(f"[DEBUG] Verification: AppID {appid} -> {compat_mapping}")
|
||||
|
||||
return True
|
||||
|
||||
@@ -1045,7 +1063,18 @@ echo Prefix creation complete.
|
||||
env = os.environ.copy()
|
||||
env['STEAM_COMPAT_DATA_PATH'] = str(prefix_path)
|
||||
env['STEAM_COMPAT_APP_ID'] = str(positive_appid) # Use positive AppID for environment
|
||||
env['STEAM_COMPAT_CLIENT_INSTALL_PATH'] = str(Path.home() / ".local/share/Steam")
|
||||
|
||||
# Determine correct Steam root based on installation type
|
||||
from ..handlers.path_handler import PathHandler
|
||||
path_handler = PathHandler()
|
||||
steam_library = path_handler.find_steam_library()
|
||||
if steam_library and steam_library.name == "common":
|
||||
# Extract Steam root from library path: .../Steam/steamapps/common -> .../Steam
|
||||
steam_root = steam_library.parent.parent
|
||||
env['STEAM_COMPAT_CLIENT_INSTALL_PATH'] = str(steam_root)
|
||||
else:
|
||||
# Fallback to legacy path if detection fails
|
||||
env['STEAM_COMPAT_CLIENT_INSTALL_PATH'] = str(Path.home() / ".local/share/Steam")
|
||||
|
||||
# Build the command
|
||||
cmd = [
|
||||
@@ -1109,7 +1138,10 @@ echo Prefix creation complete.
|
||||
|
||||
def _get_compatdata_path_for_appid(self, appid: int) -> Optional[Path]:
|
||||
"""
|
||||
Get the compatdata path for a given AppID using existing Jackify functions.
|
||||
Get the compatdata path for a given AppID.
|
||||
|
||||
First tries to find existing compatdata, then constructs path from libraryfolders.vdf
|
||||
for creating new prefixes.
|
||||
|
||||
Args:
|
||||
appid: The AppID to get the path for
|
||||
@@ -1117,22 +1149,32 @@ echo Prefix creation complete.
|
||||
Returns:
|
||||
Path to the compatdata directory, or None if not found
|
||||
"""
|
||||
# Use existing Jackify path detection
|
||||
from ..handlers.path_handler import PathHandler
|
||||
|
||||
# First, try to find existing compatdata
|
||||
compatdata_path = PathHandler.find_compat_data(str(appid))
|
||||
if compatdata_path:
|
||||
return compatdata_path
|
||||
|
||||
# Fallback: construct the path manually
|
||||
possible_bases = [
|
||||
# Prefix doesn't exist yet - determine where to create it from libraryfolders.vdf
|
||||
library_paths = PathHandler.get_all_steam_library_paths()
|
||||
if library_paths:
|
||||
# Use the first library (typically the default library)
|
||||
# Construct compatdata path: library_path/steamapps/compatdata/appid
|
||||
first_library = library_paths[0]
|
||||
compatdata_base = first_library / "steamapps" / "compatdata"
|
||||
return compatdata_base / str(appid)
|
||||
|
||||
# Only fallback if VDF parsing completely fails
|
||||
logger.warning("Could not get library paths from libraryfolders.vdf, using fallback locations")
|
||||
fallback_bases = [
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/data/Steam/steamapps/compatdata",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/steamapps/compatdata",
|
||||
Path.home() / ".steam/steam/steamapps/compatdata",
|
||||
Path.home() / ".local/share/Steam/steamapps/compatdata",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/home/.steam/steam/steamapps/compatdata",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/home/.local/share/Steam/steamapps/compatdata",
|
||||
]
|
||||
|
||||
for base_path in possible_bases:
|
||||
for base_path in fallback_bases:
|
||||
if base_path.is_dir():
|
||||
return base_path / str(appid)
|
||||
|
||||
@@ -2666,9 +2708,40 @@ echo Prefix creation complete.
|
||||
True if successful, False otherwise
|
||||
"""
|
||||
try:
|
||||
steam_root = Path.home() / ".steam/steam"
|
||||
compatdata_dir = steam_root / "steamapps/compatdata"
|
||||
proton_common_dir = steam_root / "steamapps/common"
|
||||
# Determine Steam locations based on installation type
|
||||
from ..handlers.path_handler import PathHandler
|
||||
path_handler = PathHandler()
|
||||
all_libraries = path_handler.get_all_steam_library_paths()
|
||||
|
||||
# Check if we have Flatpak Steam by looking for .var/app/com.valvesoftware.Steam in library paths
|
||||
is_flatpak_steam = any('.var/app/com.valvesoftware.Steam' in str(lib) for lib in all_libraries)
|
||||
|
||||
if is_flatpak_steam and all_libraries:
|
||||
# Flatpak Steam: Use the actual library root from libraryfolders.vdf
|
||||
# Compatdata should be in the library root, not the client root
|
||||
flatpak_library_root = all_libraries[0] # Use first library (typically the default)
|
||||
flatpak_client_root = flatpak_library_root.parent.parent / ".steam/steam"
|
||||
|
||||
if not flatpak_library_root.is_dir():
|
||||
logger.error(
|
||||
f"Flatpak Steam library root does not exist: {flatpak_library_root}"
|
||||
)
|
||||
return False
|
||||
|
||||
steam_root = flatpak_client_root if flatpak_client_root.is_dir() else flatpak_library_root
|
||||
# CRITICAL: compatdata must be in the library root, not client root
|
||||
compatdata_dir = flatpak_library_root / "steamapps/compatdata"
|
||||
proton_common_dir = flatpak_library_root / "steamapps/common"
|
||||
else:
|
||||
# Native Steam (or unknown): fall back to legacy ~/.steam/steam layout
|
||||
steam_root = Path.home() / ".steam/steam"
|
||||
compatdata_dir = steam_root / "steamapps/compatdata"
|
||||
proton_common_dir = steam_root / "steamapps/common"
|
||||
|
||||
# Ensure compatdata root exists and is a directory we actually want to use
|
||||
if not compatdata_dir.is_dir():
|
||||
logger.error(f"Compatdata root does not exist: {compatdata_dir}. Aborting prefix creation.")
|
||||
return False
|
||||
|
||||
# Find a Proton wrapper to use
|
||||
proton_path = self._find_proton_binary(proton_common_dir)
|
||||
@@ -2686,9 +2759,9 @@ echo Prefix creation complete.
|
||||
env['WINEDEBUG'] = '-all'
|
||||
env['WINEDLLOVERRIDES'] = 'msdia80.dll=n;conhost.exe=d;cmd.exe=d'
|
||||
|
||||
# Create the compatdata directory
|
||||
# Create the compatdata directory for this AppID (but never the whole tree)
|
||||
compat_dir = compatdata_dir / str(abs(appid))
|
||||
compat_dir.mkdir(parents=True, exist_ok=True)
|
||||
compat_dir.mkdir(exist_ok=True)
|
||||
|
||||
logger.info(f"Creating Proton prefix for AppID {appid}")
|
||||
logger.info(f"STEAM_COMPAT_CLIENT_INSTALL_PATH={env['STEAM_COMPAT_CLIENT_INSTALL_PATH']}")
|
||||
@@ -2697,9 +2770,18 @@ echo Prefix creation complete.
|
||||
# Run proton run wineboot -u to initialize the prefix
|
||||
cmd = [str(proton_path), 'run', 'wineboot', '-u']
|
||||
logger.info(f"Running: {' '.join(cmd)}")
|
||||
|
||||
|
||||
# Adjust timeout for SD card installations on Steam Deck (slower I/O)
|
||||
from ..services.platform_detection_service import PlatformDetectionService
|
||||
platform_service = PlatformDetectionService.get_instance()
|
||||
is_steamdeck_sdcard = (platform_service.is_steamdeck and
|
||||
str(proton_path).startswith('/run/media/'))
|
||||
timeout = 180 if is_steamdeck_sdcard else 60
|
||||
if is_steamdeck_sdcard:
|
||||
logger.info(f"Using extended timeout ({timeout}s) for Steam Deck SD card Proton installation")
|
||||
|
||||
# Use jackify-engine's approach: UseShellExecute=false, CreateNoWindow=true equivalent
|
||||
result = subprocess.run(cmd, env=env, capture_output=True, text=True, timeout=60,
|
||||
result = subprocess.run(cmd, env=env, capture_output=True, text=True, timeout=timeout,
|
||||
shell=False, creationflags=getattr(subprocess, 'CREATE_NO_WINDOW', 0))
|
||||
logger.info(f"Proton exit code: {result.returncode}")
|
||||
|
||||
@@ -2895,10 +2977,21 @@ echo Prefix creation complete.
|
||||
"""Find a Steam game installation path by AppID and common names"""
|
||||
import os
|
||||
from pathlib import Path
|
||||
|
||||
# Get Steam libraries from libraryfolders.vdf
|
||||
steam_config_path = Path.home() / ".steam/steam/config/libraryfolders.vdf"
|
||||
if not steam_config_path.exists():
|
||||
|
||||
# Get Steam libraries from libraryfolders.vdf - check multiple possible locations
|
||||
possible_config_paths = [
|
||||
Path.home() / ".steam/steam/config/libraryfolders.vdf",
|
||||
Path.home() / ".local/share/Steam/config/libraryfolders.vdf",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf" # Flatpak
|
||||
]
|
||||
|
||||
steam_config_path = None
|
||||
for path in possible_config_paths:
|
||||
if path.exists():
|
||||
steam_config_path = path
|
||||
break
|
||||
|
||||
if not steam_config_path:
|
||||
return None
|
||||
|
||||
steam_libraries = []
|
||||
@@ -3011,7 +3104,7 @@ echo Prefix creation complete.
|
||||
'/v', 'mscoree', '/t', 'REG_SZ', '/d', 'native', '/f'
|
||||
]
|
||||
|
||||
result1 = subprocess.run(cmd1, env=env, capture_output=True, text=True)
|
||||
result1 = subprocess.run(cmd1, env=env, capture_output=True, text=True, errors='replace')
|
||||
if result1.returncode == 0:
|
||||
logger.info("Successfully applied mscoree=native DLL override")
|
||||
else:
|
||||
@@ -3026,7 +3119,7 @@ echo Prefix creation complete.
|
||||
'/v', 'OnlyUseLatestCLR', '/t', 'REG_DWORD', '/d', '1', '/f'
|
||||
]
|
||||
|
||||
result2 = subprocess.run(cmd2, env=env, capture_output=True, text=True)
|
||||
result2 = subprocess.run(cmd2, env=env, capture_output=True, text=True, errors='replace')
|
||||
if result2.returncode == 0:
|
||||
logger.info("Successfully applied OnlyUseLatestCLR=1 registry entry")
|
||||
else:
|
||||
|
||||
455  jackify/backend/services/modlist_gallery_service.py  (new file)
@@ -0,0 +1,455 @@
|
||||
"""
|
||||
Service for fetching and managing modlist metadata for the gallery view.
|
||||
|
||||
Handles jackify-engine integration, caching, and image management.
|
||||
"""
|
||||
import json
|
||||
import subprocess
|
||||
import time
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from typing import Optional, List, Dict
|
||||
from datetime import datetime, timedelta
|
||||
import urllib.request
|
||||
|
||||
from jackify.backend.models.modlist_metadata import (
|
||||
ModlistMetadataResponse,
|
||||
ModlistMetadata,
|
||||
parse_modlist_metadata_response
|
||||
)
|
||||
from jackify.backend.core.modlist_operations import get_jackify_engine_path
|
||||
from jackify.backend.handlers.config_handler import ConfigHandler
|
||||
from jackify.shared.paths import get_jackify_data_dir
|
||||
|
||||
|
||||
class ModlistGalleryService:
|
||||
"""Service for fetching and caching modlist metadata from jackify-engine"""
|
||||
|
||||
# REMOVED: CACHE_VALIDITY_DAYS - metadata is now always fetched fresh from engine
|
||||
# Images are still cached indefinitely (managed separately)
|
||||
# CRITICAL: Thread lock to prevent concurrent engine calls that could cause recursive spawning
|
||||
_engine_call_lock = threading.Lock()
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize the gallery service"""
|
||||
self.config_handler = ConfigHandler()
|
||||
# Cache directories in Jackify Data Directory
|
||||
jackify_data_dir = get_jackify_data_dir()
|
||||
self.CACHE_DIR = jackify_data_dir / "modlist-cache" / "metadata"
|
||||
self.IMAGE_CACHE_DIR = jackify_data_dir / "modlist-cache" / "images"
|
||||
self.METADATA_CACHE_FILE = self.CACHE_DIR / "modlist_metadata.json"
|
||||
self._ensure_cache_dirs()
|
||||
# Tag metadata caches (avoid refetching per render)
|
||||
self._tag_mappings_cache: Optional[Dict[str, str]] = None
|
||||
self._tag_mapping_lookup: Optional[Dict[str, str]] = None
|
||||
self._allowed_tags_cache: Optional[set] = None
|
||||
self._allowed_tags_lookup: Optional[Dict[str, str]] = None
|
||||
|
||||
def _ensure_cache_dirs(self):
|
||||
"""Create cache directories if they don't exist"""
|
||||
self.CACHE_DIR.mkdir(parents=True, exist_ok=True)
|
||||
self.IMAGE_CACHE_DIR.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
def fetch_modlist_metadata(
|
||||
self,
|
||||
include_validation: bool = True,
|
||||
include_search_index: bool = False,
|
||||
sort_by: str = "title",
|
||||
force_refresh: bool = False
|
||||
) -> Optional[ModlistMetadataResponse]:
|
||||
"""
|
||||
Fetch modlist metadata from jackify-engine.
|
||||
|
||||
NOTE: Metadata is ALWAYS fetched fresh from the engine to ensure up-to-date
|
||||
version numbers and sizes for frequently-updated modlists. Only images are cached.
|
||||
|
||||
Args:
|
||||
include_validation: Include validation status (slower)
|
||||
include_search_index: Include mod search index (slower)
|
||||
sort_by: Sort order (title, size, date)
|
||||
force_refresh: Deprecated parameter (kept for API compatibility)
|
||||
|
||||
Returns:
|
||||
ModlistMetadataResponse or None if fetch fails
|
||||
"""
|
||||
# Always fetch fresh data from jackify-engine
|
||||
# The engine itself is fast (~1-2 seconds) and always gets latest metadata
|
||||
try:
|
||||
metadata = self._fetch_from_engine(
|
||||
include_validation=include_validation,
|
||||
include_search_index=include_search_index,
|
||||
sort_by=sort_by
|
||||
)
|
||||
|
||||
# Still save to cache as a fallback for offline scenarios
|
||||
if metadata:
|
||||
self._save_to_cache(metadata)
|
||||
|
||||
return metadata
|
||||
|
||||
except Exception as e:
|
||||
print(f"Error fetching modlist metadata: {e}")
|
||||
print("Falling back to cached metadata (may be outdated)")
|
||||
# Fall back to cache if network/engine fails
|
||||
return self._load_from_cache()
|
||||
|
||||
def _fetch_from_engine(
|
||||
self,
|
||||
include_validation: bool,
|
||||
include_search_index: bool,
|
||||
sort_by: str
|
||||
) -> Optional[ModlistMetadataResponse]:
|
||||
"""Call jackify-engine to fetch modlist metadata"""
|
||||
# CRITICAL: Use thread lock to prevent concurrent engine calls
|
||||
# Multiple simultaneous calls could cause recursive spawning issues
|
||||
with self._engine_call_lock:
|
||||
# CRITICAL: Get engine path BEFORE cleaning environment
|
||||
# get_jackify_engine_path() may need APPDIR to locate the engine
|
||||
engine_path = get_jackify_engine_path()
|
||||
if not engine_path:
|
||||
raise FileNotFoundError("jackify-engine not found")
|
||||
|
||||
# Build command
|
||||
cmd = [str(engine_path), "list-modlists", "--json", "--sort-by", sort_by]
|
||||
|
||||
if include_validation:
|
||||
cmd.append("--include-validation-status")
|
||||
|
||||
if include_search_index:
|
||||
cmd.append("--include-search-index")
|
||||
|
||||
# Execute command
|
||||
# CRITICAL: Use centralized clean environment to prevent AppImage recursive spawning
|
||||
# This must happen AFTER engine path resolution
|
||||
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
|
||||
clean_env = get_clean_subprocess_env()
|
||||
|
||||
result = subprocess.run(
|
||||
cmd,
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=300, # 5 minute timeout for large data
|
||||
env=clean_env
|
||||
)
|
||||
|
||||
if result.returncode != 0:
|
||||
raise RuntimeError(f"jackify-engine failed: {result.stderr}")
|
||||
|
||||
# Parse JSON response - skip progress messages and extract JSON
|
||||
# jackify-engine prints progress to stdout before the JSON
|
||||
stdout = result.stdout.strip()
|
||||
|
||||
# Find the start of JSON (first '{' on its own line)
|
||||
lines = stdout.split('\n')
|
||||
json_start = 0
|
||||
for i, line in enumerate(lines):
|
||||
if line.strip().startswith('{'):
|
||||
json_start = i
|
||||
break
|
||||
|
||||
json_text = '\n'.join(lines[json_start:])
|
||||
data = json.loads(json_text)
|
||||
return parse_modlist_metadata_response(data)
|
||||
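# Illustrative sketch (not part of the commit): the stdout handling above, shown on
# a hypothetical engine output that mixes progress text with the JSON payload.
stdout = 'Loading modlists...\nDone.\n{ "count": 0, "modlists": [] }'
lines = stdout.split('\n')
json_start = next((i for i, line in enumerate(lines) if line.strip().startswith('{')), 0)
import json
assert json.loads('\n'.join(lines[json_start:]))['count'] == 0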
|
||||
def _load_from_cache(self) -> Optional[ModlistMetadataResponse]:
|
||||
"""Load metadata from cache file"""
|
||||
if not self.METADATA_CACHE_FILE.exists():
|
||||
return None
|
||||
|
||||
try:
|
||||
with open(self.METADATA_CACHE_FILE, 'r', encoding='utf-8') as f:
|
||||
data = json.load(f)
|
||||
return parse_modlist_metadata_response(data)
|
||||
except Exception as e:
|
||||
print(f"Error loading cache: {e}")
|
||||
return None
|
||||
|
||||
def _save_to_cache(self, metadata: ModlistMetadataResponse):
|
||||
"""Save metadata to cache file"""
|
||||
try:
|
||||
# Convert to dict for JSON serialization
|
||||
data = {
|
||||
'metadataVersion': metadata.metadataVersion,
|
||||
'timestamp': metadata.timestamp,
|
||||
'count': metadata.count,
|
||||
'modlists': [self._metadata_to_dict(m) for m in metadata.modlists]
|
||||
}
|
||||
|
||||
with open(self.METADATA_CACHE_FILE, 'w', encoding='utf-8') as f:
|
||||
json.dump(data, f, indent=2)
|
||||
|
||||
except Exception as e:
|
||||
print(f"Error saving cache: {e}")
|
||||
|
||||
def _metadata_to_dict(self, metadata: ModlistMetadata) -> dict:
|
||||
"""Convert ModlistMetadata to dict for JSON serialization"""
|
||||
result = {
|
||||
'title': metadata.title,
|
||||
'description': metadata.description,
|
||||
'author': metadata.author,
|
||||
'maintainers': metadata.maintainers,
|
||||
'namespacedName': metadata.namespacedName,
|
||||
'repositoryName': metadata.repositoryName,
|
||||
'machineURL': metadata.machineURL,
|
||||
'game': metadata.game,
|
||||
'gameHumanFriendly': metadata.gameHumanFriendly,
|
||||
'official': metadata.official,
|
||||
'nsfw': metadata.nsfw,
|
||||
'utilityList': metadata.utilityList,
|
||||
'forceDown': metadata.forceDown,
|
||||
'imageContainsTitle': metadata.imageContainsTitle,
|
||||
'version': metadata.version,
|
||||
'displayVersionOnlyInInstallerView': metadata.displayVersionOnlyInInstallerView,
|
||||
'dateCreated': metadata.dateCreated,
|
||||
'dateUpdated': metadata.dateUpdated,
|
||||
'tags': metadata.tags,
|
||||
'mods': metadata.mods
|
||||
}
|
||||
|
||||
if metadata.images:
|
||||
result['images'] = {
|
||||
'small': metadata.images.small,
|
||||
'large': metadata.images.large
|
||||
}
|
||||
|
||||
if metadata.links:
|
||||
result['links'] = {
|
||||
'image': metadata.links.image,
|
||||
'readme': metadata.links.readme,
|
||||
'download': metadata.links.download,
|
||||
'discordURL': metadata.links.discordURL,
|
||||
'websiteURL': metadata.links.websiteURL
|
||||
}
|
||||
|
||||
if metadata.sizes:
|
||||
result['sizes'] = {
|
||||
'downloadSize': metadata.sizes.downloadSize,
|
||||
'downloadSizeFormatted': metadata.sizes.downloadSizeFormatted,
|
||||
'installSize': metadata.sizes.installSize,
|
||||
'installSizeFormatted': metadata.sizes.installSizeFormatted,
|
||||
'totalSize': metadata.sizes.totalSize,
|
||||
'totalSizeFormatted': metadata.sizes.totalSizeFormatted,
|
||||
'numberOfArchives': metadata.sizes.numberOfArchives,
|
||||
'numberOfInstalledFiles': metadata.sizes.numberOfInstalledFiles
|
||||
}
|
||||
|
||||
if metadata.validation:
|
||||
result['validation'] = {
|
||||
'failed': metadata.validation.failed,
|
||||
'passed': metadata.validation.passed,
|
||||
'updating': metadata.validation.updating,
|
||||
'mirrored': metadata.validation.mirrored,
|
||||
'modListIsMissing': metadata.validation.modListIsMissing,
|
||||
'hasFailures': metadata.validation.hasFailures
|
||||
}
|
||||
|
||||
return result
|
||||
|
||||
def download_images(
|
||||
self,
|
||||
game_filter: Optional[str] = None,
|
||||
size: str = "both",
|
||||
overwrite: bool = False
|
||||
) -> bool:
|
||||
"""
|
||||
Download modlist images to cache using jackify-engine.
|
||||
|
||||
Args:
|
||||
game_filter: Filter by game name (None = all games)
|
||||
size: Image size to download (small, large, both)
|
||||
overwrite: Overwrite existing images
|
||||
|
||||
Returns:
|
||||
True if successful, False otherwise
|
||||
"""
|
||||
# Build command (engine path will be resolved inside lock)
|
||||
cmd = [
|
||||
"placeholder", # Will be replaced with actual engine path
|
||||
"download-modlist-images",
|
||||
"--output", str(self.IMAGE_CACHE_DIR),
|
||||
"--size", size
|
||||
]
|
||||
|
||||
if game_filter:
|
||||
cmd.extend(["--game", game_filter])
|
||||
|
||||
if overwrite:
|
||||
cmd.append("--overwrite")
|
||||
|
||||
# Execute command
|
||||
try:
|
||||
# CRITICAL: Use thread lock to prevent concurrent engine calls
|
||||
with self._engine_call_lock:
|
||||
# CRITICAL: Get engine path BEFORE cleaning environment
|
||||
# get_jackify_engine_path() may need APPDIR to locate the engine
|
||||
engine_path = get_jackify_engine_path()
|
||||
if not engine_path:
|
||||
return False
|
||||
|
||||
# Update cmd with resolved engine path
|
||||
cmd[0] = str(engine_path)
|
||||
|
||||
# CRITICAL: Use centralized clean environment to prevent AppImage recursive spawning
|
||||
# This must happen AFTER engine path resolution
|
||||
from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
|
||||
clean_env = get_clean_subprocess_env()
|
||||
|
||||
result = subprocess.run(
|
||||
cmd,
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=3600, # 1 hour timeout for downloads
|
||||
env=clean_env
|
||||
)
|
||||
return result.returncode == 0
|
||||
except Exception as e:
|
||||
print(f"Error downloading images: {e}")
|
||||
return False
|
||||
|
||||
def get_cached_image_path(self, metadata: ModlistMetadata, size: str = "large") -> Optional[Path]:
|
||||
"""
|
||||
Get path to cached image for a modlist (only if it exists).
|
||||
|
||||
Args:
|
||||
metadata: Modlist metadata
|
||||
size: Image size (small or large)
|
||||
|
||||
Returns:
|
||||
Path to cached image or None if not cached
|
||||
"""
|
||||
filename = f"{metadata.machineURL}_{size}.webp"
|
||||
image_path = self.IMAGE_CACHE_DIR / metadata.repositoryName / filename
|
||||
|
||||
if image_path.exists():
|
||||
return image_path
|
||||
return None
|
||||
|
||||
def get_image_cache_path(self, metadata: ModlistMetadata, size: str = "large") -> Path:
|
||||
"""
|
||||
Get path where image should be cached (always returns path, even if file doesn't exist).
|
||||
|
||||
Args:
|
||||
metadata: Modlist metadata
|
||||
size: Image size (small or large)
|
||||
|
||||
Returns:
|
||||
Path where image should be cached
|
||||
"""
|
||||
filename = f"{metadata.machineURL}_{size}.webp"
|
||||
return self.IMAGE_CACHE_DIR / metadata.repositoryName / filename
|
||||
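# Illustrative example (not part of the commit): for a hypothetical modlist with
# machineURL "example" in repository "wj-featured", the large image would be cached at
#   <jackify_data_dir>/modlist-cache/images/wj-featured/example_large.webp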
|
||||
def get_image_url(self, metadata: ModlistMetadata, size: str = "large") -> Optional[str]:
|
||||
"""
|
||||
Get image URL for a modlist.
|
||||
|
||||
Args:
|
||||
metadata: Modlist metadata
|
||||
size: Image size (small or large)
|
||||
|
||||
Returns:
|
||||
Image URL or None if images not available
|
||||
"""
|
||||
if not metadata.images:
|
||||
return None
|
||||
|
||||
return metadata.images.large if size == "large" else metadata.images.small
|
||||
|
||||
def clear_cache(self):
|
||||
"""Clear all cached metadata and images"""
|
||||
if self.METADATA_CACHE_FILE.exists():
|
||||
self.METADATA_CACHE_FILE.unlink()
|
||||
|
||||
# Clear image cache
|
||||
if self.IMAGE_CACHE_DIR.exists():
|
||||
import shutil
|
||||
shutil.rmtree(self.IMAGE_CACHE_DIR)
|
||||
self.IMAGE_CACHE_DIR.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
def get_installed_modlists(self) -> List[str]:
|
||||
"""
|
||||
Get list of installed modlist machine URLs.
|
||||
|
||||
Returns:
|
||||
List of machine URLs for installed modlists
|
||||
"""
|
||||
# TODO: Integrate with existing modlist database/config
|
||||
# For now, return empty list - will be implemented when integrated with existing modlist tracking
|
||||
return []
|
||||
|
||||
def is_modlist_installed(self, machine_url: str) -> bool:
|
||||
"""Check if a modlist is installed"""
|
||||
return machine_url in self.get_installed_modlists()
|
||||
|
||||
def load_tag_mappings(self) -> Dict[str, str]:
|
||||
"""
|
||||
Load tag mappings from Wabbajack GitHub repository.
|
||||
Maps variant tag names to canonical tag names.
|
||||
|
||||
Returns:
|
||||
Dictionary mapping variant tags to canonical tags
|
||||
"""
|
||||
url = "https://raw.githubusercontent.com/wabbajack-tools/mod-lists/master/tag_mappings.json"
|
||||
try:
|
||||
with urllib.request.urlopen(url, timeout=10) as response:
|
||||
data = json.loads(response.read().decode('utf-8'))
|
||||
return data
|
||||
except Exception as e:
|
||||
print(f"Warning: Could not load tag mappings: {e}")
|
||||
return {}
|
||||
|
||||
def load_allowed_tags(self) -> set:
|
||||
"""
|
||||
Load allowed tags from Wabbajack GitHub repository.
|
||||
|
||||
Returns:
|
||||
Set of allowed tag names (preserving original case)
|
||||
"""
|
||||
url = "https://raw.githubusercontent.com/wabbajack-tools/mod-lists/master/allowed_tags.json"
|
||||
try:
|
||||
with urllib.request.urlopen(url, timeout=10) as response:
|
||||
data = json.loads(response.read().decode('utf-8'))
|
||||
return set(data) # Return as set preserving original case
|
||||
except Exception as e:
|
||||
print(f"Warning: Could not load allowed tags: {e}")
|
||||
return set()
|
||||
|
||||
def _ensure_tag_metadata(self):
|
||||
"""Ensure tag mappings/allowed tags (and lookups) are cached."""
|
||||
if self._tag_mappings_cache is None:
|
||||
self._tag_mappings_cache = self.load_tag_mappings()
|
||||
if self._tag_mapping_lookup is None:
|
||||
self._tag_mapping_lookup = {k.lower(): v for k, v in self._tag_mappings_cache.items()}
|
||||
if self._allowed_tags_cache is None:
|
||||
self._allowed_tags_cache = self.load_allowed_tags()
|
||||
if self._allowed_tags_lookup is None:
|
||||
self._allowed_tags_lookup = {tag.lower(): tag for tag in self._allowed_tags_cache}
|
||||
|
||||
def normalize_tag_value(self, tag: str) -> str:
|
||||
"""
|
||||
Normalize a tag to its canonical display form using Wabbajack mappings.
|
||||
Returns the normalized tag (original casing preserved when possible).
|
||||
"""
|
||||
if not tag:
|
||||
return ""
|
||||
self._ensure_tag_metadata()
|
||||
tag_key = tag.strip().lower()
|
||||
if not tag_key:
|
||||
return ""
|
||||
canonical = self._tag_mapping_lookup.get(tag_key, tag.strip())
|
||||
# Prefer allowed tag casing if available
|
||||
return self._allowed_tags_lookup.get(canonical.lower(), canonical)
|
||||
|
||||
def normalize_tags_for_display(self, tags: Optional[List[str]]) -> List[str]:
|
||||
"""Normalize a list of tags for UI display (deduped, canonical casing)."""
|
||||
if not tags:
|
||||
return []
|
||||
self._ensure_tag_metadata()
|
||||
normalized = []
|
||||
seen = set()
|
||||
for tag in tags:
|
||||
normalized_tag = self.normalize_tag_value(tag)
|
||||
key = normalized_tag.lower()
|
||||
if key and key not in seen:
|
||||
normalized.append(normalized_tag)
|
||||
seen.add(key)
|
||||
return normalized
|
||||
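The tag normalization above boils down to two lowercase lookups: variant to canonical via tag_mappings.json, then canonical to the preferred casing via allowed_tags.json. A minimal standalone sketch of that flow, using hypothetical hard-coded mappings in place of the GitHub downloads, might look like this:

```python
# Standalone sketch of the tag-normalization flow (hypothetical sample data,
# not the actual contents of tag_mappings.json / allowed_tags.json).
from typing import Dict, List, Optional, Set

tag_mappings: Dict[str, str] = {"total conversion": "Total Overhaul"}   # variant -> canonical
allowed_tags: Set[str] = {"Total Overhaul", "Survival"}                 # canonical display forms

mapping_lookup = {k.lower(): v for k, v in tag_mappings.items()}
allowed_lookup = {t.lower(): t for t in allowed_tags}

def normalize(tag: str) -> str:
    key = tag.strip().lower()
    canonical = mapping_lookup.get(key, tag.strip())
    return allowed_lookup.get(canonical.lower(), canonical)

def normalize_all(tags: Optional[List[str]]) -> List[str]:
    out, seen = [], set()
    for tag in tags or []:
        norm = normalize(tag)
        if norm.lower() not in seen:
            out.append(norm)
            seen.add(norm.lower())
    return out

print(normalize_all(["total conversion", "SURVIVAL", "survival"]))
# -> ['Total Overhaul', 'Survival']
```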
@@ -34,8 +34,10 @@ class ModlistService:
|
||||
"""Lazy initialization of modlist handler."""
|
||||
if self._modlist_handler is None:
|
||||
- from ..handlers.modlist_handler import ModlistHandler
- # Initialize with proper dependencies
- self._modlist_handler = ModlistHandler()
+ from ..services.platform_detection_service import PlatformDetectionService
+ # Initialize with proper dependencies and centralized Steam Deck detection
+ platform_service = PlatformDetectionService.get_instance()
+ self._modlist_handler = ModlistHandler(steamdeck=platform_service.is_steamdeck)
|
||||
return self._modlist_handler
|
||||
|
||||
def _get_wabbajack_handler(self):
|
||||
@@ -283,8 +285,9 @@ class ModlistService:
|
||||
output_callback(f"Jackify Install Engine not found or not executable at: {engine_path}")
|
||||
return False
|
||||
|
||||
  # Build command (copied from working code)
- cmd = [engine_path, 'install']
+ cmd = [engine_path, 'install', '--show-file-progress']
|
||||
|
||||
modlist_value = context.get('modlist_value')
|
||||
if modlist_value and modlist_value.endswith('.wabbajack') and os.path.isfile(modlist_value):
|
||||
cmd += ['-w', modlist_value]
|
||||
@@ -324,8 +327,10 @@ class ModlistService:
|
||||
else:
|
||||
output_callback(f"File descriptor limit warning: {message}")
|
||||
|
||||
- # Subprocess call (copied from working code)
- proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)
+ # Subprocess call with cleaned environment to prevent AppImage variable inheritance
+ from jackify.backend.handlers.subprocess_utils import get_clean_subprocess_env
+ clean_env = get_clean_subprocess_env()
+ proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=clean_env, cwd=engine_dir)
|
||||
|
||||
# Output processing (copied from working code)
|
||||
buffer = b''
|
||||
|
||||
@@ -135,6 +135,9 @@ class NativeSteamOperationsService:
|
||||
steam_locations = [
|
||||
Path.home() / ".steam/steam",
|
||||
Path.home() / ".local/share/Steam",
|
||||
# Flatpak Steam - direct data directory
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam",
|
||||
# Flatpak Steam - symlinked home paths
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/home/.steam/steam",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/home/.local/share/Steam"
|
||||
]
|
||||
@@ -161,6 +164,9 @@ class NativeSteamOperationsService:
|
||||
standard_locations = [
|
||||
Path.home() / ".steam/steam/steamapps/compatdata",
|
||||
Path.home() / ".local/share/Steam/steamapps/compatdata",
|
||||
# Flatpak Steam - direct data directory
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/steamapps/compatdata",
|
||||
# Flatpak Steam - symlinked home paths
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/home/.steam/steam/steamapps/compatdata",
|
||||
Path.home() / ".var/app/com.valvesoftware.Steam/home/.local/share/Steam/steamapps/compatdata"
|
||||
]
|
||||
@@ -171,7 +177,7 @@ class NativeSteamOperationsService:
|
||||
|
||||
# Also check additional Steam libraries via libraryfolders.vdf
|
||||
try:
|
||||
- from jackify.shared.paths import PathHandler
+ from jackify.backend.handlers.path_handler import PathHandler
|
||||
all_steam_libs = PathHandler.get_all_steam_library_paths()
|
||||
|
||||
for lib_path in all_steam_libs:
|
||||
|
||||
@@ -33,7 +33,9 @@ class NativeSteamService:
|
||||
self.steam_paths = [
|
||||
Path.home() / ".steam" / "steam",
|
||||
Path.home() / ".local" / "share" / "Steam",
|
||||
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / ".local" / "share" / "Steam"
|
||||
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / "data" / "Steam",
|
||||
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / ".local" / "share" / "Steam",
|
||||
Path.home() / ".var" / "app" / "com.valvesoftware.Steam" / "home" / ".local" / "share" / "Steam"
|
||||
]
|
||||
self.steam_path = None
|
||||
self.userdata_path = None
|
||||
@@ -54,8 +56,20 @@ class NativeSteamService:
|
||||
# Step 2: Parse loginusers.vdf to get the most recent user (SteamID64)
|
||||
steamid64 = self._get_most_recent_user_from_loginusers()
|
||||
if not steamid64:
|
||||
logger.error("Could not determine most recent Steam user from loginusers.vdf")
|
||||
return False
|
||||
logger.warning("Could not determine most recent Steam user from loginusers.vdf, trying fallback method")
|
||||
# Fallback: Look for existing user directories in userdata
|
||||
steamid3 = self._find_user_from_userdata_directory()
|
||||
if steamid3:
|
||||
logger.info(f"Found Steam user using userdata directory fallback: SteamID3={steamid3}")
|
||||
# Skip the conversion step since we already have SteamID3
|
||||
self.user_id = str(steamid3)
|
||||
self.user_config_path = self.userdata_path / str(steamid3) / "config"
|
||||
logger.info(f"Steam user set up via fallback: {self.user_id}")
|
||||
logger.info(f"User config path: {self.user_config_path}")
|
||||
return True
|
||||
else:
|
||||
logger.error("Could not determine Steam user using any method")
|
||||
return False
|
||||
|
||||
# Step 3: Convert SteamID64 to SteamID3 (userdata directory format)
|
||||
steamid3 = self._convert_steamid64_to_steamid3(steamid64)
|
||||
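For context on the `_convert_steamid64_to_steamid3` step referenced above (its body is not part of this diff): for individual accounts, the userdata folder name is the SteamID64 minus Valve's fixed 64-bit base offset. A hedged sketch of that standard conversion:

```python
# Sketch of the SteamID64 -> SteamID3 (account ID) conversion; the method body
# is not shown in this diff, so this illustrates the standard formula only.
STEAM64_BASE = 76561197960265728  # offset used for individual Steam accounts

def convert_steamid64_to_steamid3(steamid64: str) -> int:
    """Return the account ID used as the userdata/<id>/ directory name."""
    return int(steamid64) - STEAM64_BASE

print(convert_steamid64_to_steamid3("76561198000000001"))  # -> 39734273
```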
@@ -467,14 +481,34 @@ class NativeSteamService:
|
||||
Returns:
|
||||
(success, app_id) - Success status and the AppID
|
||||
"""
|
||||
# Auto-detect best Proton version if none provided
|
||||
# Use Game Proton from settings for shortcut creation (not Install Proton)
|
||||
if proton_version is None:
|
||||
try:
|
||||
from jackify.backend.core.modlist_operations import _get_user_proton_version
|
||||
proton_version = _get_user_proton_version()
|
||||
logger.info(f"Auto-detected Proton version: {proton_version}")
|
||||
from jackify.backend.handlers.config_handler import ConfigHandler
|
||||
config_handler = ConfigHandler()
|
||||
game_proton_path = config_handler.get_game_proton_path()
|
||||
|
||||
if game_proton_path and game_proton_path != 'auto':
|
||||
# User has selected Game Proton - use it
|
||||
proton_version = os.path.basename(game_proton_path)
|
||||
# Convert to Steam format
|
||||
if not proton_version.startswith('GE-Proton'):
|
||||
proton_version = proton_version.lower().replace(' - ', '_').replace(' ', '_').replace('-', '_')
|
||||
if not proton_version.startswith('proton'):
|
||||
proton_version = f"proton_{proton_version}"
|
||||
logger.info(f"Using Game Proton from settings: {proton_version}")
|
||||
else:
|
||||
# Fallback to auto-detect if Game Proton not set
|
||||
from jackify.backend.handlers.wine_utils import WineUtils
|
||||
best_proton = WineUtils.select_best_proton()
|
||||
if best_proton:
|
||||
proton_version = best_proton['name']
|
||||
logger.info(f"Auto-detected Game Proton: {proton_version}")
|
||||
else:
|
||||
proton_version = "proton_experimental"
|
||||
logger.warning("Failed to auto-detect Game Proton, falling back to experimental")
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to auto-detect Proton, falling back to experimental: {e}")
|
||||
logger.warning(f"Failed to get Game Proton, falling back to experimental: {e}")
|
||||
proton_version = "proton_experimental"
|
||||
|
||||
logger.info(f"Creating shortcut with Proton: '{app_name}' -> '{proton_version}'")
|
||||
|
||||
258  jackify/backend/services/nexus_auth_service.py  (new file)
@@ -0,0 +1,258 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
Nexus Authentication Service
|
||||
Unified service for Nexus authentication using OAuth or API key fallback
|
||||
"""
|
||||
|
||||
import logging
|
||||
from typing import Optional, Tuple
|
||||
from .nexus_oauth_service import NexusOAuthService
|
||||
from ..handlers.oauth_token_handler import OAuthTokenHandler
|
||||
from .api_key_service import APIKeyService
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class NexusAuthService:
|
||||
"""
|
||||
Unified authentication service for Nexus Mods
|
||||
Handles OAuth 2.0 (preferred) with API key fallback (legacy)
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize authentication service"""
|
||||
self.oauth_service = NexusOAuthService()
|
||||
self.token_handler = OAuthTokenHandler()
|
||||
self.api_key_service = APIKeyService()
|
||||
logger.debug("NexusAuthService initialized")
|
||||
|
||||
def get_auth_token(self) -> Optional[str]:
|
||||
"""
|
||||
Get authentication token, preferring OAuth over API key
|
||||
|
||||
Returns:
|
||||
Access token or API key, or None if no authentication available
|
||||
"""
|
||||
# Try OAuth first
|
||||
oauth_token = self._get_oauth_token()
|
||||
if oauth_token:
|
||||
logger.debug("Using OAuth token for authentication")
|
||||
return oauth_token
|
||||
|
||||
# Fall back to API key
|
||||
api_key = self.api_key_service.get_saved_api_key()
|
||||
if api_key:
|
||||
logger.debug("Using API key for authentication (OAuth not available)")
|
||||
return api_key
|
||||
|
||||
logger.warning("No authentication available (neither OAuth nor API key)")
|
||||
return None
|
||||
|
||||
def _get_oauth_token(self) -> Optional[str]:
|
||||
"""
|
||||
Get OAuth access token, refreshing if needed
|
||||
|
||||
Returns:
|
||||
Valid access token or None
|
||||
"""
|
||||
# Check if we have a stored token
|
||||
if not self.token_handler.has_token():
|
||||
logger.debug("No OAuth token stored")
|
||||
return None
|
||||
|
||||
# Check if token is expired (15 minute buffer for long installs)
|
||||
if self.token_handler.is_token_expired(buffer_minutes=15):
|
||||
logger.info("OAuth token expiring soon, attempting refresh")
|
||||
|
||||
# Try to refresh
|
||||
refresh_token = self.token_handler.get_refresh_token()
|
||||
if refresh_token:
|
||||
new_token_data = self.oauth_service.refresh_token(refresh_token)
|
||||
|
||||
if new_token_data:
|
||||
# Save refreshed token
|
||||
self.token_handler.save_token({'oauth': new_token_data})
|
||||
logger.info("OAuth token refreshed successfully")
|
||||
return new_token_data.get('access_token')
|
||||
else:
|
||||
logger.warning("Token refresh failed, OAuth token invalid")
|
||||
# Delete invalid token
|
||||
self.token_handler.delete_token()
|
||||
return None
|
||||
else:
|
||||
logger.warning("No refresh token available")
|
||||
return None
|
||||
|
||||
# Token is valid, return it
|
||||
return self.token_handler.get_access_token()
|
||||
|
||||
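The refresh decision in `_get_oauth_token` hinges on the 15-minute expiry buffer, so a long-running install does not start with a token that dies mid-download. A small standalone sketch of that buffer check (the real expiry bookkeeping lives in `OAuthTokenHandler`, whose internals are not shown here, so the field names below are assumptions):

```python
# Illustration of an expiry check with a safety buffer, similar in spirit to
# OAuthTokenHandler.is_token_expired(buffer_minutes=15); field names are assumed.
from datetime import datetime, timedelta, timezone

def is_token_expired(expires_at: datetime, buffer_minutes: int = 15) -> bool:
    """Treat the token as expired once it is within `buffer_minutes` of expiry."""
    return datetime.now(timezone.utc) >= expires_at - timedelta(minutes=buffer_minutes)

expires_at = datetime.now(timezone.utc) + timedelta(minutes=10)
print(is_token_expired(expires_at))  # True: only 10 minutes left, refresh now
```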
def is_authenticated(self) -> bool:
|
||||
"""
|
||||
Check if user is authenticated via OAuth or API key
|
||||
|
||||
Returns:
|
||||
True if authenticated
|
||||
"""
|
||||
return self.get_auth_token() is not None
|
||||
|
||||
def get_auth_method(self) -> Optional[str]:
|
||||
"""
|
||||
Get current authentication method
|
||||
|
||||
Returns:
|
||||
'oauth', 'api_key', or None
|
||||
"""
|
||||
# Check OAuth first
|
||||
oauth_token = self._get_oauth_token()
|
||||
if oauth_token:
|
||||
return 'oauth'
|
||||
|
||||
# Check API key
|
||||
api_key = self.api_key_service.get_saved_api_key()
|
||||
if api_key:
|
||||
return 'api_key'
|
||||
|
||||
return None
|
||||
|
||||
def get_auth_status(self) -> Tuple[bool, str, Optional[str]]:
|
||||
"""
|
||||
Get detailed authentication status
|
||||
|
||||
Returns:
|
||||
Tuple of (authenticated, method, username)
|
||||
- authenticated: True if authenticated
|
||||
- method: 'oauth', 'oauth_expired', 'api_key', or 'none'
|
||||
- username: Username if available (OAuth only), or None
|
||||
"""
|
||||
# Check if OAuth token exists
|
||||
if self.token_handler.has_token():
|
||||
# Check if refresh token is likely expired (hasn't been refreshed in 30+ days)
|
||||
token_info = self.token_handler.get_token_info()
|
||||
if token_info.get('refresh_token_likely_expired'):
|
||||
logger.warning("Refresh token likely expired (30+ days old), user should re-authorize")
|
||||
return False, 'oauth_expired', None
|
||||
|
||||
# Try OAuth
|
||||
oauth_token = self._get_oauth_token()
|
||||
if oauth_token:
|
||||
# Try to get username from userinfo
|
||||
user_info = self.oauth_service.get_user_info(oauth_token)
|
||||
username = user_info.get('name') if user_info else None
|
||||
return True, 'oauth', username
|
||||
elif self.token_handler.has_token():
|
||||
# Had token but couldn't get valid access token (refresh failed)
|
||||
logger.warning("OAuth token refresh failed, token may be invalid")
|
||||
return False, 'oauth_expired', None
|
||||
|
||||
# Try API key
|
||||
api_key = self.api_key_service.get_saved_api_key()
|
||||
if api_key:
|
||||
return True, 'api_key', None
|
||||
|
||||
return False, 'none', None
|
||||
|
||||
def authorize_oauth(self, show_browser_message_callback=None) -> bool:
|
||||
"""
|
||||
Perform OAuth authorization flow
|
||||
|
||||
Args:
|
||||
show_browser_message_callback: Optional callback for browser messages
|
||||
|
||||
Returns:
|
||||
True if authorization successful
|
||||
"""
|
||||
logger.info("Starting OAuth authorization")
|
||||
|
||||
token_data = self.oauth_service.authorize(show_browser_message_callback)
|
||||
|
||||
if token_data:
|
||||
# Save token
|
||||
success = self.token_handler.save_token({'oauth': token_data})
|
||||
if success:
|
||||
logger.info("OAuth authorization completed successfully")
|
||||
return True
|
||||
else:
|
||||
logger.error("Failed to save OAuth token")
|
||||
return False
|
||||
else:
|
||||
logger.error("OAuth authorization failed")
|
||||
return False
|
||||
|
||||
def revoke_oauth(self) -> bool:
|
||||
"""
|
||||
Revoke OAuth authorization by deleting stored token
|
||||
|
||||
Returns:
|
||||
True if revoked successfully
|
||||
"""
|
||||
logger.info("Revoking OAuth authorization")
|
||||
return self.token_handler.delete_token()
|
||||
|
||||
def save_api_key(self, api_key: str) -> bool:
|
||||
"""
|
||||
Save API key (legacy fallback)
|
||||
|
||||
Args:
|
||||
api_key: Nexus API key
|
||||
|
||||
Returns:
|
||||
True if saved successfully
|
||||
"""
|
||||
return self.api_key_service.save_api_key(api_key)
|
||||
|
||||
def validate_api_key(self, api_key: Optional[str] = None) -> Tuple[bool, Optional[str]]:
|
||||
"""
|
||||
Validate API key against Nexus API
|
||||
|
||||
Args:
|
||||
api_key: Optional API key to validate (uses stored if not provided)
|
||||
|
||||
Returns:
|
||||
Tuple of (valid, username_or_error)
|
||||
"""
|
||||
return self.api_key_service.validate_api_key(api_key)
|
||||
|
||||
def ensure_valid_auth(self) -> Optional[str]:
|
||||
"""
|
||||
Ensure we have valid authentication, refreshing if needed
|
||||
This should be called before any Nexus operation
|
||||
|
||||
Returns:
|
||||
Valid auth token (OAuth access token or API key), or None
|
||||
"""
|
||||
auth_token = self.get_auth_token()
|
||||
|
||||
if not auth_token:
|
||||
logger.warning("No authentication available for Nexus operation")
|
||||
|
||||
return auth_token
|
||||
|
||||
def get_auth_for_engine(self) -> Optional[str]:
|
||||
"""
|
||||
Get authentication token for jackify-engine
|
||||
Same as ensure_valid_auth() - engine uses NEXUS_API_KEY env var for both OAuth and API keys
|
||||
(This matches upstream Wabbajack behavior)
|
||||
|
||||
Returns:
|
||||
Valid auth token to pass via NEXUS_API_KEY environment variable, or None
|
||||
"""
|
||||
return self.ensure_valid_auth()
|
||||
|
||||
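As `get_auth_for_engine` notes, the engine receives whichever credential is current (OAuth access token or API key) through the `NEXUS_API_KEY` environment variable. A hedged sketch of how a caller might wire that up; the engine invocation here is illustrative, not lifted from Jackify's install code:

```python
# Illustrative only: shows the NEXUS_API_KEY hand-off described in the docstring.
import os
import subprocess

def run_engine_with_auth(auth_service, engine_path: str, args: list) -> int:
    token = auth_service.get_auth_for_engine()
    env = os.environ.copy()
    if token:
        env["NEXUS_API_KEY"] = token  # engine treats OAuth tokens and API keys the same way
    proc = subprocess.run([engine_path, *args], env=env)
    return proc.returncode
```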
def clear_all_auth(self) -> bool:
|
||||
"""
|
||||
Clear all authentication (both OAuth and API key)
|
||||
Useful for testing or switching accounts
|
||||
|
||||
Returns:
|
||||
True if any auth was cleared
|
||||
"""
|
||||
oauth_cleared = self.token_handler.delete_token()
|
||||
api_key_cleared = self.api_key_service.clear_api_key()
|
||||
|
||||
if oauth_cleared or api_key_cleared:
|
||||
logger.info("Cleared all Nexus authentication")
|
||||
return True
|
||||
else:
|
||||
logger.debug("No authentication to clear")
|
||||
return False
|
||||
759  jackify/backend/services/nexus_oauth_service.py  (new file)
@@ -0,0 +1,759 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
Nexus OAuth Service
|
||||
Handles OAuth 2.0 authentication flow with Nexus Mods using PKCE
|
||||
"""
|
||||
|
||||
import os
|
||||
import base64
|
||||
import hashlib
|
||||
import secrets
|
||||
import webbrowser
|
||||
import urllib.parse
|
||||
from http.server import HTTPServer, BaseHTTPRequestHandler
|
||||
import requests
|
||||
import json
|
||||
import threading
|
||||
import ssl
|
||||
import tempfile
|
||||
import logging
|
||||
import time
|
||||
import subprocess
|
||||
from typing import Optional, Tuple, Dict
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class NexusOAuthService:
|
||||
"""
|
||||
Handles OAuth 2.0 authentication with Nexus Mods
|
||||
Uses PKCE flow with system browser and localhost callback
|
||||
"""
|
||||
|
||||
# OAuth Configuration
|
||||
CLIENT_ID = "jackify"
|
||||
AUTH_URL = "https://users.nexusmods.com/oauth/authorize"
|
||||
TOKEN_URL = "https://users.nexusmods.com/oauth/token"
|
||||
USERINFO_URL = "https://users.nexusmods.com/oauth/userinfo"
|
||||
SCOPES = "public openid profile"
|
||||
|
||||
# Redirect configuration (custom protocol scheme - no SSL cert needed!)
|
||||
# Requires jackify:// protocol handler to be registered with OS
|
||||
REDIRECT_URI = "jackify://oauth/callback"
|
||||
|
||||
# Callback timeout (5 minutes)
|
||||
CALLBACK_TIMEOUT = 300
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize OAuth service"""
|
||||
self._auth_code = None
|
||||
self._auth_state = None
|
||||
self._auth_error = None
|
||||
self._server_done = threading.Event()
|
||||
|
||||
# Ensure jackify:// protocol is registered on first use
|
||||
self._ensure_protocol_registered()
|
||||
|
||||
def _generate_pkce_params(self) -> Tuple[str, str, str]:
|
||||
"""
|
||||
Generate PKCE code verifier, challenge, and state
|
||||
|
||||
Returns:
|
||||
Tuple of (code_verifier, code_challenge, state)
|
||||
"""
|
||||
# Generate code verifier (43-128 characters, base64url encoded)
|
||||
code_verifier = base64.urlsafe_b64encode(
|
||||
os.urandom(32)
|
||||
).decode('utf-8').rstrip('=')
|
||||
|
||||
# Generate code challenge (SHA256 hash of verifier, base64url encoded)
|
||||
code_challenge = base64.urlsafe_b64encode(
|
||||
hashlib.sha256(code_verifier.encode('utf-8')).digest()
|
||||
).decode('utf-8').rstrip('=')
|
||||
|
||||
# Generate state for CSRF protection
|
||||
state = secrets.token_urlsafe(32)
|
||||
|
||||
return code_verifier, code_challenge, state
|
||||
|
||||
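The PKCE values generated above are related in one fixed way (RFC 7636 `S256`): the challenge sent in the authorization URL must equal the unpadded base64url SHA-256 of the verifier later sent in the token exchange. A quick self-check of that invariant:

```python
# Verifies the S256 relationship between a PKCE verifier and its challenge.
import base64
import hashlib
import os

verifier = base64.urlsafe_b64encode(os.urandom(32)).decode().rstrip("=")
challenge = base64.urlsafe_b64encode(
    hashlib.sha256(verifier.encode()).digest()
).decode().rstrip("=")

def challenge_matches(verifier: str, challenge: str) -> bool:
    expected = base64.urlsafe_b64encode(
        hashlib.sha256(verifier.encode()).digest()
    ).decode().rstrip("=")
    return expected == challenge

assert challenge_matches(verifier, challenge)
print("PKCE S256 pair is consistent")
```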
def _ensure_protocol_registered(self) -> bool:
|
||||
"""
|
||||
Ensure jackify:// protocol is registered with the OS
|
||||
|
||||
Returns:
|
||||
True if registration successful or already registered
|
||||
"""
|
||||
import subprocess
|
||||
import sys
|
||||
from pathlib import Path
|
||||
|
||||
if not sys.platform.startswith('linux'):
|
||||
logger.debug("Protocol registration only needed on Linux")
|
||||
return True
|
||||
|
||||
try:
|
||||
# Ensure desktop file exists and has correct Exec path
|
||||
desktop_file = Path.home() / ".local" / "share" / "applications" / "com.jackify.app.desktop"
|
||||
|
||||
# Get environment for AppImage detection
|
||||
env = os.environ
|
||||
|
||||
# Determine executable path (DEV mode vs AppImage)
|
||||
# Check multiple indicators for AppImage execution
|
||||
is_appimage = (
|
||||
getattr(sys, 'frozen', False) or # PyInstaller frozen
|
||||
'APPIMAGE' in env or # AppImage environment variable
|
||||
'APPDIR' in env or # AppImage directory variable
|
||||
(sys.argv[0] and sys.argv[0].endswith('.AppImage')) # Executable name
|
||||
)
|
||||
|
||||
if is_appimage:
|
||||
# Running from AppImage - use the AppImage path directly
|
||||
# CRITICAL: Never use -m flag in AppImage mode - it causes __main__.py windows
|
||||
if 'APPIMAGE' in env:
|
||||
# APPIMAGE env var gives us the exact path to the AppImage
|
||||
exec_path = env['APPIMAGE']
|
||||
logger.info(f"Using APPIMAGE env var: {exec_path}")
|
||||
elif sys.argv[0] and Path(sys.argv[0]).exists():
|
||||
# Use sys.argv[0] if it's a valid path
|
||||
exec_path = str(Path(sys.argv[0]).resolve())
|
||||
logger.info(f"Using resolved sys.argv[0]: {exec_path}")
|
||||
else:
|
||||
# Fallback to sys.argv[0] as-is
|
||||
exec_path = sys.argv[0]
|
||||
logger.warning(f"Using sys.argv[0] as fallback: {exec_path}")
|
||||
else:
|
||||
# Running from source (DEV mode)
|
||||
# Need to ensure we run from the correct directory
|
||||
src_dir = Path(__file__).parent.parent.parent.parent # Go up to src/
|
||||
exec_path = f"cd {src_dir} && {sys.executable} -m jackify.frontends.gui"
|
||||
logger.info(f"DEV mode exec path: {exec_path}")
|
||||
logger.info(f"Source directory: {src_dir}")
|
||||
|
||||
# Check if desktop file needs creation or update
|
||||
needs_update = False
|
||||
if not desktop_file.exists():
|
||||
needs_update = True
|
||||
logger.info("Creating desktop file for protocol handler")
|
||||
else:
|
||||
# Check if Exec path matches current mode
|
||||
current_content = desktop_file.read_text()
|
||||
if f"Exec={exec_path} %u" not in current_content:
|
||||
needs_update = True
|
||||
logger.info(f"Updating desktop file with new Exec path: {exec_path}")
|
||||
|
||||
if needs_update:
|
||||
desktop_file.parent.mkdir(parents=True, exist_ok=True)
|
||||
|
||||
# Build desktop file content with proper working directory
|
||||
if is_appimage:
|
||||
# AppImage doesn't need working directory
|
||||
desktop_content = f"""[Desktop Entry]
|
||||
Type=Application
|
||||
Name=Jackify
|
||||
Comment=Wabbajack modlist manager for Linux
|
||||
Exec={exec_path} %u
|
||||
Icon=com.jackify.app
|
||||
Terminal=false
|
||||
Categories=Game;Utility;
|
||||
MimeType=x-scheme-handler/jackify;
|
||||
"""
|
||||
else:
|
||||
# DEV mode needs working directory set to src/
|
||||
# exec_path already contains the correct format: "cd {src_dir} && {sys.executable} -m jackify.frontends.gui"
|
||||
src_dir = Path(__file__).parent.parent.parent.parent # Go up to src/
|
||||
desktop_content = f"""[Desktop Entry]
|
||||
Type=Application
|
||||
Name=Jackify
|
||||
Comment=Wabbajack modlist manager for Linux
|
||||
Exec={exec_path} %u
|
||||
Icon=com.jackify.app
|
||||
Terminal=false
|
||||
Categories=Game;Utility;
|
||||
MimeType=x-scheme-handler/jackify;
|
||||
Path={src_dir}
|
||||
"""
|
||||
|
||||
desktop_file.write_text(desktop_content)
|
||||
logger.info(f"Desktop file written: {desktop_file}")
|
||||
logger.info(f"Exec path: {exec_path}")
|
||||
logger.info(f"AppImage mode: {is_appimage}")
|
||||
|
||||
# Always ensure full registration (don't trust xdg-settings alone)
|
||||
# PopOS/Ubuntu need mimeapps.list even if xdg-settings says registered
|
||||
logger.info("Registering jackify:// protocol handler")
|
||||
|
||||
# Update MIME cache (required for Firefox dialog)
|
||||
apps_dir = Path.home() / ".local" / "share" / "applications"
|
||||
subprocess.run(
|
||||
['update-desktop-database', str(apps_dir)],
|
||||
capture_output=True,
|
||||
timeout=10
|
||||
)
|
||||
|
||||
# Set as default handler using xdg-mime (Firefox compatibility)
|
||||
subprocess.run(
|
||||
['xdg-mime', 'default', 'com.jackify.app.desktop', 'x-scheme-handler/jackify'],
|
||||
capture_output=True,
|
||||
timeout=10
|
||||
)
|
||||
|
||||
# Also use xdg-settings as backup (some systems need both)
|
||||
subprocess.run(
|
||||
['xdg-settings', 'set', 'default-url-scheme-handler', 'jackify', 'com.jackify.app.desktop'],
|
||||
capture_output=True,
|
||||
timeout=10
|
||||
)
|
||||
|
||||
# Manually ensure entry in mimeapps.list (PopOS/Ubuntu require this for GIO)
|
||||
mimeapps_path = Path.home() / ".config" / "mimeapps.list"
|
||||
try:
|
||||
# Read existing content
|
||||
if mimeapps_path.exists():
|
||||
content = mimeapps_path.read_text()
|
||||
else:
|
||||
mimeapps_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
content = "[Default Applications]\n"
|
||||
|
||||
# Add jackify handler if not present
|
||||
if 'x-scheme-handler/jackify=' not in content:
|
||||
if '[Default Applications]' not in content:
|
||||
content = "[Default Applications]\n" + content
|
||||
|
||||
# Insert after [Default Applications] line
|
||||
lines = content.split('\n')
|
||||
for i, line in enumerate(lines):
|
||||
if line.strip() == '[Default Applications]':
|
||||
lines.insert(i + 1, 'x-scheme-handler/jackify=com.jackify.app.desktop')
|
||||
break
|
||||
|
||||
content = '\n'.join(lines)
|
||||
mimeapps_path.write_text(content)
|
||||
logger.info("Added jackify handler to mimeapps.list")
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to update mimeapps.list: {e}")
|
||||
|
||||
logger.info("jackify:// protocol registered successfully")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Failed to register jackify:// protocol: {e}")
|
||||
return False
|
||||
|
||||
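When the registration above misbehaves (for example on distros that honour `mimeapps.list` but not `xdg-settings`), it helps to check what the desktop currently resolves `jackify://` to. One way to do that from a shell-out, shown here as a hedged sketch rather than code from this module:

```python
# Sketch: query the current handler for the jackify:// scheme via xdg-mime.
import subprocess

def query_jackify_handler() -> str:
    try:
        result = subprocess.run(
            ["xdg-mime", "query", "default", "x-scheme-handler/jackify"],
            capture_output=True, text=True, timeout=10,
        )
        return result.stdout.strip()  # e.g. "com.jackify.app.desktop" when registered
    except FileNotFoundError:
        return ""  # xdg-mime not available on this system

print(query_jackify_handler() or "no handler registered")
```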
def _generate_self_signed_cert(self) -> Tuple[Optional[str], Optional[str]]:
|
||||
"""
|
||||
Generate self-signed certificate for HTTPS localhost
|
||||
|
||||
Returns:
|
||||
Tuple of (cert_file_path, key_file_path) or (None, None) on failure
|
||||
"""
|
||||
try:
|
||||
from cryptography import x509
|
||||
from cryptography.x509.oid import NameOID
|
||||
from cryptography.hazmat.primitives import hashes
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
import datetime
|
||||
import ipaddress
|
||||
|
||||
logger.info("Generating self-signed certificate for OAuth callback")
|
||||
|
||||
# Generate private key
|
||||
private_key = rsa.generate_private_key(
|
||||
public_exponent=65537,
|
||||
key_size=2048,
|
||||
)
|
||||
|
||||
# Create certificate
|
||||
subject = issuer = x509.Name([
|
||||
x509.NameAttribute(NameOID.COUNTRY_NAME, "US"),
|
||||
x509.NameAttribute(NameOID.ORGANIZATION_NAME, "Jackify"),
|
||||
x509.NameAttribute(NameOID.COMMON_NAME, self.REDIRECT_HOST),
|
||||
])
|
||||
|
||||
cert = x509.CertificateBuilder().subject_name(
|
||||
subject
|
||||
).issuer_name(
|
||||
issuer
|
||||
).public_key(
|
||||
private_key.public_key()
|
||||
).serial_number(
|
||||
x509.random_serial_number()
|
||||
).not_valid_before(
|
||||
datetime.datetime.now(datetime.UTC)
|
||||
).not_valid_after(
|
||||
datetime.datetime.now(datetime.UTC) + datetime.timedelta(days=365)
|
||||
).add_extension(
|
||||
x509.SubjectAlternativeName([
|
||||
x509.IPAddress(ipaddress.IPv4Address(self.REDIRECT_HOST)),
|
||||
]),
|
||||
critical=False,
|
||||
).sign(private_key, hashes.SHA256())
|
||||
|
||||
# Save to temp files
|
||||
temp_dir = tempfile.mkdtemp()
|
||||
cert_file = os.path.join(temp_dir, "oauth_cert.pem")
|
||||
key_file = os.path.join(temp_dir, "oauth_key.pem")
|
||||
|
||||
with open(cert_file, "wb") as f:
|
||||
f.write(cert.public_bytes(serialization.Encoding.PEM))
|
||||
|
||||
with open(key_file, "wb") as f:
|
||||
f.write(private_key.private_bytes(
|
||||
encoding=serialization.Encoding.PEM,
|
||||
format=serialization.PrivateFormat.TraditionalOpenSSL,
|
||||
encryption_algorithm=serialization.NoEncryption()
|
||||
))
|
||||
|
||||
return cert_file, key_file
|
||||
|
||||
except ImportError:
|
||||
logger.error("cryptography package not installed - required for OAuth")
|
||||
return None, None
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to generate SSL certificate: {e}")
|
||||
return None, None
|
||||
|
||||
def _build_authorization_url(self, code_challenge: str, state: str) -> str:
|
||||
"""
|
||||
Build OAuth authorization URL
|
||||
|
||||
Args:
|
||||
code_challenge: PKCE code challenge
|
||||
state: CSRF protection state
|
||||
|
||||
Returns:
|
||||
Authorization URL
|
||||
"""
|
||||
params = {
|
||||
'response_type': 'code',
|
||||
'client_id': self.CLIENT_ID,
|
||||
'redirect_uri': self.REDIRECT_URI,
|
||||
'scope': self.SCOPES,
|
||||
'code_challenge': code_challenge,
|
||||
'code_challenge_method': 'S256',
|
||||
'state': state
|
||||
}
|
||||
|
||||
return f"{self.AUTH_URL}?{urllib.parse.urlencode(params)}"
|
||||
|
||||
def _create_callback_handler(self):
|
||||
"""Create HTTP request handler class for OAuth callback"""
|
||||
service = self
|
||||
|
||||
class OAuthCallbackHandler(BaseHTTPRequestHandler):
|
||||
"""HTTP request handler for OAuth callback"""
|
||||
|
||||
def log_message(self, format, *args):
|
||||
"""Log OAuth callback requests"""
|
||||
logger.debug(f"OAuth callback: {format % args}")
|
||||
|
||||
def do_GET(self):
|
||||
"""Handle GET request from OAuth redirect"""
|
||||
logger.info(f"OAuth callback received: {self.path}")
|
||||
|
||||
# Parse query parameters
|
||||
parsed = urllib.parse.urlparse(self.path)
|
||||
params = urllib.parse.parse_qs(parsed.query)
|
||||
|
||||
# Ignore favicon and other non-OAuth requests
|
||||
if parsed.path == '/favicon.ico':
|
||||
self.send_response(404)
|
||||
self.end_headers()
|
||||
return
|
||||
|
||||
if 'code' in params:
|
||||
service._auth_code = params['code'][0]
|
||||
service._auth_state = params.get('state', [None])[0]
|
||||
logger.info(f"OAuth authorization code received: {service._auth_code[:10]}...")
|
||||
|
||||
# Send success response
|
||||
self.send_response(200)
|
||||
self.send_header('Content-type', 'text/html')
|
||||
self.end_headers()
|
||||
|
||||
html = """
|
||||
<html>
|
||||
<head><title>Authorization Successful</title></head>
|
||||
<body style="font-family: Arial, sans-serif; text-align: center; padding: 50px;">
|
||||
<h1>Authorization Successful!</h1>
|
||||
<p>You can close this window and return to Jackify.</p>
|
||||
<script>setTimeout(function() { window.close(); }, 3000);</script>
|
||||
</body>
|
||||
</html>
|
||||
"""
|
||||
self.wfile.write(html.encode())
|
||||
|
||||
elif 'error' in params:
|
||||
service._auth_error = params['error'][0]
|
||||
error_desc = params.get('error_description', ['Unknown error'])[0]
|
||||
|
||||
# Send error response
|
||||
self.send_response(200)
|
||||
self.send_header('Content-type', 'text/html')
|
||||
self.end_headers()
|
||||
|
||||
html = f"""
|
||||
<html>
|
||||
<head><title>Authorization Failed</title></head>
|
||||
<body style="font-family: Arial, sans-serif; text-align: center; padding: 50px;">
|
||||
<h1>Authorization Failed</h1>
|
||||
<p>Error: {service._auth_error}</p>
|
||||
<p>{error_desc}</p>
|
||||
<p>You can close this window and try again in Jackify.</p>
|
||||
</body>
|
||||
</html>
|
||||
"""
|
||||
self.wfile.write(html.encode())
|
||||
else:
|
||||
# Unexpected callback format
|
||||
logger.warning(f"OAuth callback with no code or error: {params}")
|
||||
self.send_response(400)
|
||||
self.send_header('Content-type', 'text/html')
|
||||
self.end_headers()
|
||||
html = """
|
||||
<html>
|
||||
<head><title>Invalid Request</title></head>
|
||||
<body style="font-family: Arial, sans-serif; text-align: center; padding: 50px;">
|
||||
<h1>Invalid OAuth Callback</h1>
|
||||
<p>You can close this window.</p>
|
||||
</body>
|
||||
</html>
|
||||
"""
|
||||
self.wfile.write(html.encode())
|
||||
|
||||
# Signal server to shut down
|
||||
service._server_done.set()
|
||||
logger.debug("OAuth callback handler signaled server to shut down")
|
||||
|
||||
return OAuthCallbackHandler
|
||||
|
||||
def _wait_for_callback(self) -> bool:
|
||||
"""
|
||||
Wait for OAuth callback via jackify:// protocol handler
|
||||
|
||||
Returns:
|
||||
True if callback received, False on timeout
|
||||
"""
|
||||
from pathlib import Path
|
||||
import time
|
||||
|
||||
callback_file = Path.home() / ".config" / "jackify" / "oauth_callback.tmp"
|
||||
|
||||
# Delete any old callback file
|
||||
if callback_file.exists():
|
||||
callback_file.unlink()
|
||||
|
||||
logger.info("Waiting for OAuth callback via jackify:// protocol")
|
||||
|
||||
# Poll for callback file with periodic user feedback
|
||||
start_time = time.time()
|
||||
last_reminder = 0
|
||||
while (time.time() - start_time) < self.CALLBACK_TIMEOUT:
|
||||
if callback_file.exists():
|
||||
try:
|
||||
# Read callback data
|
||||
lines = callback_file.read_text().strip().split('\n')
|
||||
if len(lines) >= 2:
|
||||
self._auth_code = lines[0]
|
||||
self._auth_state = lines[1]
|
||||
logger.info(f"OAuth callback received: code={self._auth_code[:10]}...")
|
||||
|
||||
# Clean up
|
||||
callback_file.unlink()
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to read callback file: {e}")
|
||||
return False
|
||||
|
||||
# Show periodic reminder about protocol handler
|
||||
elapsed = time.time() - start_time
|
||||
if elapsed - last_reminder > 30: # Every 30 seconds
|
||||
logger.info(f"Still waiting for OAuth callback... ({int(elapsed)}s elapsed)")
|
||||
if elapsed > 60:
|
||||
logger.warning(
|
||||
"If you see a blank browser tab or popup blocker, "
|
||||
"check for browser notifications asking to 'Open Jackify'"
|
||||
)
|
||||
last_reminder = elapsed
|
||||
|
||||
time.sleep(0.5) # Poll every 500ms
|
||||
|
||||
logger.error(f"OAuth callback timeout after {self.CALLBACK_TIMEOUT} seconds")
|
||||
logger.error(
|
||||
"Protocol handler may not be working. Check:\n"
|
||||
" 1. Browser asked 'Open Jackify?' and you clicked Allow\n"
|
||||
" 2. No popup blocker notifications\n"
|
||||
" 3. Desktop file exists: ~/.local/share/applications/com.jackify.app.desktop"
|
||||
)
|
||||
return False
|
||||
|
||||
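`_wait_for_callback` expects whichever process handles the `jackify://oauth/callback` URL to drop the code and state into `~/.config/jackify/oauth_callback.tmp`, one per line. A minimal sketch of the writer side, assuming it receives the full callback URL as its argument (the real handler entry point is not shown in this hunk):

```python
# Sketch of the jackify:// callback writer: parses the redirect URL and writes
# the two-line file (code, then state) that _wait_for_callback polls for.
# The entry point receiving the URL is an assumption, not shown in this diff.
import urllib.parse
from pathlib import Path

def write_oauth_callback(callback_url: str) -> None:
    params = urllib.parse.parse_qs(urllib.parse.urlparse(callback_url).query)
    code = params.get("code", [""])[0]
    state = params.get("state", [""])[0]
    out = Path.home() / ".config" / "jackify" / "oauth_callback.tmp"
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(f"{code}\n{state}\n")

write_oauth_callback("jackify://oauth/callback?code=abc123&state=xyz789")
```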
def _send_desktop_notification(self, title: str, message: str):
|
||||
"""
|
||||
Send desktop notification if available
|
||||
|
||||
Args:
|
||||
title: Notification title
|
||||
message: Notification message
|
||||
"""
|
||||
try:
|
||||
# Try notify-send (Linux)
|
||||
subprocess.run(
|
||||
['notify-send', title, message],
|
||||
check=False,
|
||||
stdout=subprocess.DEVNULL,
|
||||
stderr=subprocess.DEVNULL,
|
||||
timeout=2
|
||||
)
|
||||
except (FileNotFoundError, subprocess.TimeoutExpired):
|
||||
pass
|
||||
|
||||
def _exchange_code_for_token(
|
||||
self,
|
||||
auth_code: str,
|
||||
code_verifier: str
|
||||
) -> Optional[Dict]:
|
||||
"""
|
||||
Exchange authorization code for access token
|
||||
|
||||
Args:
|
||||
auth_code: Authorization code from callback
|
||||
code_verifier: PKCE code verifier
|
||||
|
||||
Returns:
|
||||
Token response dict or None on failure
|
||||
"""
|
||||
data = {
|
||||
'grant_type': 'authorization_code',
|
||||
'client_id': self.CLIENT_ID,
|
||||
'redirect_uri': self.REDIRECT_URI,
|
||||
'code': auth_code,
|
||||
'code_verifier': code_verifier
|
||||
}
|
||||
|
||||
try:
|
||||
response = requests.post(self.TOKEN_URL, data=data, timeout=10)
|
||||
|
||||
if response.status_code == 200:
|
||||
token_data = response.json()
|
||||
logger.info("Successfully exchanged authorization code for token")
|
||||
return token_data
|
||||
else:
|
||||
logger.error(f"Token exchange failed: {response.status_code} - {response.text}")
|
||||
return None
|
||||
|
||||
except requests.RequestException as e:
|
||||
logger.error(f"Token exchange request failed: {e}")
|
||||
return None
|
||||
|
||||
def refresh_token(self, refresh_token: str) -> Optional[Dict]:
|
||||
"""
|
||||
Refresh an access token using refresh token
|
||||
|
||||
Args:
|
||||
refresh_token: Refresh token from previous authentication
|
||||
|
||||
Returns:
|
||||
New token response dict or None on failure
|
||||
"""
|
||||
data = {
|
||||
'grant_type': 'refresh_token',
|
||||
'client_id': self.CLIENT_ID,
|
||||
'refresh_token': refresh_token
|
||||
}
|
||||
|
||||
try:
|
||||
response = requests.post(self.TOKEN_URL, data=data, timeout=10)
|
||||
|
||||
if response.status_code == 200:
|
||||
token_data = response.json()
|
||||
logger.info("Successfully refreshed access token")
|
||||
return token_data
|
||||
else:
|
||||
logger.error(f"Token refresh failed: {response.status_code} - {response.text}")
|
||||
return None
|
||||
|
||||
except requests.RequestException as e:
|
||||
logger.error(f"Token refresh request failed: {e}")
|
||||
return None
|
||||
|
||||
def get_user_info(self, access_token: str) -> Optional[Dict]:
|
||||
"""
|
||||
Get user information using access token
|
||||
|
||||
Args:
|
||||
access_token: OAuth access token
|
||||
|
||||
Returns:
|
||||
User info dict or None on failure
|
||||
"""
|
||||
headers = {
|
||||
'Authorization': f'Bearer {access_token}'
|
||||
}
|
||||
|
||||
try:
|
||||
response = requests.get(self.USERINFO_URL, headers=headers, timeout=10)
|
||||
|
||||
if response.status_code == 200:
|
||||
user_info = response.json()
|
||||
logger.info(f"Retrieved user info for: {user_info.get('name', 'unknown')}")
|
||||
return user_info
|
||||
else:
|
||||
logger.error(f"User info request failed: {response.status_code}")
|
||||
return None
|
||||
|
||||
except requests.RequestException as e:
|
||||
logger.error(f"User info request failed: {e}")
|
||||
return None
|
||||
|
||||
def authorize(self, show_browser_message_callback=None) -> Optional[Dict]:
|
||||
"""
|
||||
Perform full OAuth authorization flow
|
||||
|
||||
Args:
|
||||
show_browser_message_callback: Optional callback to display message about browser opening
|
||||
|
||||
Returns:
|
||||
Token response dict or None on failure
|
||||
"""
|
||||
logger.info("Starting Nexus OAuth authorization flow")
|
||||
|
||||
# Reset state
|
||||
self._auth_code = None
|
||||
self._auth_state = None
|
||||
self._auth_error = None
|
||||
self._server_done.clear()
|
||||
|
||||
# Generate PKCE parameters
|
||||
code_verifier, code_challenge, state = self._generate_pkce_params()
|
||||
logger.debug(f"Generated PKCE parameters (state: {state[:10]}...)")
|
||||
|
||||
# Build authorization URL
|
||||
auth_url = self._build_authorization_url(code_challenge, state)
|
||||
|
||||
# Open browser
|
||||
logger.info("Opening browser for authorisation")
|
||||
|
||||
try:
|
||||
# When running from AppImage, we need to clean the environment to avoid
|
||||
# library conflicts with system tools (xdg-open, kde-open, etc.)
|
||||
import os
|
||||
import subprocess
|
||||
|
||||
env = os.environ.copy()
|
||||
|
||||
# Remove AppImage-specific environment variables that can cause conflicts
|
||||
# These variables inject AppImage's bundled libraries into child processes
|
||||
appimage_vars = [
|
||||
'LD_LIBRARY_PATH',
|
||||
'PYTHONPATH',
|
||||
'PYTHONHOME',
|
||||
'QT_PLUGIN_PATH',
|
||||
'QML2_IMPORT_PATH',
|
||||
]
|
||||
|
||||
# Check if we're running from AppImage
|
||||
if 'APPIMAGE' in env or 'APPDIR' in env:
|
||||
logger.debug("Running from AppImage - cleaning environment for browser launch")
|
||||
for var in appimage_vars:
|
||||
if var in env:
|
||||
del env[var]
|
||||
logger.debug(f"Removed {var} from browser environment")
|
||||
|
||||
# Use Popen instead of run to avoid waiting for browser to close
|
||||
# xdg-open may not return until the browser closes, which could be never
|
||||
try:
|
||||
process = subprocess.Popen(
|
||||
['xdg-open', auth_url],
|
||||
env=env,
|
||||
stdout=subprocess.DEVNULL,
|
||||
stderr=subprocess.DEVNULL,
|
||||
start_new_session=True # Detach from parent process
|
||||
)
|
||||
# Give it a moment to fail if it's going to fail
|
||||
import time
|
||||
time.sleep(0.5)
|
||||
|
||||
# Check if process is still running or has exited successfully
|
||||
poll_result = process.poll()
|
||||
if poll_result is None:
|
||||
# Process still running - browser is opening/open
|
||||
logger.info("Browser opened successfully via xdg-open (process running)")
|
||||
browser_opened = True
|
||||
elif poll_result == 0:
|
||||
# Process exited successfully
|
||||
logger.info("Browser opened successfully via xdg-open (exit code 0)")
|
||||
browser_opened = True
|
||||
else:
|
||||
# Process exited with error
|
||||
logger.warning(f"xdg-open exited with code {poll_result}, trying webbrowser module")
|
||||
if webbrowser.open(auth_url):
|
||||
logger.info("Browser opened successfully via webbrowser module")
|
||||
browser_opened = True
|
||||
else:
|
||||
logger.warning("webbrowser.open returned False")
|
||||
browser_opened = False
|
||||
except FileNotFoundError:
|
||||
# xdg-open not found - try webbrowser module
|
||||
logger.warning("xdg-open not found, trying webbrowser module")
|
||||
if webbrowser.open(auth_url):
|
||||
logger.info("Browser opened successfully via webbrowser module")
|
||||
browser_opened = True
|
||||
else:
|
||||
logger.warning("webbrowser.open returned False")
|
||||
browser_opened = False
|
||||
except Exception as e:
|
||||
logger.error(f"Error opening browser: {e}")
|
||||
browser_opened = False
|
||||
|
||||
# Send desktop notification
|
||||
self._send_desktop_notification(
|
||||
"Jackify - Nexus Authorisation",
|
||||
"Please check your browser to authorise Jackify"
|
||||
)
|
||||
|
||||
# Show message via callback if provided (AFTER browser opens)
|
||||
if show_browser_message_callback:
|
||||
if browser_opened:
|
||||
show_browser_message_callback(
|
||||
"Browser opened for Nexus authorisation.\n\n"
|
||||
"After clicking 'Authorize', your browser may ask to\n"
|
||||
"open Jackify or show a popup blocker notification.\n\n"
|
||||
"Please click 'Open' or 'Allow' to complete authorization."
|
||||
)
|
||||
else:
|
||||
show_browser_message_callback(
|
||||
f"Could not open browser automatically.\n\n"
|
||||
f"Please open this URL manually:\n{auth_url}"
|
||||
)
|
||||
|
||||
# Wait for callback via jackify:// protocol
|
||||
if not self._wait_for_callback():
|
||||
return None
|
||||
|
||||
# Check for errors
|
||||
if self._auth_error:
|
||||
logger.error(f"Authorization failed: {self._auth_error}")
|
||||
return None
|
||||
|
||||
if not self._auth_code:
|
||||
logger.error("No authorization code received")
|
||||
return None
|
||||
|
||||
# Verify state matches
|
||||
if self._auth_state != state:
|
||||
logger.error("State mismatch - possible CSRF attack")
|
||||
return None
|
||||
|
||||
logger.info("Authorization code received, exchanging for token")
|
||||
|
||||
# Exchange code for token
|
||||
token_data = self._exchange_code_for_token(self._auth_code, code_verifier)
|
||||
|
||||
if token_data:
|
||||
logger.info("OAuth authorization flow completed successfully")
|
||||
else:
|
||||
logger.error("Failed to exchange authorization code for token")
|
||||
|
||||
return token_data
|
||||
67  jackify/backend/services/platform_detection_service.py  (new file)
@@ -0,0 +1,67 @@
|
||||
#!/usr/bin/env python3
|
||||
"""
|
||||
Platform Detection Service
|
||||
|
||||
Centralizes platform detection logic (Steam Deck, etc.) to be performed once at application startup
|
||||
and shared across all components.
|
||||
"""
|
||||
|
||||
import os
|
||||
import logging
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class PlatformDetectionService:
|
||||
"""
|
||||
Service for detecting platform-specific information once at startup
|
||||
"""
|
||||
|
||||
_instance = None
|
||||
_is_steamdeck = None
|
||||
|
||||
def __new__(cls):
|
||||
"""Singleton pattern to ensure only one instance"""
|
||||
if cls._instance is None:
|
||||
cls._instance = super().__new__(cls)
|
||||
return cls._instance
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize platform detection if not already done"""
|
||||
if self._is_steamdeck is None:
|
||||
self._detect_platform()
|
||||
|
||||
def _detect_platform(self):
|
||||
"""Perform platform detection once"""
|
||||
logger.debug("Performing platform detection...")
|
||||
|
||||
# Steam Deck detection
|
||||
self._is_steamdeck = False
|
||||
try:
|
||||
if os.path.exists('/etc/os-release'):
|
||||
with open('/etc/os-release', 'r') as f:
|
||||
content = f.read().lower()
|
||||
if 'steamdeck' in content:
|
||||
self._is_steamdeck = True
|
||||
logger.info("Steam Deck platform detected")
|
||||
else:
|
||||
logger.debug("Non-Steam Deck Linux platform detected")
|
||||
else:
|
||||
logger.debug("No /etc/os-release found - assuming non-Steam Deck platform")
|
||||
except Exception as e:
|
||||
logger.warning(f"Error detecting Steam Deck platform: {e}")
|
||||
self._is_steamdeck = False
|
||||
|
||||
logger.debug(f"Platform detection complete: is_steamdeck={self._is_steamdeck}")
|
||||
|
||||
@property
|
||||
def is_steamdeck(self) -> bool:
|
||||
"""Get Steam Deck detection result"""
|
||||
if self._is_steamdeck is None:
|
||||
self._detect_platform()
|
||||
return self._is_steamdeck
|
||||
|
||||
@classmethod
|
||||
def get_instance(cls):
|
||||
"""Get the singleton instance"""
|
||||
return cls()
|
||||
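Because the service is a singleton that caches `_is_steamdeck` after the first `/etc/os-release` read, every caller can simply ask for the shared instance; detection runs once per process. A short usage sketch:

```python
# Usage sketch for the singleton platform detector added above.
from jackify.backend.services.platform_detection_service import PlatformDetectionService

platform = PlatformDetectionService.get_instance()
if platform.is_steamdeck:
    print("Running on Steam Deck - enabling Deck-specific defaults")
else:
    print("Regular Linux desktop detected")
```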
@@ -6,8 +6,11 @@ Centralized service for detecting and managing protontricks installation across
|
||||
"""
|
||||
|
||||
import logging
|
||||
import os
|
||||
import shutil
|
||||
import subprocess
|
||||
import sys
|
||||
import importlib.util
|
||||
from typing import Optional, Tuple
|
||||
from ..handlers.protontricks_handler import ProtontricksHandler
|
||||
from ..handlers.config_handler import ConfigHandler
|
||||
@@ -44,11 +47,11 @@ class ProtontricksDetectionService:
|
||||
|
||||
def detect_protontricks(self, use_cache: bool = True) -> Tuple[bool, str, str]:
|
||||
"""
|
||||
- Detect if protontricks is installed and get installation details
+ Detect if system protontricks is installed and get installation details
|
||||
|
||||
Args:
|
||||
use_cache (bool): Whether to use cached detection result
|
||||
|
||||
|
||||
Returns:
|
||||
Tuple[bool, str, str]: (is_installed, installation_type, details_message)
|
||||
- is_installed: True if protontricks is available
|
||||
@@ -82,7 +85,7 @@ class ProtontricksDetectionService:
|
||||
details_message = "Protontricks is installed (unknown type)"
|
||||
else:
|
||||
installation_type = 'none'
|
||||
details_message = "Protontricks not found - required for Jackify functionality"
|
||||
details_message = "Protontricks not found - install via flatpak or package manager"
|
||||
|
||||
# Cache the result
|
||||
self._last_detection_result = (is_installed, installation_type, details_message)
|
||||
@@ -93,57 +96,22 @@ class ProtontricksDetectionService:
|
||||
|
||||
def _detect_without_prompts(self, handler: ProtontricksHandler) -> bool:
|
||||
"""
|
||||
Detect protontricks without user prompts or installation attempts
|
||||
|
||||
Detect system protontricks (flatpak or native) without user prompts.
|
||||
|
||||
Args:
|
||||
handler (ProtontricksHandler): Handler instance to use
|
||||
|
||||
|
||||
Returns:
|
||||
bool: True if protontricks is found
|
||||
bool: True if system protontricks is found
|
||||
"""
|
||||
# Use the handler's silent detection method
|
||||
return handler.detect_protontricks()
|
||||
|
||||
def is_bundled_mode(self) -> bool:
|
||||
"""
|
||||
DEPRECATED: Bundled protontricks no longer supported.
|
||||
Always returns False for backwards compatibility.
|
||||
"""
|
||||
import shutil
|
||||
|
||||
# Check if protontricks exists as a command
|
||||
protontricks_path_which = shutil.which("protontricks")
|
||||
|
||||
if protontricks_path_which:
|
||||
# Check if it's a flatpak wrapper
|
||||
try:
|
||||
with open(protontricks_path_which, 'r') as f:
|
||||
content = f.read()
|
||||
if "flatpak run" in content:
|
||||
logger.debug(f"Detected Protontricks is a Flatpak wrapper at {protontricks_path_which}")
|
||||
handler.which_protontricks = 'flatpak'
|
||||
# Continue to check flatpak list just to be sure
|
||||
else:
|
||||
logger.info(f"Native Protontricks found at {protontricks_path_which}")
|
||||
handler.which_protontricks = 'native'
|
||||
handler.protontricks_path = protontricks_path_which
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Error reading protontricks executable: {e}")
|
||||
|
||||
# Check if flatpak protontricks is installed
|
||||
try:
|
||||
env = handler._get_clean_subprocess_env()
|
||||
result = subprocess.run(
|
||||
["flatpak", "list"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True,
|
||||
env=env
|
||||
)
|
||||
if "com.github.Matoking.protontricks" in result.stdout:
|
||||
logger.info("Flatpak Protontricks is installed")
|
||||
handler.which_protontricks = 'flatpak'
|
||||
return True
|
||||
except FileNotFoundError:
|
||||
logger.warning("'flatpak' command not found. Cannot check for Flatpak Protontricks.")
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Error checking flatpak list: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error checking flatpak: {e}")
|
||||
|
||||
return False
|
||||
|
||||
def install_flatpak_protontricks(self) -> Tuple[bool, str]:
|
||||
@@ -164,14 +132,31 @@ class ProtontricksDetectionService:
|
||||
logger.error(error_msg)
|
||||
return False, error_msg
|
||||
|
||||
- # Install command
- install_cmd = ["flatpak", "install", "-u", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]
+ # Install command - use --user flag for user-level installation (works on Steam Deck)
+ # This avoids requiring system-wide installation permissions
+ install_cmd = ["flatpak", "install", "--user", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]
|
||||
|
||||
# Use clean environment
|
||||
env = handler._get_clean_subprocess_env()
|
||||
|
||||
# Run installation
|
||||
process = subprocess.run(install_cmd, check=True, text=True, env=env, capture_output=True)
|
||||
# Log the command for debugging
|
||||
logger.debug(f"Running flatpak install command: {' '.join(install_cmd)}")
|
||||
|
||||
# Run installation with timeout (5 minutes should be plenty)
|
||||
process = subprocess.run(
|
||||
install_cmd,
|
||||
check=True,
|
||||
text=True,
|
||||
env=env,
|
||||
capture_output=True,
|
||||
timeout=300 # 5 minute timeout
|
||||
)
|
||||
|
||||
# Log stdout/stderr for debugging (even on success, might contain useful info)
|
||||
if process.stdout:
|
||||
logger.debug(f"Flatpak install stdout: {process.stdout}")
|
||||
if process.stderr:
|
||||
logger.debug(f"Flatpak install stderr: {process.stderr}")
|
||||
|
||||
# Clear cache to force re-detection
|
||||
self._cached_detection_valid = False
|
||||
@@ -184,13 +169,41 @@ class ProtontricksDetectionService:
|
||||
error_msg = "Flatpak command not found. Please install Flatpak first."
|
||||
logger.error(error_msg)
|
||||
return False, error_msg
|
||||
except subprocess.CalledProcessError as e:
|
||||
error_msg = f"Flatpak installation failed: {e}"
|
||||
except subprocess.TimeoutExpired:
|
||||
error_msg = "Flatpak installation timed out after 5 minutes. Please check your network connection and try again."
|
||||
logger.error(error_msg)
|
||||
return False, error_msg
|
||||
except subprocess.CalledProcessError as e:
|
||||
# Include stderr in error message for better debugging
|
||||
stderr_msg = e.stderr.strip() if e.stderr else "No error details available"
|
||||
stdout_msg = e.stdout.strip() if e.stdout else ""
|
||||
|
||||
# Try to extract meaningful error from stderr
|
||||
if stderr_msg:
|
||||
# Common errors: permission denied, network issues, etc.
|
||||
if "permission" in stderr_msg.lower() or "denied" in stderr_msg.lower():
|
||||
error_msg = f"Permission denied. Try running: flatpak install --user flathub com.github.Matoking.protontricks\n\nDetails: {stderr_msg}"
|
||||
elif "network" in stderr_msg.lower() or "connection" in stderr_msg.lower():
|
||||
error_msg = f"Network error during installation. Check your internet connection.\n\nDetails: {stderr_msg}"
|
||||
elif "already installed" in stderr_msg.lower():
|
||||
# This might actually be success - clear cache and re-detect
|
||||
logger.info("Protontricks appears to already be installed (according to flatpak output)")
|
||||
self._cached_detection_valid = False
|
||||
return True, "Protontricks is already installed."
|
||||
else:
|
||||
error_msg = f"Flatpak installation failed:\n\n{stderr_msg}"
|
||||
if stdout_msg:
|
||||
error_msg += f"\n\nOutput: {stdout_msg}"
|
||||
else:
|
||||
error_msg = f"Flatpak installation failed with return code {e.returncode}."
|
||||
if stdout_msg:
|
||||
error_msg += f"\n\nOutput: {stdout_msg}"
|
||||
|
||||
logger.error(f"Flatpak installation error: {error_msg}")
|
||||
return False, error_msg
|
||||
except Exception as e:
|
||||
error_msg = f"Unexpected error during Flatpak installation: {e}"
|
||||
logger.error(error_msg)
|
||||
logger.error(error_msg, exc_info=True)
|
||||
return False, error_msg
|
||||
|
||||
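Putting the detection and installation paths together, a caller would typically detect first, offer the user-level Flatpak install if nothing is found, then re-detect with the cache bypassed. A hedged usage sketch; the module path and no-argument constructor are assumptions, only the method signatures come from the diff above:

```python
# Usage sketch for ProtontricksDetectionService; module path and constructor assumed.
from jackify.backend.services.protontricks_detection_service import ProtontricksDetectionService

service = ProtontricksDetectionService()
installed, install_type, details = service.detect_protontricks()
print(details)

if not installed:
    ok, message = service.install_flatpak_protontricks()
    print(message)
    if ok:
        # Bypass the cached result so the fresh install is picked up
        installed, install_type, details = service.detect_protontricks(use_cache=False)
```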
def get_installation_guidance(self) -> str:
|
||||
|
||||
@@ -5,46 +5,87 @@ import signal
|
||||
import psutil
|
||||
import logging
|
||||
import sys
|
||||
import shutil
|
||||
from typing import Callable, Optional
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
STRATEGY_JACKIFY = "jackify"
|
||||
STRATEGY_NAK_SIMPLE = "nak_simple"
|
||||
|
||||
|
||||
def _get_restart_strategy() -> str:
|
||||
"""Read restart strategy from config with safe fallback."""
|
||||
try:
|
||||
from jackify.backend.handlers.config_handler import ConfigHandler
|
||||
|
||||
strategy = ConfigHandler().get("steam_restart_strategy", STRATEGY_JACKIFY)
|
||||
if strategy not in (STRATEGY_JACKIFY, STRATEGY_NAK_SIMPLE):
|
||||
return STRATEGY_JACKIFY
|
||||
return strategy
|
||||
except Exception as exc: # pragma: no cover - defensive logging only
|
||||
logger.debug(f"Steam restart: Unable to read strategy from config: {exc}")
|
||||
return STRATEGY_JACKIFY
|
||||
|
||||
|
||||
def _strategy_label(strategy: str) -> str:
|
||||
if strategy == STRATEGY_NAK_SIMPLE:
|
||||
return "NaK simple restart"
|
||||
return "Jackify hardened restart"
|
||||
|
||||
def _get_clean_subprocess_env():
|
||||
"""
|
||||
Create a clean environment for subprocess calls by removing PyInstaller-specific
|
||||
environment variables that can interfere with Steam execution.
|
||||
Create a clean environment for subprocess calls by stripping bundle-specific
|
||||
environment variables (e.g., frozen AppImage remnants) that can interfere with Steam.
|
||||
|
||||
CRITICAL: Preserves all display/session variables that Steam needs for GUI:
|
||||
- DISPLAY, WAYLAND_DISPLAY, XDG_SESSION_TYPE, DBUS_SESSION_BUS_ADDRESS,
|
||||
XDG_RUNTIME_DIR, XAUTHORITY, etc.
|
||||
|
||||
Returns:
|
||||
dict: Cleaned environment dictionary
|
||||
dict: Cleaned environment dictionary with GUI variables preserved
|
||||
"""
|
||||
env = os.environ.copy()
|
||||
- pyinstaller_vars_removed = []
+ bundle_vars_removed = []
|
||||
|
||||
# Remove PyInstaller-specific environment variables
|
||||
# CRITICAL: Preserve display/session variables that Steam GUI needs
|
||||
# These MUST be kept for Steam to open its GUI window
|
||||
gui_vars_to_preserve = [
|
||||
'DISPLAY', 'WAYLAND_DISPLAY', 'XDG_SESSION_TYPE', 'DBUS_SESSION_BUS_ADDRESS',
|
||||
'XDG_RUNTIME_DIR', 'XAUTHORITY', 'XDG_CURRENT_DESKTOP', 'XDG_SESSION_DESKTOP',
|
||||
'QT_QPA_PLATFORM', 'GDK_BACKEND', 'XDG_DATA_DIRS', 'XDG_CONFIG_DIRS'
|
||||
]
|
||||
preserved_gui_vars = {}
|
||||
for var in gui_vars_to_preserve:
|
||||
if var in env:
|
||||
preserved_gui_vars[var] = env[var]
|
||||
logger.debug(f"Steam restart: Preserving GUI variable {var}={env[var][:50] if len(str(env[var])) > 50 else env[var]}")
|
||||
|
||||
# Remove bundle-specific environment variables
|
||||
if env.pop('_MEIPASS', None):
|
||||
pyinstaller_vars_removed.append('_MEIPASS')
|
||||
bundle_vars_removed.append('_MEIPASS')
|
||||
if env.pop('_MEIPASS2', None):
|
||||
pyinstaller_vars_removed.append('_MEIPASS2')
|
||||
bundle_vars_removed.append('_MEIPASS2')
|
||||
|
||||
# Clean library path variables that PyInstaller modifies (Linux/Unix)
|
||||
# Clean library path variables that frozen bundles modify (Linux/Unix)
|
||||
if 'LD_LIBRARY_PATH_ORIG' in env:
|
||||
# Restore original LD_LIBRARY_PATH if it was backed up by PyInstaller
|
||||
# Restore original LD_LIBRARY_PATH if it was backed up by the bundler
|
||||
env['LD_LIBRARY_PATH'] = env['LD_LIBRARY_PATH_ORIG']
|
||||
pyinstaller_vars_removed.append('LD_LIBRARY_PATH (restored from _ORIG)')
|
||||
bundle_vars_removed.append('LD_LIBRARY_PATH (restored from _ORIG)')
|
||||
else:
|
||||
# Remove PyInstaller-modified LD_LIBRARY_PATH
|
||||
# Remove modified LD_LIBRARY_PATH entries
|
||||
if env.pop('LD_LIBRARY_PATH', None):
|
||||
pyinstaller_vars_removed.append('LD_LIBRARY_PATH (removed)')
|
||||
bundle_vars_removed.append('LD_LIBRARY_PATH (removed)')
|
||||
|
||||
# Clean PATH of PyInstaller-specific entries
|
||||
# Clean PATH of bundle-specific entries
|
||||
if 'PATH' in env and hasattr(sys, '_MEIPASS'):
|
||||
path_entries = env['PATH'].split(os.pathsep)
|
||||
original_count = len(path_entries)
|
||||
# Remove any PATH entries that point to PyInstaller temp directory
|
||||
# Remove any PATH entries that point to the bundle's temp directory
|
||||
cleaned_path = [p for p in path_entries if not p.startswith(sys._MEIPASS)]
|
||||
env['PATH'] = os.pathsep.join(cleaned_path)
|
||||
if len(cleaned_path) < original_count:
|
||||
pyinstaller_vars_removed.append(f'PATH (removed {original_count - len(cleaned_path)} PyInstaller entries)')
|
||||
bundle_vars_removed.append(f'PATH (removed {original_count - len(cleaned_path)} bundle entries)')
|
||||
|
||||
# Clean macOS library path (if present)
|
||||
if 'DYLD_LIBRARY_PATH' in env and hasattr(sys, '_MEIPASS'):
|
||||
@@ -52,16 +93,26 @@ def _get_clean_subprocess_env():
        cleaned_dyld = [p for p in dyld_entries if not p.startswith(sys._MEIPASS)]
        if cleaned_dyld:
            env['DYLD_LIBRARY_PATH'] = os.pathsep.join(cleaned_dyld)
            pyinstaller_vars_removed.append('DYLD_LIBRARY_PATH (cleaned)')
            bundle_vars_removed.append('DYLD_LIBRARY_PATH (cleaned)')
        else:
            env.pop('DYLD_LIBRARY_PATH', None)
            pyinstaller_vars_removed.append('DYLD_LIBRARY_PATH (removed)')
            bundle_vars_removed.append('DYLD_LIBRARY_PATH (removed)')

    # Ensure GUI variables are still present (they should be, but double-check)
    for var, value in preserved_gui_vars.items():
        if var not in env:
            env[var] = value
            logger.warning(f"Steam restart: Restored GUI variable {var} that was accidentally removed")

    # Log what was cleaned for debugging
    if pyinstaller_vars_removed:
        logger.debug(f"Steam restart: Cleaned PyInstaller environment variables: {', '.join(pyinstaller_vars_removed)}")
    if bundle_vars_removed:
        logger.debug(f"Steam restart: Cleaned bundled environment variables: {', '.join(bundle_vars_removed)}")
    else:
        logger.debug("Steam restart: No PyInstaller environment variables detected (likely DEV mode)")
        logger.debug("Steam restart: No bundled environment variables detected (likely DEV mode)")

    # Log preserved GUI variables for debugging
    if preserved_gui_vars:
        logger.debug(f"Steam restart: Preserved {len(preserved_gui_vars)} GUI environment variables")

    return env
@@ -86,6 +137,31 @@ def is_steam_deck() -> bool:
        logger.debug(f"Error detecting Steam Deck: {e}")
        return False

def is_flatpak_steam() -> bool:
    """Detect if Steam is installed as a Flatpak."""
    try:
        # First check if flatpak command exists
        if not shutil.which('flatpak'):
            return False

        # Verify the app is actually installed (not just directory exists)
        result = subprocess.run(['flatpak', 'list', '--app'],
                                stdout=subprocess.PIPE,
                                stderr=subprocess.DEVNULL, # Suppress stderr to avoid error messages
                                text=True,
                                timeout=5)
        if result.returncode == 0:
            # Check for exact match - "com.valvesoftware.Steam" as a whole word
            # This prevents matching "com.valvesoftware.SteamLink" or similar
            for line in result.stdout.splitlines():
                parts = line.split()
                if parts and parts[0] == 'com.valvesoftware.Steam':
                    return True
        return False
    except Exception as e:
        logger.debug(f"Error detecting Flatpak Steam: {e}")
        return False

def get_steam_processes() -> list:
    """Return a list of psutil.Process objects for running Steam processes."""
    steam_procs = []
@@ -118,20 +194,118 @@ def wait_for_steam_exit(timeout: int = 60, check_interval: float = 0.5) -> bool:
        time.sleep(check_interval)
    return False

def start_steam() -> bool:
    """Attempt to start Steam using the exact methods from existing working logic."""
    env = _get_clean_subprocess_env()
def _start_steam_nak_style(is_steamdeck_flag=False, is_flatpak_flag=False, env_override=None) -> bool:
    """
    Start Steam using a simplified NaK-style restart (single command, no env cleanup).

    CRITICAL: Do NOT use start_new_session - Steam needs to inherit the session
    to connect to display/tray. Ensure all GUI environment variables are preserved.
    """
    env = env_override if env_override is not None else os.environ.copy()

    # Log critical GUI variables for debugging
    gui_vars = ['DISPLAY', 'WAYLAND_DISPLAY', 'XDG_SESSION_TYPE', 'DBUS_SESSION_BUS_ADDRESS', 'XDG_RUNTIME_DIR']
    for var in gui_vars:
        if var in env:
            logger.debug(f"NaK-style restart: {var}={env[var][:50] if len(str(env[var])) > 50 else env[var]}")
        else:
            logger.warning(f"NaK-style restart: {var} is NOT SET - Steam GUI may fail!")

    try:
        # Try systemd user service (Steam Deck)
        if is_steam_deck():
        if is_steamdeck_flag:
            logger.info("NaK-style restart: Steam Deck detected, restarting via systemctl.")
            subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
        elif is_flatpak_flag:
            logger.info("NaK-style restart: Flatpak Steam detected, running flatpak command.")
            subprocess.Popen(["flatpak", "run", "com.valvesoftware.Steam"],
                             env=env, stderr=subprocess.DEVNULL)
        else:
            logger.info("NaK-style restart: launching Steam directly (inheriting session for GUI).")
            # NaK uses simple "steam" command without -foreground flag
            # Do NOT use start_new_session - Steam needs session access for GUI
            # Use shell=True to ensure proper environment inheritance
            # This helps with GUI display access on some systems
            subprocess.Popen("steam", shell=True, env=env)

        time.sleep(5)
        # Use steamwebhelper for detection (actual Steam process, not steam-powerbuttond)
        check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
        if check_result.returncode == 0:
            logger.info("NaK-style restart detected running Steam process.")
            return True

        logger.warning("NaK-style restart did not detect Steam process after launch.")
        return False
    except FileNotFoundError as exc:
        logger.error(f"NaK-style restart command not found: {exc}")
        return False
    except Exception as exc:
        logger.error(f"NaK-style restart encountered an error: {exc}")
        return False


def start_steam(is_steamdeck_flag=None, is_flatpak_flag=None, env_override=None, strategy: str = STRATEGY_JACKIFY) -> bool:
    """
    Attempt to start Steam using the exact methods from existing working logic.

    Args:
        is_steamdeck_flag: Optional pre-detected Steam Deck status
        is_flatpak_flag: Optional pre-detected Flatpak Steam status
        env_override: Optional environment dictionary for subprocess calls
        strategy: Restart strategy identifier
    """
    if strategy == STRATEGY_NAK_SIMPLE:
        return _start_steam_nak_style(
            is_steamdeck_flag=is_steamdeck_flag,
            is_flatpak_flag=is_flatpak_flag,
            env_override=env_override or os.environ.copy(),
        )

    env = env_override if env_override is not None else _get_clean_subprocess_env()

    # Use provided flags or detect
    _is_steam_deck = is_steamdeck_flag if is_steamdeck_flag is not None else is_steam_deck()
    _is_flatpak = is_flatpak_flag if is_flatpak_flag is not None else is_flatpak_steam()
    logger.info(
        "Starting Steam (strategy=%s, steam_deck=%s, flatpak=%s)",
        strategy,
        _is_steam_deck,
        _is_flatpak,
    )

    try:
        # Try systemd user service (Steam Deck) - HIGHEST PRIORITY
        if _is_steam_deck:
            logger.debug("Using systemctl restart for Steam Deck.")
            subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
            return True

        # Use startup methods with only -silent flag (no -minimized or -no-browser)

        # Check if Flatpak Steam (only if not Steam Deck)
        if _is_flatpak:
            logger.info("Flatpak Steam detected - trying flatpak run command first")
            try:
                # Try without flags first (most reliable for Ubuntu/PopOS)
                logger.debug("Executing: flatpak run com.valvesoftware.Steam")
                subprocess.Popen(["flatpak", "run", "com.valvesoftware.Steam"],
                                 env=env, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
                time.sleep(7) # Give Flatpak more time to start
                # For Flatpak Steam, check for the flatpak process, not steamwebhelper
                check_result = subprocess.run(['pgrep', '-f', 'com.valvesoftware.Steam'], capture_output=True, timeout=10, env=env)
                if check_result.returncode == 0:
                    logger.info("Flatpak Steam started successfully")
                    return True
                else:
                    logger.warning("Flatpak Steam not detected after launch - will NOT fall back to prevent conflicts")
                    return False # Flatpak Steam must use flatpak command, don't fall back
            except Exception as e:
                logger.error(f"Flatpak Steam start failed: {e}")
                return False # Flatpak Steam must use flatpak command, don't fall back

        # Use startup methods with -foreground flag to ensure GUI opens
        start_methods = [
            {"name": "Popen", "cmd": ["steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "env": env}},
            {"name": "setsid", "cmd": ["setsid", "steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "env": env}},
            {"name": "nohup", "cmd": ["nohup", "steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "preexec_fn": os.setpgrp, "env": env}}
            {"name": "Popen", "cmd": ["steam", "-foreground"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "env": env}},
            {"name": "setsid", "cmd": ["setsid", "steam", "-foreground"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "env": env}},
            {"name": "nohup", "cmd": ["nohup", "steam", "-foreground"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "preexec_fn": os.setpgrp, "env": env}}
        ]

        for method in start_methods:
@@ -142,7 +316,8 @@ def start_steam() -> bool:
            if process is not None:
                logger.info(f"Initiated Steam start with {method_name}.")
                time.sleep(5) # Wait 5 seconds as in existing logic
                check_result = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
                # Use steamwebhelper for detection (actual Steam process, not steam-powerbuttond)
                check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
                if check_result.returncode == 0:
                    logger.info(f"Steam process detected after using {method_name}. Proceeding to wait phase.")
                    return True
@@ -160,106 +335,149 @@ def start_steam() -> bool:
        logger.error(f"Error starting Steam: {e}")
        return False

def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = None, timeout: int = 60) -> bool:
def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = None, timeout: int = 60, system_info=None) -> bool:
    """
    Robustly restart Steam across all distros. Returns True on success, False on failure.
    Optionally accepts a progress_callback(message: str) for UI feedback.
    Uses aggressive pkill approach for maximum reliability.

    Args:
        progress_callback: Optional callback for progress updates
        timeout: Timeout in seconds for restart operation
        system_info: Optional SystemInfo object with pre-detected Steam installation types
    """
    env = _get_clean_subprocess_env()

    shutdown_env = _get_clean_subprocess_env()
    strategy = _get_restart_strategy()
    start_env = shutdown_env if strategy == STRATEGY_JACKIFY else os.environ.copy()

    # Use cached detection from system_info if available, otherwise detect
    _is_steam_deck = system_info.is_steamdeck if system_info else is_steam_deck()
    _is_flatpak = system_info.is_flatpak_steam if system_info else is_flatpak_steam()

    def report(msg):
        logger.info(msg)
        if progress_callback:
            progress_callback(msg)

    report("Shutting down Steam...")

    # Steam Deck: Use systemctl for shutdown (special handling)
    if is_steam_deck():
    report(f"Steam restart strategy: {_strategy_label(strategy)}")

    # Steam Deck: Use systemctl for shutdown (special handling) - HIGHEST PRIORITY
    if _is_steam_deck:
        try:
            report("Steam Deck detected - using systemctl shutdown...")
            subprocess.run(['systemctl', '--user', 'stop', 'app-steam@autostart.service'],
                           timeout=15, check=False, capture_output=True, env=env)
            subprocess.run(['systemctl', '--user', 'stop', 'app-steam@autostart.service'],
                           timeout=15, check=False, capture_output=True, env=shutdown_env)
            time.sleep(2)
        except Exception as e:
            logger.debug(f"systemctl stop failed on Steam Deck: {e}")

    # Flatpak Steam: Use flatpak kill command (only if not Steam Deck)
    elif _is_flatpak:
        try:
            report("Flatpak Steam detected - stopping via flatpak...")
            subprocess.run(['flatpak', 'kill', 'com.valvesoftware.Steam'],
                           timeout=15, check=False, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=shutdown_env)
            time.sleep(2)
        except Exception as e:
            logger.debug(f"flatpak kill failed: {e}")

    # All systems: Use pkill approach (proven 15/16 test success rate)
    try:
        # Skip unreliable steam -shutdown, go straight to pkill
        pkill_result = subprocess.run(['pkill', 'steam'], timeout=15, check=False, capture_output=True, env=env)
        pkill_result = subprocess.run(['pkill', 'steam'], timeout=15, check=False, capture_output=True, env=shutdown_env)
        logger.debug(f"pkill steam result: {pkill_result.returncode}")
        time.sleep(2)

        # Check if Steam is still running
        check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
        check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=shutdown_env)
        if check_result.returncode == 0:
            # Force kill if still running
            report("Steam still running - force terminating...")
            force_result = subprocess.run(['pkill', '-9', 'steam'], timeout=15, check=False, capture_output=True, env=env)
            force_result = subprocess.run(['pkill', '-9', 'steam'], timeout=15, check=False, capture_output=True, env=shutdown_env)
            logger.debug(f"pkill -9 steam result: {force_result.returncode}")
            time.sleep(2)

            # Final check
            final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
            final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=shutdown_env)
            if final_check.returncode != 0:
                logger.info("Steam processes successfully force terminated.")
            else:
                report("Failed to terminate Steam processes.")
                return False
                # Steam might still be running, but proceed anyway - wait phase will verify
                logger.warning("Steam processes may still be running after termination attempts. Proceeding to start phase...")
                report("Steam shutdown incomplete, but proceeding...")
        else:
            logger.info("Steam processes successfully terminated.")
    except Exception as e:
        logger.error(f"Error during Steam shutdown: {e}")
        report("Failed to shut down Steam.")
        return False
        # Don't fail completely on shutdown errors - proceed to start phase
        logger.warning(f"Error during Steam shutdown: {e}. Proceeding to start phase anyway...")
        report("Steam shutdown had issues, but proceeding...")

    report("Steam closed successfully.")

    # Start Steam using platform-specific logic
    report("Starting Steam...")


    # Steam Deck: Use systemctl restart (keep existing working approach)
    if is_steam_deck():
    if _is_steam_deck:
        try:
            subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
            subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=start_env)
            logger.info("Steam Deck: Initiated systemctl restart")
        except Exception as e:
            logger.error(f"Steam Deck systemctl restart failed: {e}")
            report("Failed to restart Steam on Steam Deck.")
            return False
    else:
        # All other distros: Use proven steam -silent method
        if not start_steam():
            report("Failed to start Steam.")
            return False
        # All other distros: Use start_steam() which now uses -foreground to ensure GUI opens
        steam_started = start_steam(
            is_steamdeck_flag=_is_steam_deck,
            is_flatpak_flag=_is_flatpak,
            env_override=start_env,
            strategy=strategy,
        )
        # Even if start_steam() returns False, Steam might still be starting
        # Give it a chance by proceeding to wait phase
        if not steam_started:
            logger.warning("start_steam() returned False, but proceeding to wait phase in case Steam is starting anyway")
            report("Steam start command issued, waiting for process...")

    # Wait for Steam to fully initialize using existing logic
    # Wait for Steam to fully initialize
    # CRITICAL: Use steamwebhelper (actual Steam process), not "steam" (matches steam-powerbuttond, etc.)
    report("Waiting for Steam to fully start")
    logger.info("Waiting up to 2 minutes for Steam to fully initialize...")
    max_startup_wait = 120
    logger.info("Waiting up to 3 minutes (180 seconds) for Steam to fully initialize...")
    max_startup_wait = 180 # Increased from 120 to 180 seconds (3 minutes) for slower systems
    elapsed_wait = 0
    initial_wait_done = False
    last_status_log = 0 # Track when we last logged status

    while elapsed_wait < max_startup_wait:
        try:
            result = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
            # Log status every 30 seconds so user knows we're still waiting
            if elapsed_wait - last_status_log >= 30:
                remaining = max_startup_wait - elapsed_wait
                logger.info(f"Still waiting for Steam... ({elapsed_wait}s elapsed, {remaining}s remaining)")
                if progress_callback:
                    progress_callback(f"Waiting for Steam... ({elapsed_wait}s / {max_startup_wait}s)")
                last_status_log = elapsed_wait

            # Use steamwebhelper for detection (matches shutdown logic)
            result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=start_env)
            if result.returncode == 0:
                if not initial_wait_done:
                    logger.info("Steam process detected. Waiting additional time for full initialization...")
                    logger.info(f"Steam process detected at {elapsed_wait}s. Waiting additional time for full initialization...")
                    initial_wait_done = True
                time.sleep(5)
                elapsed_wait += 5
                if initial_wait_done and elapsed_wait >= 15:
                    final_check = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
                # Require at least 20 seconds of stable detection (increased from 15)
                if initial_wait_done and elapsed_wait >= 20:
                    final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=start_env)
                    if final_check.returncode == 0:
                        report("Steam started successfully.")
                        logger.info("Steam confirmed running after wait.")
                        logger.info(f"Steam confirmed running after {elapsed_wait}s wait.")
                        return True
                    else:
                        logger.warning("Steam process disappeared during final initialization wait.")
                        break
                        logger.warning("Steam process disappeared during final initialization wait, continuing to wait...")
                        # Don't break - continue waiting in case Steam is still starting
                        initial_wait_done = False # Reset to allow re-detection
            else:
                logger.debug(f"Steam process not yet detected. Waiting... ({elapsed_wait + 5}s)")
                time.sleep(5)
@@ -269,6 +487,7 @@ def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = No
            time.sleep(5)
            elapsed_wait += 5

    report("Steam did not start within timeout.")
    logger.error("Steam failed to start/initialize within the allowed time.")
    # Only reach here if we've waited the full duration
    report(f"Steam did not start within {max_startup_wait}s timeout.")
    logger.error(f"Steam failed to start/initialize within the allowed time ({elapsed_wait}s elapsed).")
    return False
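For orientation, here is a minimal sketch of how a caller might drive the reworked restart entry point. Only `robust_steam_restart()` and its parameters come from the diff above; the import path and the callback wiring are illustrative assumptions, not part of this change.

```python
# Minimal sketch - the module path is assumed for illustration; only
# robust_steam_restart() and its signature appear in the diff above.
from jackify.backend.handlers.steam_restart import robust_steam_restart  # hypothetical import path


def on_progress(message: str) -> None:
    # Forward restart progress to whatever UI is active (plain print here).
    print(f"[steam-restart] {message}")


# The strategy ("jackify" vs "nak_simple") is resolved internally from the
# steam_restart_strategy config key via _get_restart_strategy().
ok = robust_steam_restart(progress_callback=on_progress, timeout=60)
print("Steam restarted" if ok else "Steam restart failed")
```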
@@ -271,9 +271,9 @@ class UpdateService:
            total_size = int(response.headers.get('content-length', 0))
            downloaded_size = 0

            # Create update directory in user's home directory
            home_dir = Path.home()
            update_dir = home_dir / "Jackify" / "updates"
            # Create update directory in user's data directory
            from jackify.shared.paths import get_jackify_data_dir
            update_dir = get_jackify_data_dir() / "updates"
            update_dir.mkdir(parents=True, exist_ok=True)

            temp_file = update_dir / f"Jackify-{update_info.version}.AppImage"
@@ -345,9 +345,9 @@ class UpdateService:
            Path to helper script, or None if creation failed
        """
        try:
            # Create update directory in user's home directory
            home_dir = Path.home()
            update_dir = home_dir / "Jackify" / "updates"
            # Create update directory in user's data directory
            from jackify.shared.paths import get_jackify_data_dir
            update_dir = get_jackify_data_dir() / "updates"
            update_dir.mkdir(parents=True, exist_ok=True)

            helper_script = update_dir / "update_helper.sh"
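The directory change in these two hunks can be exercised in isolation. A small sketch using only the import the hunks themselves add; the surrounding script is illustrative:

```python
from jackify.shared.paths import get_jackify_data_dir  # import added in the hunks above

# Updates are now staged under the configured data directory instead of ~/Jackify.
update_dir = get_jackify_data_dir() / "updates"
update_dir.mkdir(parents=True, exist_ok=True)
print(f"Update artifacts will be staged in: {update_dir}")
```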
3  jackify/backend/utils/__init__.py  Normal file
@@ -0,0 +1,3 @@
"""Helper utilities for backend services."""
46  jackify/backend/utils/nexus_premium_detector.py  Normal file
@@ -0,0 +1,46 @@
"""
Utilities for detecting Nexus Premium requirement messages in engine output.
"""

from __future__ import annotations

_KEYWORD_PHRASES = (
    "buy nexus premium",
    "requires nexus premium",
    "requires a nexus premium",
    "nexus premium is required",
    "nexus premium required",
    "nexus mods premium is required",
    "manual download", # Evaluated with additional context
)


def is_non_premium_indicator(line: str) -> tuple[bool, str | None]:
    """
    Return True if the engine output line indicates a Nexus non-premium scenario.

    Args:
        line: Raw line emitted from the jackify-engine process.

    Returns:
        Tuple of (is_premium_error: bool, matched_pattern: str | None)
    """
    if not line:
        return False, None

    normalized = line.strip().lower()
    if not normalized:
        return False, None

    # Direct phrase detection
    for phrase in _KEYWORD_PHRASES[:6]:
        if phrase in normalized:
            return True, phrase

    # Manual download + Nexus URL implies premium requirement in current workflows.
    if "manual download" in normalized and ("nexusmods.com" in normalized or "nexus mods" in normalized):
        return True, "manual download + nexusmods.com"

    return False, None
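A quick sketch of how this helper might be consumed while scanning engine output. The loop and the sample lines are illustrative; only the module path and function come from the new file above.

```python
from jackify.backend.utils.nexus_premium_detector import is_non_premium_indicator

# Hypothetical sample of raw jackify-engine output lines.
engine_output = [
    "Downloading mod archive 42 of 300...",
    "ERROR: this file requires Nexus Premium for automated downloads",
]

for raw_line in engine_output:
    matched, pattern = is_non_premium_indicator(raw_line)
    if matched:
        # In the real workflow this is where a Premium-required error would be surfaced.
        print(f"Non-premium indicator matched ({pattern!r}): {raw_line}")
```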
BIN (Normal file → Executable file; binary contents not shown):
jackify/engine/Microsoft.CSharp.dll, System.Collections.Concurrent.dll, System.Collections.Immutable.dll,
System.Collections.NonGeneric.dll, System.Collections.Specialized.dll, System.Collections.dll,
System.ComponentModel.EventBasedAsync.dll, System.ComponentModel.Primitives.dll,
System.ComponentModel.TypeConverter.dll, System.ComponentModel.dll, System.Console.dll,
System.Data.Common.dll, System.Diagnostics.FileVersionInfo.dll, System.Diagnostics.Process.dll,
System.Diagnostics.StackTrace.dll, System.Diagnostics.TraceSource.dll, System.Drawing.Primitives.dll,
System.Drawing.dll, System.Formats.Asn1.dll, System.IO.Compression.Brotli.dll,
System.IO.Compression.ZipFile.dll, System.IO.Compression.dll, System.IO.FileSystem.DriveInfo.dll,
System.IO.FileSystem.Watcher.dll, System.IO.MemoryMappedFiles.dll, System.IO.Pipes.dll,
System.Linq.Expressions.dll, System.Linq.Parallel.dll, plus further unnamed jackify/engine binaries whose diffs are not shown.
Some files were not shown because too many files have changed in this diff.