Mirror of https://github.com/Omni-guides/Jackify.git
Synced 2026-01-17 11:37:01 +01:00

Initial public release v0.1.0 - Linux Wabbajack Modlist Application

Jackify provides native Linux support for Wabbajack modlist installation and management, with automated Steam integration and Proton configuration.

Key Features:
- Almost-native Linux implementation (texconv.exe runs via Proton)
- Automated Steam shortcut creation and Proton prefix management
- Both CLI and GUI interfaces, with Steam Deck optimization

Supported Games:
- Skyrim Special Edition
- Fallout 4
- Fallout New Vegas
- Oblivion, Starfield, Enderal, and other games

Technical Architecture:
- Clean separation between frontend and backend services
- Powered by jackify-engine 0.3.x for Wabbajack-matching modlist installation

This commit is contained in:

36  .gitignore  vendored  Normal file
@@ -0,0 +1,36 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# Virtual environment
venv/
.env/

# Logs
*.log
jackify-install-engine/logs/
logs/

# Build artifacts
build/
dist/
core/__pycache__/
core/modules/__pycache__/

# Jackify engine temp and output files
jackify-install-engine/temp/
jackify/engine/temp/
jackify-install-engine/*.log
jackify-install-engine/*.dds

# Editor/OS temp files
*.tmp
*.bak
*.swp
.DS_Store
Thumbs.db

# NOTE: .dll, .json, .deps.json, .runtimeconfig.json in jackify-install-engine/
# are currently tracked because it's unclear if they are required for runtime/distribution
# or are build artifacts. Do not ignore until this is confirmed.
450  CHANGELOG.md  Normal file
@@ -0,0 +1,450 @@
# Jackify Changelog

## v0.0.30 - FNV/Enderal Support and Better Modlist Selection

**Release Date:** September 5, 2025

### Major Features
- **FNV and Enderal Modlist Support**: Complete implementation for Fallout New Vegas and Enderal modlists
  - Automatic detection via nvse_loader.exe and Enderal Launcher.exe
  - Wine components routed to the vanilla game's compatdata (AppID 22380 for FNV, 933480 for Enderal)
  - Proper launch options with STEAM_COMPAT_DATA_PATH set before the Steam restart
  - Skips dxvk.conf creation for special games that use vanilla compatdata
- **Enhanced Configuration Output**: Improved visual consistency with proper section headers and timing phases

### Bug Fixes
- **Process Cleanup**: Fixed a critical bug where jackify-engine processes weren't terminated when the GUI window closed unexpectedly
  - Added a cleanup() method to the ModlistOperations class for graceful process termination
  - Enhanced cleanup_processes() methods across all GUI screens
  - Integrated with the existing main GUI cleanup infrastructure
- **Enderal Support**: Fixed Enderal modlists incorrectly showing the "unsupported game" dialog
  - Added Enderal to supported games lists across the codebase
  - Updated WabbajackParser with proper Enderal game type mappings
- **Configuration Formatting**: Resolved output formatting inconsistencies between phases

### Improved Modlist Selection Interface
- **Table Layout**: Replaced the simple list with an organized table showing Modlist Name, Download Size, Install Size, and Total Size in separate columns
- **Server-Side Filtering**: Improved performance by filtering modlists at the engine level instead of client-side
- **NSFW Checkbox**: Added a "Show NSFW" checkbox in modlist selection (defaults to hidden)
- **Enhanced Status Indicators**: Clear indicators for unavailable modlists ([DOWN] with strikethrough) and adult content ([NSFW] in red)
- **Download Size Information**: Displays all three size metrics (Download | Install | Total) to help users plan storage requirements
- **This is the first step towards a vastly improved modlist selection, with more to come soon**
- **Makes use of updated jackify-engine features, such as the --game and --show-all-sizes flags**

### Technical Improvements
- **Special Game Detection**: Detection system using multiple fallback mechanisms
- **Timing System**: Implemented phase separation with proper timing resets between the Installation and Configuration phases
- **Thread Management**: Improved cleanup of ConfigThread instances across configure screens

---
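The compatdata routing described above can be sketched as follows. Only the AppIDs and the STEAM_COMPAT_DATA_PATH launch-option pattern come from the changelog; the helper name and the default Steam root are assumptions for illustration, not Jackify's actual code.

```python
import os

# AppIDs from the changelog entry above; the helper itself is a hypothetical sketch.
VANILLA_APPIDS = {"falloutnv": 22380, "enderal": 933480}

def build_launch_options(game: str, steam_root: str = "~/.local/share/Steam") -> str:
    """Build Steam launch options that point a modlist shortcut at the
    vanilla game's Proton prefix (compatdata) before Steam is restarted."""
    appid = VANILLA_APPIDS[game]
    compatdata = os.path.join(os.path.expanduser(steam_root),
                              "steamapps", "compatdata", str(appid))
    return f'STEAM_COMPAT_DATA_PATH="{compatdata}" %command%'

print(build_launch_options("falloutnv"))
```

Setting these options on the shortcut before the restart matters because Steam only re-reads launch options when it starts.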
## v0.0.29 - STL Tidy-Up, jackify-engine 0.3.10, and Bug Fixes

**Release Date:** August 31, 2025

### Major Features
- **STL Dependency Completely Removed**: Removed the remaining steamtinkerlaunch traces
- **Cross-Distribution Compatibility**: Fixed settings menu and API link compatibility issues

### Engine Updates
- **jackify-engine 0.3.10**: Improvements to manual download handling and error clarity
- **Manual Download Detection**: Phase 1 system for detecting files requiring manual download, with user-friendly summaries
- **Enhanced Error Handling**: Clear distinction between corrupted files, download failures, and hash mismatches
- **Automatic Cleanup**: Corrupted files are automatically deleted, with clear guidance on the root cause
- **Better User Experience**: Numbered download instructions with exact URLs - I will be improving manual downloads in future, but this is a first step

### Technical Improvements
- **Compatibility Fixes**: Resolved an UnboundLocalError in the settings menu and Qt library conflicts
- **Steam Shortcut Fix**: Fixed a regression with Steam shortcut creation

---

## v0.0.28 - Conditional Path Manipulation and Engine Update

**Release Date:** August 30, 2025

### Major Features
- **Conditional Path Manipulation**: The Install a Modlist and Tuxborn Auto workflows now skip redundant path manipulation, since jackify-engine 0.3.7 outputs correct paths directly
- **Workflow Optimization**: Configure New/Existing Modlist workflows retain path manipulation for manual installations
- **Engine Architecture**: Leverages jackify-engine's improved ModOrganizer.ini path handling

### Engine Updates
- **jackify-engine 0.3.8**: Enhanced ModOrganizer.ini path generation eliminates the need for post-processing in engine-based workflows

### Technical Improvements
- **Selective Path Processing**: Added an `engine_installed` flag to ModlistContext for workflow differentiation
- **Build System**: AppImage builds now use dynamic version extraction from source

### Bug Fixes
- **Path Corruption Prevention**: Eliminates redundant path manipulation that could introduce corruption
- **Version Consistency**: Fixed AppImage builds to use correct version numbers automatically
- **Steam Restart Reliability**: Improved Steam restart success rate by using an aggressive pkill approach instead of the unreliable `steam -shutdown` command
- **Settings Menu Compatibility**: Fixed an UnboundLocalError for the 'row' variable when resource_settings is empty
- **API Link Compatibility**: Replaced QDesktopServices with subprocess-based URL opening to resolve Qt library conflicts in PyInstaller environments

---
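A minimal sketch of the conditional path manipulation described above. `ModlistContext` and the `engine_installed` flag are named in the changelog, but the field layout and the helper function here are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ModlistContext:
    """Illustrative subset of Jackify's modlist context (fields assumed)."""
    install_dir: str
    engine_installed: bool = False  # True when jackify-engine wrote ModOrganizer.ini itself

def needs_path_manipulation(ctx: ModlistContext) -> bool:
    # Engine-based workflows (Install a Modlist, Tuxborn Auto) skip the
    # redundant post-processing; manually installed modlists still need it.
    return not ctx.engine_installed

print(needs_path_manipulation(ModlistContext("/games/Tuxborn", engine_installed=True)))
```

Gating on a single flag like this is what lets both code paths coexist without the engine-installed path ever re-touching (and potentially corrupting) already-correct paths.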
## v0.0.27 - Workflow Architecture Cleanup and Bug Fixes

**Release Date:** August 27, 2025

### Bug Fixes
- **Duplicate Shortcut Creation**: Fixed automated workflows creating multiple Steam shortcuts for the same modlist
- **GUI Workflow Optimization**: Removed manual shortcut creation from the Tuxborn Installer and Configure New Modlist workflows
- **Workflow Consistency**: All three main workflows (Install Modlist, Configure New Modlist, Tuxborn Installer) now use a unified automated approach

### Code Architecture Improvements
- **Legacy Code Removal**: Eliminated the unused ModlistGUIService (42KB) that was creating maintenance overhead
- **Simplified Navigation**: ModlistTasksScreen now functions as a pure navigation menu to existing workflows
- **Clean Architecture**: Removed obsolete service imports, initializations, and cleanup methods
- **Code Quality**: Eliminated "tombstone comments" and unused service references

### Technical Details
- **Single Shortcut Creation Path**: All workflows now use `run_working_workflow()` → `create_shortcut_with_native_service()`
- **Service Layer Cleanup**: Removed the dual-codepath architecture in favor of the proven automated workflows
- **Import Optimization**: Cleaned up unused service imports across GUI components

## v0.0.26 - Distribution Optimization and STL Integration Polish

**Release Date:** August 20, 2025

### Major Improvements
- **AppImage Size Optimization**: Implemented PyInstaller-style pre-filtering for PySide6 components, reducing AppImage size from 246M to 93M (a 62% reduction)
- **STL Distribution Integration**: Fixed SteamTinkerLaunch bundling and path detection for both PyInstaller and AppImage builds
- **Build Process Optimization**: Replaced the inefficient "install everything, then delete" approach with selective component installation

### Technical Improvements
- **Pre-filtering Architecture**: Only installs the essential PySide6 modules (QtCore, QtGui, QtWidgets, QtNetwork, QtConcurrent, QtOpenGL) and their corresponding Qt libraries
- **Unified STL Path Detection**: Created a `get_stl_path()` function for consistent STL location across all environments
- **AppImage Build Optimization**: Selective copying of Qt libraries, plugins, and data files instead of a full installation
- **PyInstaller Integration**: Fixed STL bundling by using `binaries` instead of `datas` for proper execute permissions

### Bug Fixes
- **AppImage STL Path Resolution**: Fixed "STL not found" errors in the AppImage runtime environment
- **PyInstaller STL Permissions**: Resolved "permission denied" errors for the bundled STL binary
- **Build Script Paths**: Corrected the STL source path in the AppImage build script
- **Icon Display**: Re-added PyInstaller icon configuration for proper logo display

### Performance Improvements
- **AppImage Size**: Reduced from 246M to 93M (smaller than PyInstaller's 120M)
- **Build Efficiency**: Eliminated wasteful post-deletion operations in favor of pre-filtering
- **Dependency Management**: Streamlined PySide6 component selection for optimal size

---
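Unified path detection across packaging environments typically means checking how the process was launched. The changelog names `get_stl_path()`; the specific environment checks and bundled locations below are assumptions (PyInstaller onefile exposes `sys._MEIPASS`; AppImage runtimes set `APPDIR`):

```python
import os
import sys

def get_stl_path() -> str:
    """Hypothetical sketch: locate the bundled steamtinkerlaunch script in
    PyInstaller, AppImage, and plain-source environments."""
    if getattr(sys, "_MEIPASS", None):      # PyInstaller extraction directory
        base = sys._MEIPASS
    elif os.environ.get("APPDIR"):          # AppImage mount point
        base = os.environ["APPDIR"]
    else:                                   # running from source (simplified)
        base = os.getcwd()
    return os.path.join(base, "steamtinkerlaunch")
```

Centralising this decision in one function is what prevents the per-build-type "STL not found" and permission bugs the release notes describe.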
## v0.0.25 - Shortcut Creation and Configuration Automation

**Release Date:** August 19, 2025

### Major Features
- **Fully Automated Shortcut Creation**: Complete automated prefix creation workflow using SteamTinkerLaunch. Jackify can now create the required new shortcut, set its Proton version, create the prefix, and set launch options automatically. No more manual steps required.

### Technical Improvements
- **STL-Based Prefix Creation**: Replaced manual prefix setup with an automated STL workflow
- **Compatibility Tool Setting**: Direct VDF manipulation for Proton version configuration
- **Cancellation Process Management**: Enhanced Jackify-related process detection and termination - more still to do on this during the Modlist Configuration phase
- **Conflict Resolution**: Added handling of shortcut conflicts and existing installations

### Bug Fixes
- **Shortcut Installation Flag**: Fixed Steam shortcuts not appearing in the "Installed Locally" section
- **Indentation Errors**: Fixed syntax errors in the modlist parsing logic

---

## v0.0.24 - Engine Performance & Stability

**Release Date:** August 16, 2025

### Engine Updates
- **jackify-engine 0.3.2**: Performance improvements around concurrency, and a few minor bug fixes
- **Enhanced Modlist Parsing**: Improved parsing logic for better compatibility
- **Resource Management**: Better memory and resource handling

### Bug Fixes
- **Modlist Operations**: Fixed parsing errors and improved reliability
- **GUI Stability**: Resolved various UI-related issues

---

## v0.0.22 - SteamTinkerLaunch / Remove Manual Steps Investigation (Dev Build Only)

**Release Date:** August 13, 2025

### Research & Development
- **STL Integration Research**: Investigated SteamTinkerLaunch integration possibilities, with the aim of replacing the required manual steps with a fully automated process flow
- **Proton Version Setting**: Explored automated Proton compatibility tool configuration for new shortcuts
- **Shortcut Creation Methods**: Analyzed different Steam shortcut creation approaches

---

## v0.0.21 - Major Engine Update & UX Overhaul

**Release Date:** August 3, 2025

### Major Features
- **jackify-engine 0.3.0**: Complete rework of the texture conversion tools, with increased performance and improved compatibility
- **Texture Conversion Tools**: Now uses texconv.exe via Proton for texture processing, entirely invisible to the user

### User Experience
- **Streamlined API Key Management**: Implemented silent validation
- **Interface Changes**: Cleaned up some UI elements
- **Error Handling**: Improved error dialogs and user feedback

### Technical Improvements
- **Tool Integration**: New texture processing and diagnostic tools
- **Performance Optimization**: Significant speed improvements in modlist installation (7zz, texconv.exe)

---
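Running a Windows-only tool such as texconv.exe through Proton generally amounts to invoking the proton binary with `run` inside a dedicated prefix. The sketch below shows the command/environment shape; all paths and the helper name are illustrative, not Jackify's actual implementation:

```python
import os

def proton_command(proton_bin: str, prefix: str, exe: str, *args: str):
    """Hypothetical sketch: build the argv list and environment needed to run
    a Windows tool (e.g. texconv.exe) through Proton in a given prefix."""
    env = dict(
        os.environ,
        STEAM_COMPAT_DATA_PATH=prefix,  # where Proton keeps/creates the prefix
        STEAM_COMPAT_CLIENT_INSTALL_PATH=os.path.expanduser("~/.local/share/Steam"),
    )
    return [proton_bin, "run", exe, *args], env

argv, env = proton_command("/path/to/proton", "/tmp/prefix",
                           "texconv.exe", "-f", "BC7_UNORM", "in.dds")
print(argv)
```

The resulting `argv`/`env` pair would then be passed to `subprocess.run(argv, env=env)`; because this happens inside the engine, it stays invisible to the user as the note above says.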
## v0.0.20 - GUI Regression Fixes

**Release Date:** July 23, 2025

### Bug Fixes
- **Fixed console scroll behavior during error output**
  - Resolved a race condition in `_safe_append_text()` where the scroll position was checked before the text append
  - Added a scroll position tolerance (±1px) to handle rounding issues
  - Implemented auto-recovery when the user manually scrolls back to the bottom
  - Applied the fixes consistently across all GUI screens
- **Enhanced API key save functionality**
  - Added immediate visual feedback when the save checkbox is toggled
  - Implemented success/failure messages with color-coded tooltips
  - Added automatic checkbox unchecking when save operations fail
  - Improved error handling with comprehensive config write permission checks
- **Added live API key validation**
  - New "Validate" button with threaded validation against the Nexus API endpoint
  - Visual feedback for validation results (success/error states)
  - Enhanced security with masked logging and no plain-text API key exposure
  - Maintains the existing base64 encoding for stored API keys

### Engine Updates
- **jackify-engine 0.2.11**: Performance improvements and bug fixes

#### Fixed
- **Accurate DDS Texture Format Detection and Skip Logic**
  - Replaced manual DDS header parsing with BCnEncoder-based format detection for improved accuracy and parity with upstream Wabbajack
  - Added logic to skip recompression of B8G8R8X8_UNORM textures, copying them unchanged instead (hopefully matching upstream behavior)
  - Massive performance improvement: files that previously took 15+ minutes to process now copy in seconds
  - Fixes a major texture processing performance bottleneck in ESP-embedded textures

##### Technical Details
- The B8G8R8X8_UNORM format (88) is not supported by upstream Wabbajack's ToCompressionFormat; upstream appears to skip these files entirely
- BCnEncoder-based format detection is now used for all DDS files, ensuring correct handling and skipping of unsupported formats
- Files detected as B8G8R8X8_UNORM now trigger copy logic instead of recompression, preventing unnecessary CPU-intensive work
- Root cause: the previous logic attempted BC7 recompression on unsupported texture formats, causing major slowdowns

---
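The scroll fix above boils down to deciding "was the user at the bottom?" *before* appending, with a small tolerance for pixel rounding, then re-scrolling only if so. A framework-agnostic sketch (the real code works on a Qt scrollbar; these names are illustrative):

```python
def at_bottom(value: int, maximum: int, tolerance: int = 1) -> bool:
    """True when the scrollbar is within `tolerance` px of the bottom,
    absorbing the rounding issues the changelog mentions."""
    return maximum - value <= tolerance

def safe_append_text(scroll_value: int, scroll_max: int, append) -> bool:
    # Capture the position BEFORE appending; the original race condition
    # came from checking it after the append had already moved the maximum.
    follow = at_bottom(scroll_value, scroll_max)
    append()
    return follow  # caller re-scrolls to the bottom only when this is True
```

Returning `follow` rather than scrolling inside the helper also gives the "auto-recovery" behaviour for free: as soon as the user drags back within tolerance of the bottom, following resumes.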
## v0.0.19 - Resource Management

**Release Date:** July 20, 2025

### New Features
- **Resource Management System**: Resource tracking and management
- **jackify-engine 0.2.11**: Performance and stability improvements

### Technical Improvements
- **Memory Management**: Better resource allocation and cleanup
- **Process Monitoring**: Enhanced process tracking and management

## v0.0.18 - Build System Improvements

**Release Date:** July 17, 2025

### Technical Improvements
- **Fixed PyInstaller temp directory inclusion issue**
  - Added a custom PyInstaller hook to exclude temporary files from the build
  - Prevents build failures when Jackify is running during the build process
  - Added automatic temp directory cleanup in the build script
  - Updated .gitignore to exclude the temp directory from version control

## v0.0.17 - Settings Dialog & UI Improvements

**Release Date:** July 17, 2025

### User Experience
- **Streamlined Resource Limits Interface**
  - Removed the "Max Throughput" column
  - Added an inline "Multithreading (Experimental)" checkbox for the File Extractor resource
- **Multithreading Configuration**
  - Added an experimental multithreading option for 7-Zip file extraction
  - Saves `_7zzMultiThread: "on"` to resource_settings.json when enabled
  - Default state is disabled (off)

### Technical Improvements
- **UI Scaling Implementation**
  - Fixed vertical scaling issues on Steam Deck (1280x800) and low-resolution displays
  - Implemented form-priority dynamic scaling across all 4 GUI screens
  - Form elements now maintain a minimum 280px height to ensure full visibility
  - The console now dynamically shrinks to accommodate form needs instead of vice versa
  - Added resize event handling for real-time scaling adjustments
- **API Key URL Regression Fix**
  - Fixed API key acquisition URLs not opening the browser on Linux systems
  - Replaced unreliable automatic external link handling with manual QDesktopServices integration
  - Affects both the Install Modlist and Tuxborn Auto workflows

### Engine Updates
- **jackify-engine 0.2.7**: Performance improvements and bug fixes

#### Fixed
- **Excessive logging when resuming aborted installations**
  - Suppressed `DirectoryNotFoundException` warnings when re-running on previously aborted install directories
  - Moved these warnings to debug level while preserving retry behavior
  - Reduces noise when resuming installations without affecting functionality

#### Changed
- **Texture compression performance optimization**
  - Reduced BC7, BC6H, and BC5 compression quality settings from aggressive max quality to balanced levels
  - Disabled channel weighting and adjusted compression speed settings
  - Matches upstream Wabbajack's balanced compression approach for significantly faster texture processing
  - Addresses extremely long compression times for large texture files

## v0.0.16 - Steam Restart & User Experience

**Release Date:** July 16, 2025

### Bug Fixes
- **Fixed Steam interface not opening after restart in PyInstaller DIST mode**
  - Added comprehensive environment cleaning to `steam_restart_service.py`
  - Prevents PyInstaller environment variables from contaminating Steam subprocess calls
  - Resolves the issue where the Steam interface wouldn't open after restart in the three workflows that require Steam restarts

### User Experience
- **Reduced popup timeout from 5 seconds to 3 seconds**
  - Updated success dialogs and the message service for faster user interaction
  - Affects OK/Cancel buttons on confirmation popups
- **Fixed Install Modlist form reset issue**
  - The form no longer resets when users select a game type/modlist after filling out fields
  - Preserves user input during the modlist selection workflow

### Workflow Improvements
- **Fixed misleading cancellation messages**
  - Users who cancel workflows now see a proper cancellation message instead of "Install Failed"
  - Added cancellation detection logic similar to the existing Tuxborn installer

### Security
- **Added SSL certificate verification to all HTTP requests**
  - All `requests.get()` calls now include the `verify=True` parameter
  - Improves the security of downloads from GitHub APIs and other external sources
  - Zero impact on functionality; pure security hardening
- **Removed hardcoded test paths**
  - Cleaned up development test paths from `wabbajack_handler.py`
  - Improved code hygiene and security posture

### Technical Improvements
- Enhanced environment variable cleaning in the Steam restart service
- Improved error handling and user feedback in workflow cancellation
- Consolidated timeout handling across GUI components

---
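The PyInstaller contamination mentioned above happens because the onefile bootloader rewrites variables like `LD_LIBRARY_PATH` to point at its own extracted libraries, which then break child processes such as Steam. A sketch of the cleaning idea; the exact variable list is an assumption, not Jackify's actual code:

```python
import os

# Variables the PyInstaller bootloader commonly sets or rewrites; treat this
# list as an illustrative assumption rather than an exhaustive one.
PYINSTALLER_VARS = ("LD_LIBRARY_PATH", "LD_LIBRARY_PATH_ORIG", "_MEIPASS2")

def clean_subprocess_env() -> dict:
    """Return a copy of os.environ that is safe to pass to Steam subprocesses."""
    env = dict(os.environ)
    for var in PYINSTALLER_VARS:
        env.pop(var, None)
    # The bootloader saves the pre-launch value here; restore it if present.
    if "LD_LIBRARY_PATH_ORIG" in os.environ:
        env["LD_LIBRARY_PATH"] = os.environ["LD_LIBRARY_PATH_ORIG"]
    return env
```

The cleaned dictionary would be passed as the `env=` argument of the `subprocess` call that relaunches Steam.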
## v0.0.15 - GUI Workflow Logging Refactor

**Release Date:** July 15, 2025

### Major Fixes
- **GUI Workflow Logging Refactor**: Complete overhaul of logging behavior across all 4 GUI workflows
  - Fixed premature log rotation that was creating .1 files before workflows started
  - Moved log rotation from screen initialization to workflow execution start
  - Eliminated early log file creation in the Install Modlist and Configure Existing workflows
  - All workflows now have proper log rotation timing and clean startup behavior

### Technical Improvements
- **Backend Service Integration**: Removed the remaining CLI subprocess calls from the Configure New Modlist workflow
  - Replaced CLI-based configuration with direct backend service calls
  - Unified manual steps validation across all workflows using backend services
  - Improved consistency between the Tuxborn Automatic and Configure New Modlist workflows

### Technical Details
- **Thread Safety**: Preserved thread cleanup improvements in all workflows
- **Error Handling**: Improved error handling and user feedback during workflow failures
- **Code Consistency**: Unified patterns across all 4 workflows for maintainability

This release completes the logging refactor that was blocking the development workflow.

## v0.0.14 - User Experience & Steam Restart

**Release Date:** July 9, 2025

### User Experience
- **Focus-stealing protection**: All user-facing dialogs (info, warnings, confirmations) now use the new MessageService with safety levels (LOW, MEDIUM, HIGH) to prevent focus-stealing popups from triggering accidental confirmations.
- **Steam restart workflow improvements**: Unified and hardened the logic for restarting Steam and handling post-restart manual steps in all workflows (Tuxborn Installer, Install Modlist, Configure New/Existing Modlist).

## v0.0.13 - Directory Safety & Configuration

**Release Date:** July 8, 2025

### New Features
- **Directory Safety System**: Prevents installation to dangerous system directories; adds install directory markers for validation.
- **Warning Dialogs**: Custom Jackify-themed warning dialogs for unsafe operations.

### Bug Fixes
- Fixed 'TuxbornInstallerScreen' object has no attribute 'context' errors.

### Technical Improvements
- **Configuration Persistence**: Debug mode and other settings persist across sessions.
- Upgraded jackify-engine to 0.2.6, detailed under Engine Updates below.

### Engine Updates
- **jackify-engine 0.2.6**: Performance improvements and enhanced user feedback

#### Added
- **Enhanced user feedback during long-running operations**
  - Single-line progress updates for the extraction, texture conversion, and BSA building phases
  - Real-time progress counters showing current/total items (e.g., "Converting Textures (123/456): filename.dds")
  - Smart filename truncation to prevent line wrapping in narrow console windows
  - Carriage return-based progress display for cleaner console output

#### Fixed
- **Temp directory cleanup after installation**
  - Added explicit disposal of the temporary file manager to ensure the `__temp__` directory is properly cleaned up
  - Prevents accumulation of temporary files in modlist install directories
  - Cleanup occurs whether the installation succeeds or fails

#### Changed
- **Console output improvements**
  - Progress updates now use a single-line format with carriage returns for a better user experience
  - Maintains compatibility with Jackify's output parsing system
  - Preserves all existing logging and error reporting functionality
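The single-line progress display described above can be sketched as follows. The format comes from the changelog's own example; the truncation rule (keep the tail of the filename, prefix with "...") is an assumption for illustration:

```python
def progress_line(phase: str, current: int, total: int,
                  filename: str, width: int = 80) -> str:
    """Build a carriage-return progress line like
    'Converting Textures (123/456): filename.dds', truncating the filename
    so the line never wraps in a console `width` columns wide."""
    prefix = f"{phase} ({current}/{total}): "
    room = width - len(prefix) - 1
    if len(filename) > room:
        # Keep the tail: the basename is usually the informative part.
        filename = "..." + filename[-(room - 3):]
    return "\r" + prefix + filename

print(progress_line("Converting Textures", 123, 456,
                    "textures/armor/iron/ironcuirass_n.dds"))
```

Writing the line with a leading `\r` and no newline makes each update overwrite the previous one, which is exactly why a consumer like Jackify's output parser must tolerate carriage returns as the compatibility note mentions.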
## v0.0.12 - Success Dialog & UI Improvements

**Release Date:** July 7, 2025

### New Features
* Redesigned the workflow completion ("Success") dialog.
* Added an application icon, bundled with the PyInstaller build. Assets are now stored in a dedicated assets/ directory.
* Added a pkill fallback for instances where `steam -shutdown` wasn't working.

### User Experience
* All main workflows (Install, Tuxborn Auto, Configure New, Configure Existing) now use the updated SuccessDialog and display the correct game type.
* Improved field validation and error handling before starting installs.
* Changed the text on the popup shown when a user cancels a workflow; it previously reused the Failed Install dialog.
* Upgraded jackify-engine to the latest build (v0.2.5).
* Temporarily hid the non-primary workflow functions from both the GUI and CLI frontends.

### Bug Fixes
* Fixed the missing app icon in PyInstaller builds by updating the spec file and asset paths.
* Scroll bar behaviour should be much better now.

## v0.0.11 - Configurable Directories & Game Support

**Release Date:** July 4, 2025

### New Features
- **Configurable Base Directories**: Users can now customize the default install and download base directories via `~/.config/jackify/config.json`
  - `modlist_install_base_dir`: defaults to `/home/user/Games`
  - `modlist_downloads_base_dir`: defaults to `/home/user/Games/Modlist_Downloads`
- **Enhanced Game Type Support**: Added support for new game types
  - Starfield
  - Oblivion Remastered
  - Improved game type detection and categorization
- **Unsupported Game Handling**: Clear warnings for unsupported games (e.g., Cyberpunk 2077)
  - GUI: Pop-up alert with user confirmation
  - CLI: Matching warning message with user confirmation
- **Simplified Directory Autofill**:
  - Clean default paths without guessing or appending modlist names
  - Consistent behavior across all configuration-based screens

### Technical Improvements
- **DXVK Configuration**: Fixed malformed dxvk.conf contents
  - Now generates: `dxvk.enableGraphicsPipelineLibrary = False`
- **UI/UX Improvements**:
  - Removed redundant "Return to Main Menu" buttons
  - Improved dialog spacing, button order, and color consistency

### Bug Fixes
- **Game Type Filtering**: Fixed modlists appearing in multiple categories by improving the matching logic
- **CLI/GUI Parity**: Unified backend service usage for consistent behavior across interfaces
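Loading those configurable base directories might look like the sketch below. The key names and default paths come from the changelog (expanded per-user rather than hardcoding `/home/user`); the loader itself is an illustrative assumption:

```python
import json
import os

# Defaults named in the changelog, expanded for the current user.
DEFAULTS = {
    "modlist_install_base_dir": os.path.expanduser("~/Games"),
    "modlist_downloads_base_dir": os.path.expanduser("~/Games/Modlist_Downloads"),
}

def load_base_dirs(path: str = "~/.config/jackify/config.json") -> dict:
    """Merge user overrides from config.json over the built-in defaults."""
    config = dict(DEFAULTS)
    try:
        with open(os.path.expanduser(path)) as f:
            config.update(json.load(f))
    except FileNotFoundError:
        pass  # no user config yet; keep the defaults
    return config
```

Merging over a defaults dictionary means a user who overrides only one key still gets the standard value for the other.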
## v0.0.10 - Previous Development Version

**Release Date:** Early Development

- Core CLI features implemented for running Wabbajack modlists on Linux.
- Initial support for Steam Deck and native Linux environments.
- Modular handler architecture for extensibility.

## v0.0.09 and Earlier

See commit history for previous versions.
120  README.md  Normal file
@@ -0,0 +1,120 @@
# Jackify

**Native Linux modlist installer and manager for Wabbajack modlists**

Jackify enables seamless installation and configuration of Wabbajack modlists on Linux systems, providing automated Steam integration and Proton prefix management without requiring Windows dependencies.

## Features

- **Native Linux Support**: Pure Linux implementation with no Wine/Windows dependencies for core operations
- **Automated Steam Integration**: Automatic Steam shortcut creation with proper Proton configuration
- **Comprehensive Modlist Support**: Support for Skyrim, Fallout 4, Fallout New Vegas, Oblivion, Starfield, and more
- **Professional Interface**: Both CLI and GUI interfaces with enhanced modlist selection and metadata display
- **Steam Deck Optimized**: Full Steam Deck support with a controller-friendly interface
- **Advanced Filtering**: Smart categorization with NSFW filtering and game-specific organization

## Quick Start

### Requirements

- Linux system (Steam Deck supported)
- Steam installed and configured
- Python 3.8+ (for source installation)

### Installation

#### AppImage (Recommended)
```bash
# Download latest release
wget https://github.com/your-repo/jackify/releases/latest/jackify.AppImage
chmod +x jackify.AppImage
./jackify.AppImage
```

#### From Source
```bash
git clone https://github.com/your-repo/jackify.git
cd jackify/src
pip install -r requirements.txt
python -m jackify.frontends.gui  # GUI mode
python -m jackify.frontends.cli  # CLI mode
```

## Usage

### GUI Mode
Launch the GUI and navigate through the intuitive interface:
1. Select "Modlist Tasks" → "Install a Modlist"
2. Choose your game type and modlist
3. Configure installation and download directories
4. Enter your Nexus API key
5. Let Jackify handle the rest

### CLI Mode
```bash
python -m jackify.frontends.cli
```
Follow the interactive prompts to configure and install modlists.

## Supported Games

- **Skyrim Special Edition** (88+ modlists)
- **Fallout 4** (22+ modlists)
- **Fallout New Vegas** (13+ modlists)
- **Oblivion**
- **Starfield**
- **Enderal**
- **Other Games** (Cyberpunk 2077, Baldur's Gate 3, and more)

## Architecture

Jackify follows a clean separation between frontend and backend:

- **Backend Services**: Pure business logic with no UI dependencies
- **Frontend Interfaces**: CLI and GUI implementations using a shared backend
- **Native Engine**: Powered by jackify-engine for optimal performance
- **Steam Integration**: Direct Steam shortcuts.vdf manipulation

## Configuration

Configuration files are stored in:
- **Linux**: `~/.config/jackify/`
- **Steam Deck**: `~/.config/jackify/`

## Development

### Building from Source
```bash
cd src
pip install -r requirements-packaging.txt
pyinstaller jackify.spec
```

### Running Tests
```bash
python -m pytest tests/
```

## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Contributing

Contributions are welcome! Please read our contributing guidelines and submit pull requests for any improvements.

## Support

- **Issues**: Report bugs and request features via GitHub Issues
- **Documentation**: See the Wiki for detailed guides
- **Community**: Join our community discussions

## Acknowledgments

- Wabbajack team for the modlist ecosystem
- jackify-engine developers
- Steam Deck and Linux gaming community

---

**Jackify** - Bringing professional modlist management to Linux
BIN
assets/JackifyLogo_256.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 36 KiB
6
docs/SessionEnd.md
Normal file
@@ -0,0 +1,6 @@
Session End: 2025-07-06

- v0.0.12 released: SuccessDialog redesign, accurate workflow timing, robust app icon, game type display, non-modal dialog, asset best practices, and more.
- All changes merged to master and pushed to remote.
- See CHANGELOG.md for full details.
106
hook-PySide6.py
Normal file
@@ -0,0 +1,106 @@
# Custom PyInstaller hook to optimize PySide6 by removing unused components
# This significantly reduces build size by excluding unnecessary Qt modules and tools

from PyInstaller.utils.hooks import collect_data_files, collect_submodules
import os
import shutil
from pathlib import Path


def hook(hook_api):
    """
    PySide6 optimization hook - removes unused Qt modules and development tools
    to reduce build size and improve startup performance.
    """

    # Get the PySide6 data files
    pyside_datas = collect_data_files('PySide6')

    # Filter out unnecessary components
    filtered_datas = []

    for src, dst in pyside_datas:
        # Skip development tools and scripts
        if any(skip in src for skip in [
            '/scripts/',
            '/assistant/',
            '/designer/',
            '/linguist/',
            '/lupdate',
            '/lrelease',
            '/qmllint',
            '/qmlformat',
            '/qmlls',
            '/qsb',
            '/svgtoqml',
            '/balsam',
            '/balsamui'
        ]):
            continue

        # Skip unused Qt modules (keep only what Jackify uses)
        if any(skip in src for skip in [
            'Qt3D',
            'QtBluetooth',
            'QtCharts',
            'QtConcurrent',  # Currently excluded - re-add if it turns out to be needed
            'QtDataVisualization',
            'QtDBus',
            'QtDesigner',
            'QtGraphs',
            'QtHelp',
            'QtHttpServer',
            'QtLocation',
            'QtMultimedia',
            'QtNfc',
            'QtOpenGL',  # Currently excluded - re-add if QtWidgets turns out to need it
            'QtPdf',
            'QtPositioning',
            'QtPrintSupport',
            'QtQml',
            'QtQuick',
            'QtRemoteObjects',
            'QtScxml',
            'QtSensors',
            'QtSerial',
            'QtSpatialAudio',
            'QtSql',
            'QtStateMachine',
            'QtSvg',
            'QtTest',
            'QtTextToSpeech',
            'QtWeb',
            'QtXml',
            'QtNetworkAuth',
            'QtUiTools'
        ]):
            continue

        # Keep core modules that Jackify uses
        if any(keep in src for keep in [
            'QtCore',
            'QtGui',
            'QtWidgets',
            'QtNetwork'
        ]):
            filtered_datas.append((src, dst))
            continue

    # Add the filtered data files
    hook_api.add_datas(filtered_datas)

    # Also filter submodules to exclude unused ones
    pyside_modules = collect_submodules('PySide6')
    filtered_modules = []

    for module in pyside_modules:
        # Keep only core modules
        if any(keep in module for keep in [
            'PySide6.QtCore',
            'PySide6.QtGui',
            'PySide6.QtWidgets',
            'PySide6.QtNetwork'
        ]):
            filtered_modules.append(module)

    # Add the filtered modules
    hook_api.add_imports(*filtered_modules)
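The exclude-then-allowlist ordering above (drop anything matching a skip marker first, then keep only allowlisted entries, silently discarding the rest) can be sketched as a pure function for quick testing. The marker lists and sample paths below are illustrative, not the hook's real lists:

```python
def filter_qt_datas(datas, skip_markers, keep_markers):
    """Drop entries matching any skip marker, then keep only allowlisted ones."""
    kept = []
    for src, dst in datas:
        if any(marker in src for marker in skip_markers):
            continue  # excluded tool or module
        if any(marker in src for marker in keep_markers):
            kept.append((src, dst))  # explicitly allowlisted
    return kept

sample = [
    ('PySide6/Qt/plugins/QtCore/x.so', 'QtCore/x.so'),
    ('PySide6/assistant/help.bin', 'assistant/help.bin'),
    ('PySide6/Qt/QtBluetooth/bt.so', 'QtBluetooth/bt.so'),
]
print(filter_qt_datas(sample, ['/assistant/', 'QtBluetooth'], ['QtCore']))
# → [('PySide6/Qt/plugins/QtCore/x.so', 'QtCore/x.so')]
```

Note that an entry matching neither list is dropped, which is why the comments in the hook warn about modules that "might be needed": anything not explicitly allowlisted never reaches the build.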
17
hook-jackify.py
Normal file
@@ -0,0 +1,17 @@
# Custom hook to exclude temp directory from Jackify engine data collection
from PyInstaller.utils.hooks import collect_data_files
import os


def hook(hook_api):
    # Get the original data files for jackify.engine
    datas = collect_data_files('jackify.engine')

    # Filter out any files in the temp directory
    filtered_datas = []
    for src, dst in datas:
        # Skip any files that contain 'temp' in their path
        if 'temp' not in src:
            filtered_datas.append((src, dst))

    # Set the filtered data files
    hook_api.add_datas(filtered_datas)
49
jackify.spec
Normal file
@@ -0,0 +1,49 @@
# -*- mode: python ; coding: utf-8 -*-


a = Analysis(
    ['jackify/frontends/gui/__main__.py'],
    pathex=[],
    binaries=[],
    datas=[('jackify/engine', 'jackify/engine'), ('jackify/shared', 'jackify/shared'), ('assets/JackifyLogo_256.png', 'assets')],
    hiddenimports=[
        'PySide6.QtCore', 'PySide6.QtGui', 'PySide6.QtWidgets',
        'jackify.backend.core', 'jackify.backend.handlers', 'jackify.backend.services', 'jackify.backend.models',
        'jackify.backend.handlers.resolution_handler', 'jackify.backend.handlers.modlist_handler',
        'jackify.backend.handlers.menu_handler', 'jackify.backend.handlers.path_handler',
        'jackify.frontends.cli', 'jackify.frontends.cli.main',
        'jackify.frontends.cli.menus', 'jackify.frontends.cli.menus.main_menu',
        'jackify.frontends.cli.menus.tuxborn_menu', 'jackify.frontends.cli.menus.wabbajack_menu',
        'jackify.frontends.gui.widgets.unsupported_game_dialog',
        'jackify.shared.paths', 'jackify.shared.ui_utils'
    ],
    hookspath=['.'],
    hooksconfig={},
    runtime_hooks=[],
    excludes=['tkinter', 'matplotlib', 'numpy', 'scipy', 'pandas', 'IPython', 'jupyter', 'test', 'tests', 'unittest'],
    noarchive=False,
    optimize=0,
)
pyz = PYZ(a.pure)

exe = EXE(
    pyz,
    a.scripts,
    a.binaries,
    a.datas,
    [],
    name='jackify',
    debug=False,
    bootloader_ignore_signals=False,
    strip=False,
    upx=True,
    upx_exclude=[],
    runtime_tmpdir=None,
    console=True,
    disable_windowed_traceback=False,
    argv_emulation=False,
    target_arch=None,
    codesign_identity=None,
    entitlements_file=None,
    icon='assets/JackifyLogo_256.png',
)
8
jackify/__init__.py
Normal file
@@ -0,0 +1,8 @@
"""
Jackify - A tool for running Wabbajack modlists on Linux

This package provides both CLI and GUI interfaces for managing
Wabbajack modlists natively on Linux systems.
"""

__version__ = "0.0.30"
21
jackify/__main__.py
Normal file
@@ -0,0 +1,21 @@
#!/usr/bin/env python3
"""
Main entry point for the Jackify package.
Launches the GUI by default.
"""

import sys
import os

# Add the src directory to the Python path
src_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if src_dir not in sys.path:
    sys.path.insert(0, src_dir)


def main():
    """Main entry point - launch GUI by default"""
    from jackify.frontends.gui.main import main as gui_main
    return gui_main()


if __name__ == "__main__":
    main()
6
jackify/backend/__init__.py
Normal file
@@ -0,0 +1,6 @@
"""
Jackify Backend

Pure business logic layer with no user interaction.
Provides services and handlers for modlist management.
"""
5
jackify/backend/core/__init__.py
Normal file
@@ -0,0 +1,5 @@
"""
Backend Core Operations

High-level business logic operations.
"""
1479
jackify/backend/core/modlist_operations.py
Normal file
File diff suppressed because it is too large
Load Diff
5
jackify/backend/handlers/__init__.py
Normal file
@@ -0,0 +1,5 @@
"""
Backend Handlers

Business logic handlers with UI interactions removed.
"""
94
jackify/backend/handlers/completers.py
Normal file
@@ -0,0 +1,94 @@
"""
completers.py
Reusable tab completion functions for the Jackify CLI, including bash-like path completion.
"""

import os
import readline
import logging

# Get a logger for this module
completer_logger = logging.getLogger(__name__)

# Set the level for this logger; messages are handled by handlers configured
# in the main application (e.g., via LoggingHandler).
completer_logger.setLevel(logging.INFO)

# Ensure messages DO NOT propagate to the root logger's console handler by default.
# A dedicated file handler is added in jackify-cli.py.
completer_logger.propagate = False

# IMPORTANT: Do NOT include '/' in the completer delimiters!
# Use: readline.set_completer_delims(' \t\n;')


def path_completer(text, state):
    """
    Bash-like pathname completer for readline.
    Args:
        text: The text to complete (provided by readline, e.g., "/foo/b" or "b" or "")
        state: The state index (0 for the first match, 1 for the second, etc.)
    Returns:
        The matching completion string that should replace 'text', or None.
    """
    line_buffer = readline.get_line_buffer()
    begidx = readline.get_begidx()
    endidx = readline.get_endidx()

    effective_text_for_completion = line_buffer[:endidx]
    expanded_effective_text = os.path.expanduser(os.path.expandvars(effective_text_for_completion))

    # Special case: if text is an exact directory (no trailing slash), complete to text + '/'
    if os.path.isdir(text) and not text.endswith(os.sep):
        if state == 0:
            return text + os.sep
        else:
            return None

    # Normal completion logic
    if os.path.isdir(expanded_effective_text):
        disk_basedir = expanded_effective_text
        disk_item_prefix = ""
    else:
        disk_basedir = os.path.dirname(expanded_effective_text)
        disk_item_prefix = os.path.basename(expanded_effective_text)
        if not disk_basedir:
            disk_basedir = "."

    matched_item_names_on_disk = []
    try:
        if not os.path.exists(disk_basedir) or not os.path.isdir(disk_basedir):
            completer_logger.warning(f"Disk basedir '{disk_basedir}' non-existent or not a dir. No disk matches.")
        else:
            dir_contents = os.listdir(disk_basedir)
            for item_name in dir_contents:
                if item_name.startswith(disk_item_prefix):
                    matched_item_names_on_disk.append(item_name)
    except OSError as e:
        completer_logger.error(f"OSError listing '{disk_basedir}': {e}")

    final_match_strings_for_readline = []
    text_dir_part = os.path.dirname(text)
    # If text is a directory with a trailing slash, use it as the base for completions
    if os.path.isdir(text) and text.endswith(os.sep):
        base_path = text
    elif os.path.isdir(text):
        base_path = text + os.sep
    else:
        base_path = text_dir_part + os.sep if text_dir_part else ""

    for item_name in matched_item_names_on_disk:
        result_str_for_readline = os.path.join(base_path, item_name)
        actual_disk_path_of_item = os.path.join(disk_basedir, item_name)
        if os.path.isdir(actual_disk_path_of_item):
            result_str_for_readline += os.sep
        final_match_strings_for_readline.append(result_str_for_readline)
    final_match_strings_for_readline.sort()
    try:
        match = final_match_strings_for_readline[state]
        completer_logger.debug(f"Returning match for state {state}: '{match}'")
        return match
    except IndexError:
        return None
    except Exception as e:
        completer_logger.exception(f"Unexpected error retrieving match for state {state}: {e}")
        return None
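Following the delimiter warning in the module comments, a minimal sketch of wiring a path completer into readline might look like the following (the helper name is illustrative; import `path_completer` from wherever completers.py resolves to in your tree):

```python
import readline

def setup_path_completion(completer):
    """Register a bash-like path completer with readline."""
    # '/' must NOT be a delimiter, or readline restarts completion at
    # every path segment instead of completing the whole path.
    readline.set_completer_delims(' \t\n;')
    readline.set_completer(completer)
    readline.parse_and_bind('tab: complete')

# e.g. setup_path_completion(path_completer) before calling input()
```

After registration, readline calls the completer repeatedly with increasing `state` values until it returns `None`, which is why `path_completer` indexes its sorted match list by `state`.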
484
jackify/backend/handlers/config_handler.py
Normal file
@@ -0,0 +1,484 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Configuration Handler Module
Handles application settings and configuration
"""

import os
import json
import logging
import shutil
import re
import base64
from pathlib import Path

# Initialize logger
logger = logging.getLogger(__name__)


class ConfigHandler:
    """
    Handles application configuration and settings
    """

    def __init__(self):
        """Initialize configuration handler with default settings"""
        self.config_dir = os.path.expanduser("~/.config/jackify")
        self.config_file = os.path.join(self.config_dir, "config.json")
        self.settings = {
            "version": "0.0.5",
            "last_selected_modlist": None,
            "steam_libraries": [],
            "resolution": None,
            "protontricks_path": None,
            "steam_path": None,
            "nexus_api_key": None,  # Base64 encoded API key
            "default_install_parent_dir": None,  # Parent directory for modlist installations
            "default_download_parent_dir": None,  # Parent directory for downloads
            "modlist_install_base_dir": os.path.expanduser("~/Games"),  # Configurable base directory for modlist installations
            "modlist_downloads_base_dir": os.path.expanduser("~/Games/Modlist_Downloads")  # Configurable base directory for downloads
        }

        # Load configuration if it exists
        self._load_config()

        # If steam_path is not set, detect it
        if not self.settings["steam_path"]:
            self.settings["steam_path"] = self._detect_steam_path()
            # Save the updated settings
            self.save_config()

    def _detect_steam_path(self):
        """
        Detect the Steam installation path

        Returns:
            str: Path to the Steam installation or None if not found
        """
        logger.info("Detecting Steam installation path...")

        # Common Steam installation paths
        steam_paths = [
            os.path.expanduser("~/.steam/steam"),
            os.path.expanduser("~/.local/share/Steam"),
            os.path.expanduser("~/.steam/root")
        ]

        # Check each path
        for path in steam_paths:
            if os.path.exists(path):
                logger.info(f"Found Steam installation at: {path}")
                return path

        # If not found in common locations, try to find it using libraryfolders.vdf
        libraryfolders_vdf_paths = [
            os.path.expanduser("~/.steam/steam/config/libraryfolders.vdf"),
            os.path.expanduser("~/.local/share/Steam/config/libraryfolders.vdf"),
            os.path.expanduser("~/.steam/root/config/libraryfolders.vdf")
        ]

        for vdf_path in libraryfolders_vdf_paths:
            if os.path.exists(vdf_path):
                # Extract the Steam path from the libraryfolders.vdf path
                steam_path = os.path.dirname(os.path.dirname(vdf_path))
                logger.info(f"Found Steam installation at: {steam_path}")
                return steam_path

        logger.error("Steam installation not found")
        return None

    def _load_config(self):
        """Load configuration from file"""
        try:
            if os.path.exists(self.config_file):
                with open(self.config_file, 'r') as f:
                    saved_config = json.load(f)
                # Update settings with saved values while preserving defaults
                self.settings.update(saved_config)
                logger.debug("Loaded configuration from file")
            else:
                logger.debug("No configuration file found, using defaults")
                self._create_config_dir()
        except Exception as e:
            logger.error(f"Error loading configuration: {e}")

    def _create_config_dir(self):
        """Create the configuration directory if it doesn't exist"""
        try:
            os.makedirs(self.config_dir, exist_ok=True)
            logger.debug(f"Created configuration directory: {self.config_dir}")
        except Exception as e:
            logger.error(f"Error creating configuration directory: {e}")

    def save_config(self):
        """Save current configuration to file"""
        try:
            self._create_config_dir()
            with open(self.config_file, 'w') as f:
                json.dump(self.settings, f, indent=2)
            logger.debug("Saved configuration to file")
            return True
        except Exception as e:
            logger.error(f"Error saving configuration: {e}")
            return False

    def get(self, key, default=None):
        """Get a configuration value by key"""
        return self.settings.get(key, default)

    def set(self, key, value):
        """Set a configuration value"""
        self.settings[key] = value
        return True

    def update(self, settings_dict):
        """Update multiple configuration values"""
        self.settings.update(settings_dict)
        return True

    def add_steam_library(self, path):
        """Add a Steam library path to configuration"""
        if path not in self.settings["steam_libraries"]:
            self.settings["steam_libraries"].append(path)
            logger.debug(f"Added Steam library: {path}")
            return True
        return False

    def remove_steam_library(self, path):
        """Remove a Steam library path from configuration"""
        if path in self.settings["steam_libraries"]:
            self.settings["steam_libraries"].remove(path)
            logger.debug(f"Removed Steam library: {path}")
            return True
        return False

    def set_resolution(self, width, height):
        """Set preferred resolution"""
        resolution = f"{width}x{height}"
        self.settings["resolution"] = resolution
        logger.debug(f"Set resolution to: {resolution}")
        return True

    def get_resolution(self):
        """Get preferred resolution"""
        return self.settings.get("resolution")

    def set_last_modlist(self, modlist_name):
        """Save the last selected modlist"""
        self.settings["last_selected_modlist"] = modlist_name
        logger.debug(f"Set last selected modlist to: {modlist_name}")
        return True

    def get_last_modlist(self):
        """Get the last selected modlist"""
        return self.settings.get("last_selected_modlist")

    def set_protontricks_path(self, path):
        """Set the path to the protontricks executable"""
        self.settings["protontricks_path"] = path
        logger.debug(f"Set protontricks path to: {path}")
        return True

    def get_protontricks_path(self):
        """Get the path to the protontricks executable"""
        return self.settings.get("protontricks_path")

    def save_api_key(self, api_key):
        """
        Save the Nexus API key with base64 encoding

        Args:
            api_key (str): Plain text API key

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            if api_key:
                # Encode the API key using base64
                encoded_key = base64.b64encode(api_key.encode('utf-8')).decode('utf-8')
                self.settings["nexus_api_key"] = encoded_key
                logger.debug("API key saved successfully")
            else:
                # Clear the API key if empty
                self.settings["nexus_api_key"] = None
                logger.debug("API key cleared")

            return self.save_config()
        except Exception as e:
            logger.error(f"Error saving API key: {e}")
            return False

    def get_api_key(self):
        """
        Retrieve and decode the saved Nexus API key

        Returns:
            str: Decoded API key or None if not saved
        """
        try:
            encoded_key = self.settings.get("nexus_api_key")
            if encoded_key:
                # Decode the base64 encoded key
                decoded_key = base64.b64decode(encoded_key.encode('utf-8')).decode('utf-8')
                return decoded_key
            return None
        except Exception as e:
            logger.error(f"Error retrieving API key: {e}")
            return None

    def has_saved_api_key(self):
        """
        Check if an API key is saved in configuration

        Returns:
            bool: True if API key exists, False otherwise
        """
        return self.settings.get("nexus_api_key") is not None

    def clear_api_key(self):
        """
        Clear the saved API key from configuration

        Returns:
            bool: True if cleared successfully, False otherwise
        """
        try:
            self.settings["nexus_api_key"] = None
            logger.debug("API key cleared from configuration")
            return self.save_config()
        except Exception as e:
            logger.error(f"Error clearing API key: {e}")
            return False

    def save_resolution(self, resolution):
        """
        Save the resolution setting to configuration

        Args:
            resolution (str): Resolution string (e.g., '1920x1080')

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            if resolution and resolution != 'Leave unchanged':
                self.settings["resolution"] = resolution
                logger.debug(f"Resolution saved: {resolution}")
            else:
                # Clear resolution if 'Leave unchanged' or empty
                self.settings["resolution"] = None
                logger.debug("Resolution cleared")

            return self.save_config()
        except Exception as e:
            logger.error(f"Error saving resolution: {e}")
            return False

    def get_saved_resolution(self):
        """
        Retrieve the saved resolution from configuration

        Returns:
            str: Saved resolution or None if not saved
        """
        try:
            resolution = self.settings.get("resolution")
            if resolution:
                logger.debug(f"Retrieved saved resolution: {resolution}")
            else:
                logger.debug("No saved resolution found")
            return resolution
        except Exception as e:
            logger.error(f"Error retrieving resolution: {e}")
            return None

    def has_saved_resolution(self):
        """
        Check if a resolution is saved in configuration

        Returns:
            bool: True if resolution exists, False otherwise
        """
        return self.settings.get("resolution") is not None

    def clear_saved_resolution(self):
        """
        Clear the saved resolution from configuration

        Returns:
            bool: True if cleared successfully, False otherwise
        """
        try:
            self.settings["resolution"] = None
            logger.debug("Resolution cleared from configuration")
            return self.save_config()
        except Exception as e:
            logger.error(f"Error clearing resolution: {e}")
            return False

    def set_default_install_parent_dir(self, path):
        """
        Save the parent directory for modlist installations

        Args:
            path (str): Parent directory path to save

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            if path and os.path.exists(path):
                self.settings["default_install_parent_dir"] = path
                logger.debug(f"Default install parent directory saved: {path}")
                return self.save_config()
            else:
                logger.warning(f"Invalid or non-existent path for install parent directory: {path}")
                return False
        except Exception as e:
            logger.error(f"Error saving install parent directory: {e}")
            return False

    def get_default_install_parent_dir(self):
        """
        Retrieve the saved parent directory for modlist installations

        Returns:
            str: Saved parent directory path or None if not saved
        """
        try:
            path = self.settings.get("default_install_parent_dir")
            if path and os.path.exists(path):
                logger.debug(f"Retrieved default install parent directory: {path}")
                return path
            else:
                logger.debug("No valid default install parent directory found")
                return None
        except Exception as e:
            logger.error(f"Error retrieving install parent directory: {e}")
            return None

    def set_default_download_parent_dir(self, path):
        """
        Save the parent directory for downloads

        Args:
            path (str): Parent directory path to save

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            if path and os.path.exists(path):
                self.settings["default_download_parent_dir"] = path
                logger.debug(f"Default download parent directory saved: {path}")
                return self.save_config()
            else:
                logger.warning(f"Invalid or non-existent path for download parent directory: {path}")
                return False
        except Exception as e:
            logger.error(f"Error saving download parent directory: {e}")
            return False

    def get_default_download_parent_dir(self):
        """
        Retrieve the saved parent directory for downloads

        Returns:
            str: Saved parent directory path or None if not saved
        """
        try:
            path = self.settings.get("default_download_parent_dir")
            if path and os.path.exists(path):
                logger.debug(f"Retrieved default download parent directory: {path}")
                return path
            else:
                logger.debug("No valid default download parent directory found")
                return None
        except Exception as e:
            logger.error(f"Error retrieving download parent directory: {e}")
            return None

    def has_saved_install_parent_dir(self):
        """
        Check if a default install parent directory is saved in configuration

        Returns:
            bool: True if directory exists and is valid, False otherwise
        """
        path = self.settings.get("default_install_parent_dir")
        return path is not None and os.path.exists(path)

    def has_saved_download_parent_dir(self):
        """
        Check if a default download parent directory is saved in configuration

        Returns:
            bool: True if directory exists and is valid, False otherwise
        """
        path = self.settings.get("default_download_parent_dir")
        return path is not None and os.path.exists(path)

    def get_modlist_install_base_dir(self):
        """
        Get the configurable base directory for modlist installations

        Returns:
            str: Base directory path for modlist installations
        """
        return self.settings.get("modlist_install_base_dir", os.path.expanduser("~/Games"))

    def set_modlist_install_base_dir(self, path):
        """
        Set the configurable base directory for modlist installations

        Args:
            path (str): Base directory path to save

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            if path:
                self.settings["modlist_install_base_dir"] = path
                logger.debug(f"Modlist install base directory saved: {path}")
                return self.save_config()
            else:
                logger.warning("Invalid path for modlist install base directory")
                return False
        except Exception as e:
            logger.error(f"Error saving modlist install base directory: {e}")
            return False

    def get_modlist_downloads_base_dir(self):
        """
        Get the configurable base directory for modlist downloads

        Returns:
            str: Base directory path for modlist downloads
        """
        return self.settings.get("modlist_downloads_base_dir", os.path.expanduser("~/Games/Modlist_Downloads"))

    def set_modlist_downloads_base_dir(self, path):
        """
        Set the configurable base directory for modlist downloads

        Args:
            path (str): Base directory path to save

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            if path:
                self.settings["modlist_downloads_base_dir"] = path
                logger.debug(f"Modlist downloads base directory saved: {path}")
                return self.save_config()
            else:
                logger.warning("Invalid path for modlist downloads base directory")
                return False
        except Exception as e:
            logger.error(f"Error saving modlist downloads base directory: {e}")
            return False
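The API-key handling above is a base64 round-trip, which is obfuscation rather than encryption: anyone who can read `config.json` can recover the key. A minimal standalone sketch of the same scheme (function names here are illustrative, not part of ConfigHandler):

```python
import base64

def encode_api_key(api_key: str) -> str:
    # Same scheme as ConfigHandler.save_api_key: base64, not encryption.
    return base64.b64encode(api_key.encode('utf-8')).decode('utf-8')

def decode_api_key(encoded: str) -> str:
    # Mirrors ConfigHandler.get_api_key
    return base64.b64decode(encoded.encode('utf-8')).decode('utf-8')

print(decode_api_key(encode_api_key("nexus-key-123")))  # → nexus-key-123
```

Storing the key this way keeps it out of casual view in the config file; anything stronger would require a real secrets store (e.g. the desktop keyring).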
208
jackify/backend/handlers/diagnostic_helper.py
Normal file
@@ -0,0 +1,208 @@
#!/usr/bin/env python3
"""
Jackify Performance Diagnostic Helper

This utility helps diagnose whether performance issues are in:
1. jackify-engine (.NET binary) - stalls, memory leaks, etc.
2. jackify (Python wrapper) - subprocess handling, threading issues

Usage: python -m jackify.backend.handlers.diagnostic_helper
"""

import time
import psutil
import subprocess
import logging
import sys
from pathlib import Path
from typing import List, Dict, Any


def find_jackify_engine_processes() -> List[Dict[str, Any]]:
    """Find all running jackify-engine and magick (ImageMagick) processes."""
    processes = []
    for proc in psutil.process_iter(['pid', 'name', 'cmdline', 'create_time', 'cpu_percent', 'memory_info']):
        try:
            if (
                'jackify-engine' in proc.info['name'] or
                any('jackify-engine' in arg for arg in (proc.info['cmdline'] or [])) or
                proc.info['name'] == 'magick' or
                any('magick' in arg for arg in (proc.info['cmdline'] or []))
            ):
                processes.append({
                    'pid': proc.info['pid'],
                    'name': proc.info['name'],
                    'cmdline': ' '.join(proc.info['cmdline'] or []),
                    'age_seconds': time.time() - proc.info['create_time'],
                    'cpu_percent': proc.info['cpu_percent'],
                    'memory_mb': proc.info['memory_info'].rss / (1024 * 1024) if proc.info['memory_info'] else 0,
                    'process': proc
                })
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    return processes


def diagnose_stalled_engine(pid: int, duration: int = 60) -> Dict[str, Any]:
    """Monitor a specific jackify-engine process for stalls."""
    try:
        proc = psutil.Process(pid)
    except psutil.NoSuchProcess:
        return {"error": f"Process {pid} not found"}

    print(f"Monitoring jackify-engine PID {pid} for {duration} seconds...")

    samples = []
    start_time = time.time()

    while time.time() - start_time < duration:
        try:
            sample = {
                'timestamp': time.time(),
                'cpu_percent': proc.cpu_percent(),
                'memory_mb': proc.memory_info().rss / (1024 * 1024),
                'thread_count': proc.num_threads(),
                'status': proc.status()
            }

            try:
                sample['fd_count'] = proc.num_fds()
            except (psutil.AccessDenied, AttributeError):
                sample['fd_count'] = 0

            samples.append(sample)

            # Real-time status
            status_icon = "🟢" if sample['cpu_percent'] > 10 else "🟡" if sample['cpu_percent'] > 2 else "🔴"
            print(f"{status_icon} CPU: {sample['cpu_percent']:5.1f}% | Memory: {sample['memory_mb']:6.1f}MB | "
                  f"Threads: {sample['thread_count']:2d} | Status: {sample['status']}")

            time.sleep(2)

        except psutil.NoSuchProcess:
            print("Process terminated during monitoring")
            break
        except Exception as e:
            print(f"Error monitoring process: {e}")
            break

    if not samples:
        return {"error": "No samples collected"}

    # Analyze results
    cpu_values = [s['cpu_percent'] for s in samples]
    memory_values = [s['memory_mb'] for s in samples]

    low_cpu_samples = [s for s in samples if s['cpu_percent'] < 5]
    stall_duration = len(low_cpu_samples) * 2  # 2-second sampling intervals

    diagnosis = {
        'samples': len(samples),
        'avg_cpu': sum(cpu_values) / len(cpu_values),
        'max_cpu': max(cpu_values),
        'min_cpu': min(cpu_values),
        'avg_memory_mb': sum(memory_values) / len(memory_values),
        'max_memory_mb': max(memory_values),
        'low_cpu_samples': len(low_cpu_samples),
        'stall_duration_seconds': stall_duration,
        'thread_count_final': samples[-1]['thread_count'] if samples else 0,
        'likely_stalled': stall_duration > 30 and sum(cpu_values[-5:]) / 5 < 5,  # low CPU over the last 10 seconds
    }

    return diagnosis


def check_system_resources() -> Dict[str, Any]:
|
||||
"""Check overall system resources that might affect performance."""
|
||||
return {
|
||||
'total_memory_gb': psutil.virtual_memory().total / (1024**3),
|
||||
'available_memory_gb': psutil.virtual_memory().available / (1024**3),
|
||||
'memory_percent': psutil.virtual_memory().percent,
|
||||
'cpu_count': psutil.cpu_count(),
|
||||
'cpu_percent_overall': psutil.cpu_percent(interval=1),
|
||||
'disk_usage_percent': psutil.disk_usage('/').percent,
|
||||
'load_average': psutil.getloadavg() if hasattr(psutil, 'getloadavg') else None,
|
||||
}
|
||||
|
||||
|
||||
def main():
|
||||
"""Main diagnostic routine."""
|
||||
print("Jackify Performance Diagnostic Tool")
|
||||
print("=" * 50)
|
||||
|
||||
# Check for running engines and magick processes
|
||||
engines = find_jackify_engine_processes()
|
||||
|
||||
if not engines:
|
||||
print("No jackify-engine or magick processes found running")
|
||||
print("\nTo use this tool:")
|
||||
print("1. Start a modlist installation in Jackify")
|
||||
print("2. Run this diagnostic while the installation is active")
|
||||
return
|
||||
|
||||
print(f"Found {len(engines)} relevant process(es):")
|
||||
for engine in engines:
|
||||
age_min = engine['age_seconds'] / 60
|
||||
print(f" PID {engine['pid']}: {engine['name']} {engine['cpu_percent']:.1f}% CPU, "
|
||||
f"{engine['memory_mb']:.1f}MB RAM, running {age_min:.1f} minutes, CMD: {engine['cmdline']}")
|
||||
|
||||
# Check system resources
|
||||
print("\nSystem Resources:")
|
||||
sys_info = check_system_resources()
|
||||
print(f" Memory: {sys_info['memory_percent']:.1f}% used "
|
||||
f"({sys_info['available_memory_gb']:.1f}GB / {sys_info['total_memory_gb']:.1f}GB available)")
|
||||
print(f" CPU: {sys_info['cpu_percent_overall']:.1f}% overall, {sys_info['cpu_count']} cores")
|
||||
print(f" Disk: {sys_info['disk_usage_percent']:.1f}% used")
|
||||
if sys_info['load_average']:
|
||||
print(f" Load average: {sys_info['load_average']}")
|
||||
|
||||
# Focus on the engine with highest CPU usage (likely active)
|
||||
active_engine = max(engines, key=lambda x: x['cpu_percent'])
|
||||
|
||||
print(f"\nMonitoring most active engine (PID {active_engine['pid']}) for stalls...")
|
||||
|
||||
try:
|
||||
diagnosis = diagnose_stalled_engine(active_engine['pid'], duration=60)
|
||||
|
||||
if 'error' in diagnosis:
|
||||
print(f"Error: {diagnosis['error']}")
|
||||
return
|
||||
|
||||
print(f"\n📊 Diagnosis Results:")
|
||||
print(f" Average CPU: {diagnosis['avg_cpu']:.1f}% (Range: {diagnosis['min_cpu']:.1f}% - {diagnosis['max_cpu']:.1f}%)")
|
||||
print(f" Memory usage: {diagnosis['avg_memory_mb']:.1f}MB (Peak: {diagnosis['max_memory_mb']:.1f}MB)")
|
||||
print(f" Low CPU samples: {diagnosis['low_cpu_samples']}/{diagnosis['samples']} "
|
||||
f"(stalled for {diagnosis['stall_duration_seconds']}s)")
|
||||
print(f" Thread count: {diagnosis['thread_count_final']}")
|
||||
|
||||
# Provide diagnosis
|
||||
print(f"\n[DIAGNOSIS]:")
|
||||
if diagnosis['likely_stalled']:
|
||||
print("[ERROR] ENGINE STALL DETECTED")
|
||||
print(" - jackify-engine process shows sustained low CPU usage")
|
||||
print(" - This indicates an issue in the .NET Wabbajack engine, not the Python wrapper")
|
||||
print(" - Recommendation: Report this to the Wabbajack team as a jackify-engine issue")
|
||||
elif diagnosis['avg_cpu'] > 50:
|
||||
print("[OK] Engine appears to be working normally (high CPU activity)")
|
||||
elif diagnosis['avg_cpu'] > 10:
|
||||
print("[WARNING] Engine showing moderate activity - may be normal for current operation")
|
||||
else:
|
||||
print("[WARNING] Engine showing low activity - monitor for longer or check if installation completed")
|
||||
|
||||
# System-level issues
|
||||
if sys_info['memory_percent'] > 90:
|
||||
print("[WARNING] System memory critically low - may cause stalls")
|
||||
elif sys_info['memory_percent'] > 80:
|
||||
print("[CAUTION] System memory usage high")
|
||||
|
||||
if sys_info['cpu_percent_overall'] > 90:
|
||||
print("[WARNING] System CPU usage very high - may indicate system-wide issue")
|
||||
|
||||
except KeyboardInterrupt:
|
||||
print("\n\n[STOPPED] Monitoring interrupted by user")
|
||||
except Exception as e:
|
||||
print(f"\n[ERROR] Error during diagnosis: {e}")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
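The `likely_stalled` heuristic above combines two signals: total time spent below the low-CPU threshold, and the average of the most recent samples. As an illustration only (not repository code; `likely_stalled` here is a hypothetical standalone reimplementation of the same threshold logic, assuming 2-second sampling):

```python
def likely_stalled(cpu_samples, interval_s=2, low_cpu=5, min_stall_s=30):
    """Mirror diagnose_stalled_engine's heuristic: enough low-CPU time
    overall, plus low average CPU across the most recent five samples."""
    stall_seconds = sum(1 for c in cpu_samples if c < low_cpu) * interval_s
    recent_avg = sum(cpu_samples[-5:]) / min(len(cpu_samples), 5)
    return stall_seconds > min_stall_s and recent_avg < low_cpu

# A run that starts busy and then flatlines is flagged; a busy run is not.
print(likely_stalled([90, 85] + [1] * 18))       # True
print(likely_stalled([80, 90, 85, 70, 95, 88]))  # False
```

Requiring both conditions means a process that was briefly idle earlier but is busy now is not misreported as stalled.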
338
jackify/backend/handlers/engine_monitor.py
Normal file
@@ -0,0 +1,338 @@
"""
Engine Performance Monitor

Monitors the jackify-engine process for performance issues like CPU stalls,
memory problems, and excessive I/O wait times.
"""

import time
import threading
import psutil
import logging
import os
from typing import Optional, Dict, Any, Callable
from dataclasses import dataclass
from enum import Enum


class PerformanceState(Enum):
    NORMAL = "normal"
    STALLED = "stalled"
    HIGH_MEMORY = "high_memory"
    HIGH_IO_WAIT = "high_io_wait"
    ZOMBIE = "zombie"


@dataclass
class PerformanceMetrics:
    timestamp: float
    cpu_percent: float
    memory_percent: float
    memory_mb: float
    io_read_mb: float
    io_write_mb: float
    thread_count: int
    fd_count: int
    state: PerformanceState

    # Additional diagnostics for the engine-vs-wrapper distinction
    parent_cpu_percent: Optional[float] = None
    parent_memory_mb: Optional[float] = None
    engine_responsive: bool = True

    # ImageMagick resource usage
    magick_cpu_percent: float = 0.0
    magick_memory_mb: float = 0.0


class EnginePerformanceMonitor:
    """
    Monitors jackify-engine process performance and detects common stall patterns.

    This is designed to help diagnose the issue where extraction starts at 80-100% CPU
    but drops to 2% after ~5 minutes and requires manual kills.

    Also monitors the parent Python process to distinguish engine issues from wrapper issues.
    """

    def __init__(self,
                 logger: Optional[logging.Logger] = None,
                 stall_threshold: float = 5.0,    # CPU below this % for stall_duration = stall
                 stall_duration: float = 120.0,   # seconds of low CPU = stall
                 memory_threshold: float = 85.0,  # % memory usage threshold
                 sample_interval: float = 5.0):   # seconds between samples

        self.logger = logger or logging.getLogger(__name__)
        self.stall_threshold = stall_threshold
        self.stall_duration = stall_duration
        self.memory_threshold = memory_threshold
        self.sample_interval = sample_interval

        self._process: Optional[psutil.Process] = None
        self._parent_process: Optional[psutil.Process] = None
        self._monitoring = False
        self._monitor_thread: Optional[threading.Thread] = None
        self._metrics_history: list[PerformanceMetrics] = []
        self._callbacks: list[Callable[[PerformanceMetrics], None]] = []

        # Performance state tracking
        self._low_cpu_start_time: Optional[float] = None
        self._last_io_read = 0
        self._last_io_write = 0

    def add_callback(self, callback: Callable[[PerformanceMetrics], None]):
        """Add a callback to receive performance metrics updates."""
        self._callbacks.append(callback)

    def start_monitoring(self, pid: int) -> bool:
        """Start monitoring the given process ID."""
        try:
            self._process = psutil.Process(pid)

            # Also monitor the parent Python process for comparison
            try:
                self._parent_process = psutil.Process(os.getpid())
            except Exception:
                self._parent_process = None

            self._monitoring = True
            self._monitor_thread = threading.Thread(target=self._monitor_loop, daemon=True)
            self._monitor_thread.start()

            process_name = self._process.name() if self._process else "unknown"
            self.logger.info(f"Started performance monitoring for PID {pid} ({process_name}) "
                             f"(stall threshold: {self.stall_threshold}% CPU for {self.stall_duration}s)")
            return True

        except psutil.NoSuchProcess:
            self.logger.error(f"Process {pid} not found")
            return False
        except Exception as e:
            self.logger.error(f"Failed to start monitoring PID {pid}: {e}")
            return False

    def stop_monitoring(self):
        """Stop monitoring the process."""
        self._monitoring = False
        if self._monitor_thread and self._monitor_thread.is_alive():
            self._monitor_thread.join(timeout=10)

    def get_metrics_summary(self) -> Dict[str, Any]:
        """Get a summary of collected metrics."""
        if not self._metrics_history:
            return {}

        cpu_values = [m.cpu_percent for m in self._metrics_history]
        memory_values = [m.memory_mb for m in self._metrics_history]

        stalled_count = sum(1 for m in self._metrics_history if m.state == PerformanceState.STALLED)

        # Engine vs wrapper analysis
        engine_avg_cpu = sum(cpu_values) / len(cpu_values)
        parent_cpu_values = [m.parent_cpu_percent for m in self._metrics_history if m.parent_cpu_percent is not None]
        parent_avg_cpu = sum(parent_cpu_values) / len(parent_cpu_values) if parent_cpu_values else 0

        return {
            "total_samples": len(self._metrics_history),
            "monitoring_duration": self._metrics_history[-1].timestamp - self._metrics_history[0].timestamp,

            # Engine process metrics
            "engine_avg_cpu_percent": engine_avg_cpu,
            "engine_max_cpu_percent": max(cpu_values),
            "engine_min_cpu_percent": min(cpu_values),
            "engine_avg_memory_mb": sum(memory_values) / len(memory_values),
            "engine_max_memory_mb": max(memory_values),

            # Parent process metrics (for comparison)
            "parent_avg_cpu_percent": parent_avg_cpu,

            # Stall analysis
            "stalled_samples": stalled_count,
            "stall_percentage": (stalled_count / len(self._metrics_history)) * 100,

            # Diagnosis hints
            "likely_engine_issue": engine_avg_cpu < 10 and parent_avg_cpu < 5,
            "likely_wrapper_issue": engine_avg_cpu > 20 and parent_avg_cpu > 50,
        }

    def _monitor_loop(self):
        """Main monitoring loop."""
        while self._monitoring:
            try:
                if not self._process or not self._process.is_running():
                    self.logger.warning("Monitored engine process is no longer running")
                    break

                metrics = self._collect_metrics()
                self._metrics_history.append(metrics)

                # Notify callbacks
                for callback in self._callbacks:
                    try:
                        callback(metrics)
                    except Exception as e:
                        self.logger.error(f"Error in performance callback: {e}")

                # Log significant events with engine-vs-wrapper context
                if metrics.state == PerformanceState.STALLED:
                    parent_info = ""
                    if metrics.parent_cpu_percent is not None:
                        parent_info = f", Python wrapper: {metrics.parent_cpu_percent:.1f}% CPU"

                    self.logger.warning(f"🚨 ENGINE STALL DETECTED: jackify-engine CPU at {metrics.cpu_percent:.1f}% "
                                        f"for {self.stall_duration}s+ (Memory: {metrics.memory_mb:.1f}MB, "
                                        f"Threads: {metrics.thread_count}, FDs: {metrics.fd_count}{parent_info})")

                    # Provide a diagnosis hint
                    if metrics.parent_cpu_percent and metrics.parent_cpu_percent > 10:
                        self.logger.warning("Warning: Python wrapper still active - likely jackify-engine (.NET) issue")
                    else:
                        self.logger.warning("Warning: Both processes low CPU - possible system-wide issue")

                elif metrics.state == PerformanceState.HIGH_MEMORY:
                    self.logger.warning(f"HIGH MEMORY USAGE in jackify-engine: {metrics.memory_percent:.1f}% "
                                        f"({metrics.memory_mb:.1f}MB)")

                time.sleep(self.sample_interval)

            except psutil.NoSuchProcess:
                self.logger.info("Monitored engine process terminated")
                break
            except Exception as e:
                self.logger.error(f"Error in monitoring loop: {e}")
                time.sleep(self.sample_interval)

    def _collect_metrics(self) -> PerformanceMetrics:
        """Collect current performance metrics."""
        now = time.time()

        # Get basic process info for the engine
        cpu_percent = self._process.cpu_percent()
        memory_info = self._process.memory_info()
        memory_mb = memory_info.rss / (1024 * 1024)
        memory_percent = self._process.memory_percent()

        # Get parent process info for comparison
        parent_cpu_percent = None
        parent_memory_mb = None
        if self._parent_process:
            try:
                parent_cpu_percent = self._parent_process.cpu_percent()
                parent_memory_info = self._parent_process.memory_info()
                parent_memory_mb = parent_memory_info.rss / (1024 * 1024)
            except Exception:
                pass

        # Get I/O info
        try:
            io_counters = self._process.io_counters()
            io_read_mb = io_counters.read_bytes / (1024 * 1024)
            io_write_mb = io_counters.write_bytes / (1024 * 1024)
        except (psutil.AccessDenied, AttributeError):
            io_read_mb = 0
            io_write_mb = 0

        # Get thread and file descriptor counts
        try:
            thread_count = self._process.num_threads()
        except (psutil.AccessDenied, AttributeError):
            thread_count = 0

        try:
            fd_count = self._process.num_fds()
        except (psutil.AccessDenied, AttributeError):
            fd_count = 0

        # Determine the performance state
        state = self._determine_state(cpu_percent, memory_percent, now)

        # Aggregate ImageMagick ('magick') child process usage
        magick_cpu = 0.0
        magick_mem = 0.0
        try:
            for child in self._process.children(recursive=True):
                try:
                    if child.name() == 'magick' or 'magick' in ' '.join(child.cmdline()):
                        magick_cpu += child.cpu_percent()
                        magick_mem += child.memory_info().rss / (1024 * 1024)
                except Exception:
                    continue
        except Exception:
            pass

        return PerformanceMetrics(
            timestamp=now,
            cpu_percent=cpu_percent,
            memory_percent=memory_percent,
            memory_mb=memory_mb,
            io_read_mb=io_read_mb,
            io_write_mb=io_write_mb,
            thread_count=thread_count,
            fd_count=fd_count,
            state=state,
            parent_cpu_percent=parent_cpu_percent,
            parent_memory_mb=parent_memory_mb,
            engine_responsive=cpu_percent > self.stall_threshold or (now - self._low_cpu_start_time if self._low_cpu_start_time else 0) < self.stall_duration,
            magick_cpu_percent=magick_cpu,
            magick_memory_mb=magick_mem
        )

    def _determine_state(self, cpu_percent: float, memory_percent: float, timestamp: float) -> PerformanceState:
        """Determine the current performance state."""

        # Check for high memory usage
        if memory_percent > self.memory_threshold:
            return PerformanceState.HIGH_MEMORY

        # Check for a CPU stall
        if cpu_percent < self.stall_threshold:
            if self._low_cpu_start_time is None:
                self._low_cpu_start_time = timestamp
            elif timestamp - self._low_cpu_start_time >= self.stall_duration:
                return PerformanceState.STALLED
        else:
            # CPU is above the threshold, reset the stall timer
            self._low_cpu_start_time = None

        return PerformanceState.NORMAL


def create_debug_callback(logger: logging.Logger) -> Callable[[PerformanceMetrics], None]:
    """Create a callback that logs detailed performance metrics for debugging."""

    def debug_callback(metrics: PerformanceMetrics):
        parent_info = f", Python: {metrics.parent_cpu_percent:.1f}%" if metrics.parent_cpu_percent else ""
        magick_info = (f", Magick: {metrics.magick_cpu_percent:.1f}% CPU, {metrics.magick_memory_mb:.1f}MB RAM"
                       if metrics.magick_cpu_percent or metrics.magick_memory_mb else "")
        logger.debug(f"Engine Performance: jackify-engine CPU={metrics.cpu_percent:.1f}%, "
                     f"Memory={metrics.memory_mb:.1f}MB ({metrics.memory_percent:.1f}%), "
                     f"Threads={metrics.thread_count}, FDs={metrics.fd_count}, "
                     f"State={metrics.state.value}{parent_info}{magick_info}")

    return debug_callback


def create_stall_alert_callback(logger: logging.Logger,
                                alert_func: Optional[Callable[[str], None]] = None
                                ) -> Callable[[PerformanceMetrics], None]:
    """Create a callback that alerts when performance issues are detected."""

    def alert_callback(metrics: PerformanceMetrics):
        if metrics.state in [PerformanceState.STALLED, PerformanceState.HIGH_MEMORY]:

            # Provide context about engine vs wrapper
            if metrics.state == PerformanceState.STALLED:
                if metrics.parent_cpu_percent and metrics.parent_cpu_percent > 10:
                    issue_type = "jackify-engine (.NET binary) stalled"
                else:
                    issue_type = "system-wide performance issue"
            else:
                issue_type = metrics.state.value.upper()

            message = (f"{issue_type} - Engine CPU: {metrics.cpu_percent:.1f}%, "
                       f"Memory: {metrics.memory_mb:.1f}MB")

            logger.warning(message)
            if alert_func:
                alert_func(message)

    return alert_callback
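The stall-timer logic in `_determine_state` can be sketched as a standalone state machine, an illustration only, not repository code; `StallTracker` is a hypothetical name:

```python
class StallTracker:
    """Tracks consecutive low-CPU time; mirrors _determine_state's timer reset."""

    def __init__(self, stall_threshold=5.0, stall_duration=120.0):
        self.stall_threshold = stall_threshold
        self.stall_duration = stall_duration
        self._low_cpu_start = None

    def update(self, cpu_percent, timestamp):
        if cpu_percent < self.stall_threshold:
            if self._low_cpu_start is None:
                self._low_cpu_start = timestamp       # low-CPU streak begins
            elif timestamp - self._low_cpu_start >= self.stall_duration:
                return "stalled"                      # streak exceeded the window
        else:
            self._low_cpu_start = None                # any busy sample resets the timer
        return "normal"

tracker = StallTracker(stall_duration=10.0)
cpu_trace = [90, 2, 2, 80] + [1] * 11  # one sample per second
states = [tracker.update(cpu, t) for t, cpu in enumerate(cpu_trace)]
print(states[-1])  # stalled: CPU stayed under 5% from t=4 through t=14
```

The key design point is the reset on any busy sample: a stall is only reported when low CPU persists for the full window, so brief idle gaps between work phases never trigger an alert.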
900
jackify/backend/handlers/filesystem_handler.py
Normal file
@@ -0,0 +1,900 @@
"""
|
||||
FileSystemHandler module for managing file system operations.
|
||||
This module handles path normalization, validation, and file operations.
|
||||
"""
|
||||
|
||||
import os
|
||||
import shutil
|
||||
import logging
|
||||
from pathlib import Path
|
||||
from typing import Optional, List, Dict, Tuple
|
||||
from datetime import datetime
|
||||
import re
|
||||
import time
|
||||
import subprocess # Needed for running sudo commands
|
||||
import pwd # To get user name
|
||||
import grp # To get group name
|
||||
import requests # Import requests
|
||||
import vdf # Import VDF library at the top level
|
||||
from jackify.shared.colors import COLOR_PROMPT, COLOR_RESET
|
||||
|
||||
# Initialize logger for the module
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
class FileSystemHandler:
|
||||
def __init__(self):
|
||||
# Keep instance logger if needed, but static methods use module logger
|
||||
self.logger = logging.getLogger(__name__)
|
||||
|
||||
@staticmethod
|
||||
def normalize_path(path: str) -> Path:
|
||||
"""Normalize a path string to a Path object."""
|
||||
try:
|
||||
if path.startswith('~'):
|
||||
path = os.path.expanduser(path)
|
||||
path = os.path.abspath(path)
|
||||
return Path(path)
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to normalize path {path}: {e}")
|
||||
return Path(path) # Return original path as Path object on error
|
||||
|
||||
@staticmethod
|
||||
def validate_path(path: Path) -> bool:
|
||||
"""Validate if a path exists and is accessible."""
|
||||
try:
|
||||
if not path.exists():
|
||||
logger.warning(f"Validation failed: Path does not exist - {path}")
|
||||
return False
|
||||
# Check read access
|
||||
if not os.access(path, os.R_OK):
|
||||
logger.warning(f"Validation failed: No read access - {path}")
|
||||
return False
|
||||
# Check write access (important for many operations)
|
||||
# For directories, check write on parent; for files, check write on file itself
|
||||
if path.is_dir():
|
||||
if not os.access(path, os.W_OK):
|
||||
logger.warning(f"Validation failed: No write access to directory - {path}")
|
||||
return False
|
||||
elif path.is_file():
|
||||
# Check write access to the parent directory for file creation/modification
|
||||
if not os.access(path.parent, os.W_OK):
|
||||
logger.warning(f"Validation failed: No write access to parent dir of file - {path.parent}")
|
||||
return False
|
||||
return True # Passed existence and access checks
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to validate path {path}: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def ensure_directory(path: Path) -> bool:
|
||||
"""Ensure a directory exists, create if it doesn't."""
|
||||
try:
|
||||
path.mkdir(parents=True, exist_ok=True)
|
||||
logger.debug(f"Ensured directory exists: {path}")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to ensure directory {path}: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def backup_file(file_path: Path, backup_dir: Optional[Path] = None) -> Optional[Path]:
|
||||
"""Create a backup of a file with timestamp."""
|
||||
try:
|
||||
if not file_path.is_file():
|
||||
logger.error(f"Backup failed: Source is not a file - {file_path}")
|
||||
return None
|
||||
|
||||
if backup_dir is None:
|
||||
backup_dir = file_path.parent / "backups"
|
||||
|
||||
FileSystemHandler.ensure_directory(backup_dir)
|
||||
|
||||
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
|
||||
backup_path = backup_dir / f"{file_path.stem}_{timestamp}{file_path.suffix}"
|
||||
|
||||
shutil.copy2(file_path, backup_path)
|
||||
logger.info(f"File backed up to: {backup_path}")
|
||||
return backup_path
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to backup file {file_path}: {e}")
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
def restore_backup(backup_path: Path, target_path: Path) -> bool:
|
||||
"""Restore a file from backup, backing up the current target first."""
|
||||
try:
|
||||
if not backup_path.is_file():
|
||||
logger.error(f"Restore failed: Backup source is not a file - {backup_path}")
|
||||
return False
|
||||
|
||||
if target_path.exists():
|
||||
logger.warning(f"Target file exists, creating backup before restore: {target_path}")
|
||||
FileSystemHandler.backup_file(target_path)
|
||||
|
||||
# Ensure target directory exists
|
||||
FileSystemHandler.ensure_directory(target_path.parent)
|
||||
|
||||
shutil.copy2(backup_path, target_path)
|
||||
logger.info(f"Restored {backup_path} to {target_path}")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to restore backup {backup_path} to {target_path}: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def find_latest_backup(original_file_path: Path) -> Optional[Path]:
|
||||
"""Finds the most recent backup file for a given original file path."""
|
||||
if not original_file_path.exists():
|
||||
logger.warning(f"Cannot find backups for non-existent file: {original_file_path}")
|
||||
return None
|
||||
|
||||
backup_dir = original_file_path.parent / "backups"
|
||||
if not backup_dir.is_dir():
|
||||
logger.debug(f"Backup directory not found: {backup_dir}")
|
||||
return None
|
||||
|
||||
file_stem = original_file_path.stem
|
||||
file_suffix = original_file_path.suffix
|
||||
|
||||
# Look for timestamped backups first (e.g., shortcuts_20230101_120000.vdf)
|
||||
# Adjusted glob pattern to match the format used in backup_file
|
||||
timestamp_pattern = f"{file_stem}_*_*{file_suffix}"
|
||||
timestamped_backups = list(backup_dir.glob(timestamp_pattern))
|
||||
|
||||
latest_backup_path = None
|
||||
latest_timestamp = 0
|
||||
|
||||
if timestamped_backups:
|
||||
logger.debug(f"Found potential timestamped backups: {timestamped_backups}")
|
||||
for backup_path in timestamped_backups:
|
||||
# Extract timestamp from filename (e.g., stem_YYYYMMDD_HHMMSS.suffix)
|
||||
try:
|
||||
name_parts = backup_path.stem.split('_')
|
||||
if len(name_parts) >= 3:
|
||||
# Combine date and time parts for parsing
|
||||
timestamp_str = f"{name_parts[-2]}_{name_parts[-1]}"
|
||||
backup_time = datetime.strptime(timestamp_str, "%Y%m%d_%H%M%S").timestamp()
|
||||
if backup_time > latest_timestamp:
|
||||
latest_timestamp = backup_time
|
||||
latest_backup_path = backup_path
|
||||
else:
|
||||
logger.warning(f"Could not parse timestamp from backup filename: {backup_path.name}")
|
||||
except (ValueError, IndexError) as e:
|
||||
logger.warning(f"Error parsing timestamp from {backup_path.name}: {e}")
|
||||
|
||||
if latest_backup_path:
|
||||
logger.info(f"Latest timestamped backup found: {latest_backup_path}")
|
||||
return latest_backup_path
|
||||
|
||||
# If no timestamped backup found, check for simple .bak file
|
||||
simple_backup_path = backup_dir / f"{original_file_path.name}.bak"
|
||||
# Correction: Simple backup might be in the *same* directory, not backup_dir
|
||||
simple_backup_path_alt = original_file_path.with_suffix(f"{file_suffix}.bak")
|
||||
|
||||
if simple_backup_path_alt.is_file():
|
||||
logger.info(f"Found simple backup file: {simple_backup_path_alt}")
|
||||
return simple_backup_path_alt
|
||||
elif simple_backup_path.is_file(): # Check in backup dir as fallback
|
||||
logger.info(f"Found simple backup file in backup dir: {simple_backup_path}")
|
||||
return simple_backup_path
|
||||
|
||||
logger.warning(f"No suitable backup found for {original_file_path} in {backup_dir} or adjacent.")
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
def set_permissions(path: Path, permissions: int = 0o755, recursive: bool = True) -> bool:
|
||||
"""Set file or directory permissions (non-sudo)."""
|
||||
try:
|
||||
if not path.exists():
|
||||
logger.error(f"Cannot set permissions: Path does not exist - {path}")
|
||||
return False
|
||||
|
||||
if recursive and path.is_dir():
|
||||
for root, dirs, files in os.walk(path):
|
||||
try:
|
||||
os.chmod(root, 0o755) # Dirs typically 755
|
||||
except Exception as dir_e:
|
||||
logger.warning(f"Failed to chmod dir {root}: {dir_e}")
|
||||
for file in files:
|
||||
try:
|
||||
os.chmod(os.path.join(root, file), 0o644) # Files typically 644
|
||||
except Exception as file_e:
|
||||
logger.warning(f"Failed to chmod file {os.path.join(root, file)}: {file_e}")
|
||||
elif path.is_file():
|
||||
os.chmod(path, 0o644 if permissions == 0o755 else permissions) # Default file perms 644
|
||||
elif path.is_dir():
|
||||
os.chmod(path, permissions) # Set specific perm for top-level dir if not recursive
|
||||
logger.debug(f"Set permissions for {path} (recursive={recursive})")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to set permissions for {path}: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def get_permissions(path: Path) -> Optional[int]:
|
||||
"""Get file or directory permissions (last 3 octal digits)."""
|
||||
try:
|
||||
return os.stat(path).st_mode & 0o777
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to get permissions for {path}: {e}")
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
def is_sd_card(path: Path) -> bool:
|
||||
"""Check if a path likely resides on an SD card based on common mount points."""
|
||||
try:
|
||||
# Get the absolute path to resolve symlinks etc.
|
||||
abs_path_str = str(path.resolve())
|
||||
|
||||
# Common SD card mount patterns/devices on Linux/Steam Deck
|
||||
sd_patterns = [
|
||||
"/run/media/mmcblk",
|
||||
"/media/mmcblk",
|
||||
"/dev/mmcblk"
|
||||
]
|
||||
|
||||
# Check if path starts with known mount points
|
||||
for pattern in sd_patterns:
|
||||
if abs_path_str.startswith(pattern):
|
||||
logger.debug(f"Path {path} matches SD card pattern: {pattern}")
|
||||
return True
|
||||
|
||||
# Less reliable: Check mount point info (can be slow/complex)
|
||||
# try:
|
||||
# # ... (logic using /proc/mounts or df command) ...
|
||||
# except Exception as mount_e:
|
||||
# logger.warning(f"Could not reliably check mount point for {path}: {mount_e}")
|
||||
|
||||
logger.debug(f"Path {path} does not appear to be on a standard SD card mount.")
|
||||
return False
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error checking if path is on SD card: {e}")
|
||||
return False # Default to False on error
|
||||
|
||||
@staticmethod
|
||||
def get_directory_size(path: Path) -> Optional[int]:
|
||||
"""Get the total size of a directory in bytes."""
|
||||
try:
|
||||
total_size = 0
|
||||
for entry in os.scandir(path):
|
||||
if entry.is_dir(follow_symlinks=False):
|
||||
total_size += FileSystemHandler.get_directory_size(Path(entry.path)) or 0
|
||||
elif entry.is_file(follow_symlinks=False):
|
||||
total_size += entry.stat().st_size
|
||||
return total_size
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to get directory size for {path}: {e}")
|
||||
return None
|
||||
|
||||
@staticmethod
|
||||
def cleanup_directory(path: Path, age_days: int) -> bool:
|
||||
"""Delete files in a directory older than age_days."""
|
||||
try:
|
||||
if not path.is_dir():
|
||||
logger.error(f"Cleanup failed: Not a directory - {path}")
|
||||
return False
|
||||
|
||||
current_time = time.time()
|
||||
age_seconds = age_days * 86400
|
||||
deleted_count = 0
|
||||
|
||||
for item in path.iterdir():
|
||||
if item.is_file():
|
||||
try:
|
||||
file_age = current_time - item.stat().st_mtime
|
||||
if file_age > age_seconds:
|
||||
item.unlink()
|
||||
logger.debug(f"Deleted old file: {item}")
|
||||
deleted_count += 1
|
||||
except Exception as item_e:
|
||||
logger.warning(f"Could not process/delete file {item}: {item_e}")
|
||||
|
||||
logger.info(f"Cleanup complete for {path}. Deleted {deleted_count} files older than {age_days} days.")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to clean up directory {path}: {e}")
|
||||
return False
|
||||
|
||||
    @staticmethod
    def move_directory(source: Path, destination: Path) -> bool:
        """Move a directory and its contents."""
        try:
            if not source.is_dir():
                logger.error(f"Move failed: Source is not a directory - {source}")
                return False

            FileSystemHandler.ensure_directory(destination.parent)

            shutil.move(str(source), str(destination))  # shutil.move needs strings
            logger.info(f"Moved directory {source} to {destination}")
            return True
        except Exception as e:
            logger.error(f"Failed to move directory {source} to {destination}: {e}")
            return False

    @staticmethod
    def copy_directory(source: Path, destination: Path, dirs_exist_ok=True) -> bool:
        """Copy a directory and its contents."""
        try:
            if not source.is_dir():
                logger.error(f"Copy failed: Source is not a directory - {source}")
                return False

            # shutil.copytree needs destination to NOT exist unless dirs_exist_ok=True (Py 3.8+)
            # Ensure parent exists
            FileSystemHandler.ensure_directory(destination.parent)

            shutil.copytree(source, destination, dirs_exist_ok=dirs_exist_ok)
            logger.info(f"Copied directory {source} to {destination}")
            return True
        except Exception as e:
            logger.error(f"Failed to copy directory {source} to {destination}: {e}")
            return False

    @staticmethod
    def list_directory(path: Path, pattern: Optional[str] = None) -> List[Path]:
        """List contents of a directory, optionally filtering by pattern."""
        try:
            if not path.is_dir():
                logger.error(f"Cannot list: Not a directory - {path}")
                return []

            if pattern:
                return list(path.glob(pattern))
            else:
                return list(path.iterdir())
        except Exception as e:
            logger.error(f"Failed to list directory {path}: {e}")
            return []

    @staticmethod
    def backup_modorganizer(modlist_ini: Path) -> bool:
        """Backs up ModOrganizer.ini and adds a backupPath entry."""
        logger.info(f"Backing up {modlist_ini}...")
        backup_path = FileSystemHandler.backup_file(modlist_ini)
        if not backup_path:
            return False

        try:
            # Add backupPath entry (read, find gamePath, duplicate/rename, write)
            content = modlist_ini.read_text().splitlines()
            new_content = []
            gamepath_line = None
            backupath_exists = False

            for line in content:
                new_content.append(line)
                if line.strip().startswith("gamePath="):
                    gamepath_line = line
                if line.strip().startswith("backupPath="):
                    backupath_exists = True

            if gamepath_line and not backupath_exists:
                backupath_line = gamepath_line.replace("gamePath=", "backupPath=", 1)
                # Find the index of gamepath_line to insert backupath after it
                try:
                    gamepath_index = new_content.index(gamepath_line)
                    new_content.insert(gamepath_index + 1, backupath_line)
                    logger.debug("Added backupPath entry to ModOrganizer.ini")
                except ValueError:
                    logger.warning("Could not find gamePath line index to insert backupPath.")
                    new_content.append(backupath_line)  # Append at end as fallback

                modlist_ini.write_text("\n".join(new_content) + "\n")
            elif backupath_exists:
                logger.debug("backupPath already exists in ModOrganizer.ini")
            else:
                logger.warning("gamePath not found, cannot add backupPath entry.")

            return True
        except Exception as e:
            logger.error(f"Failed to add backupPath entry to {modlist_ini}: {e}")
            return False  # Backup succeeded, but adding entry failed

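The gamePath-to-backupPath duplication performed by `backup_modorganizer` above reduces to a small line transform; this hedged, standalone sketch uses an invented helper name and a fabricated minimal INI to show the rewrite.

```python
import tempfile
from pathlib import Path

def add_backup_path(ini: Path) -> None:
    """Duplicate the gamePath= line as backupPath=, keeping everything else intact."""
    out = []
    for line in ini.read_text().splitlines():
        out.append(line)
        if line.strip().startswith("gamePath="):
            out.append(line.replace("gamePath=", "backupPath=", 1))
    ini.write_text("\n".join(out) + "\n")

with tempfile.TemporaryDirectory() as tmp:
    ini = Path(tmp) / "ModOrganizer.ini"
    ini.write_text("[General]\ngamePath=C:/Games/Skyrim\n")
    add_backup_path(ini)
    result = ini.read_text()

print("backupPath=C:/Games/Skyrim" in result)  # True
```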
    @staticmethod
    def blank_downloads_dir(modlist_ini: Path) -> bool:
        """Blanks the download_directory line in ModOrganizer.ini."""
        logger.info(f"Blanking download_directory in {modlist_ini}...")
        try:
            content = modlist_ini.read_text().splitlines()
            new_content = []
            found = False
            for line in content:
                if line.strip().startswith("download_directory="):
                    new_content.append("download_directory=")
                    found = True
                else:
                    new_content.append(line)

            if found:
                modlist_ini.write_text("\n".join(new_content) + "\n")
                logger.debug("download_directory line blanked.")
            else:
                logger.warning("download_directory line not found.")
                # Consider if we should add it blank?

            return True
        except Exception as e:
            logger.error(f"Failed to blank download_directory in {modlist_ini}: {e}")
            return False

    @staticmethod
    def copy_file(src: Path, dst: Path, overwrite: bool = False) -> bool:
        """Copy a single file."""
        try:
            if not src.is_file():
                logger.error(f"Copy failed: Source is not a file - {src}")
                return False
            if dst.exists() and not overwrite:
                logger.warning(f"Copy skipped: Destination exists and overwrite=False - {dst}")
                return False  # Or True, depending on desired behavior for skip

            FileSystemHandler.ensure_directory(dst.parent)
            shutil.copy2(src, dst)
            logger.debug(f"Copied file {src} to {dst}")
            return True
        except Exception as e:
            logger.error(f"Failed to copy file {src} to {dst}: {e}")
            return False

    @staticmethod
    def move_file(src: Path, dst: Path, overwrite: bool = False) -> bool:
        """Move a single file."""
        try:
            if not src.is_file():
                logger.error(f"Move failed: Source is not a file - {src}")
                return False
            if dst.exists() and not overwrite:
                logger.warning(f"Move skipped: Destination exists and overwrite=False - {dst}")
                return False

            FileSystemHandler.ensure_directory(dst.parent)
            shutil.move(str(src), str(dst))  # shutil.move needs strings
            logger.debug(f"Moved file {src} to {dst}")
            return True
        except Exception as e:
            logger.error(f"Failed to move file {src} to {dst}: {e}")
            return False

    def backup_modorganizer(self, modlist_ini: Path) -> bool:
        """Back up ModOrganizer.ini with a timestamped copy and add a backupPath entry."""
        try:
            # Create backup with timestamp
            timestamp = os.path.getmtime(modlist_ini)
            backup_path = modlist_ini.with_suffix(f'.{timestamp:.0f}.bak')

            # Copy file to backup
            shutil.copy2(modlist_ini, backup_path)

            # Copy game path to backup path
            with open(modlist_ini, 'r') as f:
                lines = f.readlines()

            game_path_line = None
            for line in lines:
                if line.startswith('gamePath'):
                    game_path_line = line
                    break

            if game_path_line:
                # Create backup path entry
                backup_path_line = game_path_line.replace('gamePath', 'backupPath')

                # Append to file if not already present
                with open(modlist_ini, 'a') as f:
                    f.write(backup_path_line)

                self.logger.debug("Backed up ModOrganizer.ini and created backupPath entry")
                return True
            else:
                self.logger.error("No gamePath found in ModOrganizer.ini")
                return False

        except Exception as e:
            self.logger.error(f"Error backing up ModOrganizer.ini: {e}")
            return False

    def blank_downloads_dir(self, modlist_ini: Path) -> bool:
        """
        Blank or reset the MO2 Downloads Directory

        Returns True on success, False on failure
        """
        try:
            self.logger.info("Editing download_directory...")

            # Read the file
            with open(modlist_ini, 'r') as f:
                content = f.read()

            # Replace the download_directory line
            modified_content = re.sub(r'download_directory[^\n]*', 'download_directory =', content)

            # Write back to the file
            with open(modlist_ini, 'w') as f:
                f.write(modified_content)

            self.logger.debug("Download directory cleared successfully")
            return True

        except Exception as e:
            self.logger.error(f"Error blanking downloads directory: {e}")
            return False

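The `re.sub` rewrite in `blank_downloads_dir` above replaces the whole `download_directory` line while leaving other keys untouched; this sketch exercises that pattern on a fabricated INI snippet (the file content is illustrative only).

```python
import re

# Fabricated minimal ModOrganizer.ini-style content for demonstration.
content = "cache_directory=dl\ndownload_directory=/home/user/Downloads\nother=1\n"

# Same regex shape as the handler: match the key plus the rest of its line.
blanked = re.sub(r'download_directory[^\n]*', 'download_directory =', content)

print(blanked.splitlines()[1])  # download_directory =
```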
    def copy_file(self, src: Path, dst: Path, overwrite: bool = False) -> bool:
        """
        Copy a file from source to destination.

        Args:
            src: Source file path
            dst: Destination file path
            overwrite: Whether to overwrite existing file

        Returns:
            bool: True if file was copied successfully, False otherwise
        """
        try:
            if not overwrite and os.path.exists(dst):
                self.logger.info(f"Destination file already exists: {dst}")
                return False

            shutil.copy2(src, dst)
            return True
        except Exception as e:
            self.logger.error(f"Error copying file: {e}")
            return False

    def move_file(self, src: Path, dst: Path, overwrite: bool = False) -> bool:
        """
        Move a file from source to destination.

        Args:
            src: Source file path
            dst: Destination file path
            overwrite: Whether to overwrite existing file

        Returns:
            bool: True if file was moved successfully, False otherwise
        """
        try:
            if not overwrite and os.path.exists(dst):
                self.logger.info(f"Destination file already exists: {dst}")
                return False

            shutil.move(src, dst)
            return True
        except Exception as e:
            self.logger.error(f"Error moving file: {e}")
            return False

    def delete_file(self, path: Path) -> bool:
        """
        Delete a file.

        Args:
            path: Path to the file to delete

        Returns:
            bool: True if file was deleted successfully, False otherwise
        """
        try:
            if os.path.exists(path):
                os.remove(path)
                return True
            return False
        except Exception as e:
            self.logger.error(f"Error deleting file: {e}")
            return False

    def delete_directory(self, path: Path, recursive: bool = True) -> bool:
        """
        Delete a directory.

        Args:
            path: Path to the directory to delete
            recursive: Whether to delete directory recursively

        Returns:
            bool: True if directory was deleted successfully, False otherwise
        """
        try:
            if os.path.exists(path):
                if recursive:
                    shutil.rmtree(path)
                else:
                    os.rmdir(path)
                return True
            return False
        except Exception as e:
            self.logger.error(f"Error deleting directory: {e}")
            return False

    def create_required_dirs(self, game_name: str, appid: str) -> bool:
        """
        Create required directories for a game modlist

        Args:
            game_name: Name of the game (e.g., skyrimse, fallout4)
            appid: Steam AppID of the modlist

        Returns:
            bool: True if directories were created successfully, False otherwise
        """
        try:
            # Define base paths
            home_dir = os.path.expanduser("~")
            game_dirs = {
                # Common directories needed across all games
                "common": [
                    os.path.join(home_dir, ".local", "share", "Steam", "steamapps", "compatdata", appid, "pfx"),
                    os.path.join(home_dir, ".steam", "steam", "steamapps", "compatdata", appid, "pfx")
                ],
                # Game-specific directories
                "skyrimse": [
                    os.path.join(home_dir, "Documents", "My Games", "Skyrim Special Edition"),
                ],
                "fallout4": [
                    os.path.join(home_dir, "Documents", "My Games", "Fallout4"),
                ],
                "falloutnv": [
                    os.path.join(home_dir, "Documents", "My Games", "FalloutNV"),
                ],
                "oblivion": [
                    os.path.join(home_dir, "Documents", "My Games", "Oblivion"),
                ]
            }

            # Create common directories
            for dir_path in game_dirs["common"]:
                if dir_path and os.path.exists(os.path.dirname(dir_path)):
                    os.makedirs(dir_path, exist_ok=True)
                    self.logger.debug(f"Created directory: {dir_path}")

            # Create game-specific directories
            if game_name in game_dirs:
                for dir_path in game_dirs[game_name]:
                    os.makedirs(dir_path, exist_ok=True)
                    self.logger.debug(f"Created game-specific directory: {dir_path}")

            return True
        except Exception as e:
            self.logger.error(f"Error creating required directories: {e}")
            return False

    @staticmethod
    def all_owned_by_user(path: Path) -> bool:
        """
        Returns True if all files and directories under 'path' are owned by the current user.
        """
        uid = os.getuid()
        gid = os.getgid()
        for root, dirs, files in os.walk(path):
            for name in dirs + files:
                full_path = os.path.join(root, name)
                try:
                    stat = os.stat(full_path)
                    if stat.st_uid != uid or stat.st_gid != gid:
                        return False
                except Exception:
                    return False
        return True

    @staticmethod
    def set_ownership_and_permissions_sudo(path: Path, status_callback=None) -> bool:
        """Change ownership and permissions using sudo (robust, with timeout and re-prompt)."""
        if not path.exists():
            logger.error(f"Path does not exist: {path}")
            return False
        # Check if all files/dirs are already owned by the user
        if FileSystemHandler.all_owned_by_user(path):
            logger.info(f"All files in {path} are already owned by the current user. Skipping sudo chown/chmod.")
            return True
        try:
            user_name = pwd.getpwuid(os.geteuid()).pw_name
            group_name = grp.getgrgid(os.getegid()).gr_name
        except KeyError:
            logger.error("Could not determine current user or group name.")
            return False

        log_msg = f"Applying ownership/permissions for {path} (user: {user_name}, group: {group_name}) via sudo."
        logger.info(log_msg)
        if status_callback:
            status_callback(f"Setting ownership/permissions for {os.path.basename(str(path))}...")
        else:
            print(f'\n{COLOR_PROMPT}Adjusting permissions for {path} (may require sudo password)...{COLOR_RESET}')

        def run_sudo_with_retries(cmd, desc, max_retries=3, timeout=300):
            for attempt in range(max_retries):
                try:
                    logger.info(f"Running sudo command (attempt {attempt+1}/{max_retries}): {' '.join(cmd)}")
                    result = subprocess.run(cmd, capture_output=True, text=True, check=False, timeout=timeout)
                    if result.returncode == 0:
                        return True
                    logger.error(f"sudo {desc} failed. Error: {result.stderr.strip()}")
                    print(f"Error: Failed to {desc}. Check logs.")
                    return False
                except subprocess.TimeoutExpired:
                    logger.error(f"sudo {desc} timed out (attempt {attempt+1}/{max_retries}).")
                    print(f"\nSudo prompt timed out after {timeout} seconds. Please try again.")
                    # Fall through and retry on the next loop iteration
            print(f"Failed to {desc} after {max_retries} attempts. Aborting.")
            return False

        # Run chown with retries
        chown_command = ['sudo', 'chown', '-R', f'{user_name}:{group_name}', str(path)]
        if not run_sudo_with_retries(chown_command, "change ownership"):
            return False
        print()
        # Run chmod with retries
        chmod_command = ['sudo', 'chmod', '-R', '755', str(path)]
        if not run_sudo_with_retries(chmod_command, "set permissions"):
            return False
        print()
        logger.info("Permissions set successfully.")
        return True

    def download_file(self, url: str, destination_path: Path, overwrite: bool = False, quiet: bool = False) -> bool:
        """Downloads a file from a URL to a destination path."""
        self.logger.info(f"Downloading {url} to {destination_path}...")

        if not overwrite and destination_path.exists():
            self.logger.info(f"File already exists, skipping download: {destination_path}")
            # Only print if not quiet
            if not quiet:
                print(f"File {destination_path.name} already exists, skipping download.")
            return True  # Consider existing file as success

        try:
            # Ensure destination directory exists
            destination_path.parent.mkdir(parents=True, exist_ok=True)

            # Perform the download with streaming
            with requests.get(url, stream=True, timeout=300, verify=True) as r:
                r.raise_for_status()  # Raise HTTPError for bad responses (4xx or 5xx)
                with open(destination_path, 'wb') as f:
                    for chunk in r.iter_content(chunk_size=8192):
                        f.write(chunk)

            self.logger.info("Download complete.")
            # Only print if not quiet
            if not quiet:
                print("Download complete.")
            return True

        except requests.exceptions.RequestException as e:
            self.logger.error(f"Download failed: {e}")
            print(f"Error: Download failed for {url}. Check network connection and URL.")
            # Clean up potentially incomplete file
            if destination_path.exists():
                try: destination_path.unlink()
                except OSError: pass
            return False
        except Exception as e:
            self.logger.error(f"Error during download or file writing: {e}", exc_info=True)
            print("Error: An unexpected error occurred during download.")
            # Clean up potentially incomplete file
            if destination_path.exists():
                try: destination_path.unlink()
                except OSError: pass
            return False

    @staticmethod
    def find_steam_library() -> Optional[Path]:
        """
        Find the Steam library containing game installations, prioritizing vdf.

        Returns:
            Optional[Path]: Path object to the Steam library's steamapps/common dir, or None if not found
        """
        logger.info("Detecting Steam library location...")

        # Try finding libraryfolders.vdf in common Steam paths
        possible_vdf_paths = [
            Path.home() / ".steam/steam/config/libraryfolders.vdf",
            Path.home() / ".local/share/Steam/config/libraryfolders.vdf",
            Path.home() / ".steam/root/config/libraryfolders.vdf"
        ]

        libraryfolders_vdf_path: Optional[Path] = None
        for path_obj in possible_vdf_paths:
            # Explicitly ensure path_obj is Path before checking is_file
            current_path = Path(path_obj)
            if current_path.is_file():
                libraryfolders_vdf_path = current_path  # Assign the confirmed Path object
                logger.debug(f"Found libraryfolders.vdf at: {libraryfolders_vdf_path}")
                break

        # Check AFTER loop - libraryfolders_vdf_path is now definitely Path or None
        if not libraryfolders_vdf_path:
            logger.warning("libraryfolders.vdf not found...")
            # Proceed to default check below if vdf not found
        else:
            # Parse the VDF file to extract library paths
            try:
                # Try importing vdf here if not done globally
                with open(libraryfolders_vdf_path, 'r') as f:
                    data = vdf.load(f)

                # Look for library folders (indices are strings '0', '1', etc.)
                libraries = data.get('libraryfolders', {})

                for key in libraries:
                    if isinstance(libraries[key], dict) and 'path' in libraries[key]:
                        lib_path_str = libraries[key]['path']
                        if lib_path_str:
                            # Check if this library path is valid
                            potential_lib_path = Path(lib_path_str) / "steamapps/common"
                            if potential_lib_path.is_dir():
                                logger.info(f"Using Steam library path from vdf: {potential_lib_path}")
                                return potential_lib_path  # Return first valid Path object found

                logger.warning("No valid library paths found within libraryfolders.vdf.")
                # Proceed to default check below if vdf parsing fails to find a valid path

            except ImportError:
                logger.error("Python 'vdf' library not found. Cannot parse libraryfolders.vdf.")
                # Proceed to default check below
            except Exception as e:
                logger.error(f"Error parsing libraryfolders.vdf: {e}")
                # Proceed to default check below

        # Fallback: Check default location if VDF parsing didn't yield a result
        default_path = Path.home() / ".steam/steam/steamapps/common"
        if default_path.is_dir():
            logger.warning(f"Using default Steam library path: {default_path}")
            return default_path

        logger.error("No valid Steam library found via vdf or at default location.")
        return None

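The `find_steam_library` lookup above depends on the `path` entries inside `libraryfolders.vdf`. To stay dependency-free, this hedged sketch extracts those entries with a regex instead of the third-party `vdf` package the handler imports; the file content below is a fabricated minimal example of the KeyValues layout.

```python
import re

def library_paths(vdf_text: str) -> list:
    """Pull the quoted value of every "path" key out of KeyValues-style text."""
    return re.findall(r'"path"\s+"([^"]+)"', vdf_text)

sample = '''
"libraryfolders"
{
    "0"
    {
        "path"        "/home/user/.local/share/Steam"
    }
    "1"
    {
        "path"        "/mnt/games/SteamLibrary"
    }
}
'''

print(library_paths(sample))
```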
    @staticmethod
    def find_compat_data(appid: str) -> Optional[Path]:
        """Find the compatdata directory for a given AppID."""
        if not appid or not appid.isdigit():
            logger.error(f"Invalid AppID provided for compatdata search: {appid}")
            return None

        logger.debug(f"Searching for compatdata directory for AppID: {appid}")

        # Standard Steam locations
        possible_bases = [
            Path.home() / ".steam/steam/steamapps/compatdata",
            Path.home() / ".local/share/Steam/steamapps/compatdata",
        ]

        # Try to get library path from vdf to check there too
        # Use type hint for clarity
        steam_lib_common_path: Optional[Path] = FileSystemHandler.find_steam_library()
        if steam_lib_common_path:
            # find_steam_library returns steamapps/common, go up two levels for library root
            library_root = steam_lib_common_path.parent.parent
            vdf_compat_path = library_root / "steamapps/compatdata"
            if vdf_compat_path.is_dir() and vdf_compat_path not in possible_bases:
                possible_bases.insert(0, vdf_compat_path)  # Prioritize library path from vdf

        for base_path in possible_bases:
            if not base_path.is_dir():
                logger.debug(f"Compatdata base path does not exist or is not a directory: {base_path}")
                continue

            potential_path = base_path / appid
            if potential_path.is_dir():
                logger.info(f"Found compatdata directory: {potential_path}")
                return potential_path  # Return Path object
            else:
                logger.debug(f"Compatdata for {appid} not found in {base_path}")

        logger.warning(f"Compatdata directory for AppID {appid} not found in standard or detected library locations.")
        return None

    @staticmethod
    def find_steam_config_vdf() -> Optional[Path]:
        """Finds the active Steam config.vdf file."""
        logger.debug("Searching for Steam config.vdf...")
        possible_steam_paths = [
            Path.home() / ".steam/steam",
            Path.home() / ".local/share/Steam",
            Path.home() / ".steam/root"
        ]
        for steam_path in possible_steam_paths:
            potential_path = steam_path / "config/config.vdf"
            if potential_path.is_file():
                logger.info(f"Found config.vdf at: {potential_path}")
                return potential_path  # Return Path object

        logger.warning("Could not locate Steam's config.vdf file in standard locations.")
        return None

    # ... (rest of the class) ...
260
jackify/backend/handlers/game_detector.py
Normal file
@@ -0,0 +1,260 @@
"""
GameDetector module for detecting and managing game-related information.
This module handles game type detection, version detection, and game-specific requirements.
"""

import os
import logging
from pathlib import Path
from typing import Optional, Dict, List, Tuple

class GameDetector:
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        self.supported_games = {
            'skyrim': ['Skyrim Special Edition', 'Skyrim'],
            'fallout4': ['Fallout 4'],
            'falloutnv': ['Fallout New Vegas'],
            'oblivion': ['Oblivion'],
            'starfield': ['Starfield'],
            'oblivion_remastered': ['Oblivion Remastered']
        }

    def detect_game_type(self, modlist_name: str) -> Optional[str]:
        """Detect the game type from a modlist name."""
        modlist_lower = modlist_name.lower()

        # Check for game-specific keywords in modlist name
        # Check for Oblivion Remastered first since "oblivion" is a substring
        if any(keyword in modlist_lower for keyword in ['oblivion remastered', 'oblivionremastered', 'oblivion_remastered']):
            return 'oblivion_remastered'
        elif any(keyword in modlist_lower for keyword in ['skyrim', 'sse', 'skse', 'dragonborn', 'dawnguard']):
            return 'skyrim'
        elif any(keyword in modlist_lower for keyword in ['fallout 4', 'fo4', 'f4se', 'commonwealth']):
            return 'fallout4'
        elif any(keyword in modlist_lower for keyword in ['fallout new vegas', 'fonv', 'fnv', 'new vegas', 'nvse']):
            return 'falloutnv'
        elif any(keyword in modlist_lower for keyword in ['oblivion', 'obse', 'shivering isles']):
            return 'oblivion'
        elif any(keyword in modlist_lower for keyword in ['starfield', 'sf', 'starfieldse']):
            return 'starfield'

        self.logger.debug(f"Could not detect game type from modlist name: {modlist_name}")
        return None

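The ordering concern noted in `detect_game_type` above ('oblivion remastered' must be tested before plain 'oblivion', because the latter is a substring of the former) can be reproduced in a few lines; the function and list names here are illustrative.

```python
def detect(name: str):
    """Keyword-based detection with remastered checked before plain oblivion."""
    n = name.lower()
    if "oblivion remastered" in n:
        return "oblivion_remastered"
    if "skyrim" in n:
        return "skyrim"
    if "oblivion" in n:
        return "oblivion"
    return None

print(detect("My Oblivion Remastered List"))  # oblivion_remastered
print(detect("Classic Oblivion Overhaul"))    # oblivion
```

Reversing the first and third checks would misclassify every remastered list as plain Oblivion, which is exactly why the handler orders its branches this way.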
    def detect_game_version(self, game_type: str, modlist_path: Path) -> Optional[str]:
        """Detect the game version from the modlist path."""
        try:
            # Look for ModOrganizer.ini to get game info
            mo_ini = modlist_path / "ModOrganizer.ini"
            if mo_ini.exists():
                with open(mo_ini, 'r', encoding='utf-8') as f:
                    content = f.read()

                # Extract game version info from MO2 config
                if 'gameName=' in content:
                    for line in content.splitlines():
                        if line.startswith('gameName='):
                            game_name = line.split('=', 1)[1].strip()
                            return game_name

            self.logger.debug(f"Could not detect game version for {game_type} at {modlist_path}")
            return None

        except Exception as e:
            self.logger.error(f"Error detecting game version: {e}")
            return None

    def detect_game_path(self, game_type: str, modlist_path: Path) -> Optional[Path]:
        """Detect the game installation path."""
        try:
            # Look for ModOrganizer.ini to get game path
            mo_ini = modlist_path / "ModOrganizer.ini"
            if mo_ini.exists():
                with open(mo_ini, 'r', encoding='utf-8') as f:
                    content = f.read()

                # Extract game path from MO2 config
                for line in content.splitlines():
                    if line.startswith('gamePath='):
                        game_path = line.split('=', 1)[1].strip()
                        return Path(game_path) if game_path else None

            self.logger.debug(f"Could not detect game path for {game_type} at {modlist_path}")
            return None

        except Exception as e:
            self.logger.error(f"Error detecting game path: {e}")
            return None

    def get_game_requirements(self, game_type: str) -> Dict:
        """Get the requirements for a specific game type."""
        requirements = {
            'skyrim': {
                'launcher': 'SKSE',
                'min_proton_version': '6.0',
                'required_dlc': ['Dawnguard', 'Hearthfire', 'Dragonborn'],
                'compatibility_tools': ['protontricks', 'winetricks']
            },
            'fallout4': {
                'launcher': 'F4SE',
                'min_proton_version': '6.0',
                'required_dlc': [],
                'compatibility_tools': ['protontricks', 'winetricks']
            },
            'falloutnv': {
                'launcher': 'NVSE',
                'min_proton_version': '5.0',
                'required_dlc': [],
                'compatibility_tools': ['protontricks', 'winetricks']
            },
            'oblivion': {
                'launcher': 'OBSE',
                'min_proton_version': '5.0',
                'required_dlc': [],
                'compatibility_tools': ['protontricks', 'winetricks']
            },
            'starfield': {
                'launcher': 'SFSE',
                'min_proton_version': '8.0',
                'required_dlc': [],
                'compatibility_tools': ['protontricks', 'winetricks']
            },
            'oblivion_remastered': {
                'launcher': 'OBSE',
                'min_proton_version': '8.0',
                'required_dlc': [],
                'compatibility_tools': ['protontricks', 'winetricks']
            }
        }

        return requirements.get(game_type, {})

    def detect_mods(self, modlist_path: Path) -> List[Dict]:
        """Detect installed mods in a modlist."""
        mods = []
        try:
            # Look for mods directory in MO2 structure
            mods_dir = modlist_path / "mods"
            if mods_dir.exists() and mods_dir.is_dir():
                for mod_dir in mods_dir.iterdir():
                    if mod_dir.is_dir():
                        mod_info = {
                            'name': mod_dir.name,
                            'path': str(mod_dir),
                            'enabled': True  # Assume enabled by default
                        }

                        # Check for meta.ini for more details
                        meta_ini = mod_dir / "meta.ini"
                        if meta_ini.exists():
                            try:
                                with open(meta_ini, 'r', encoding='utf-8') as f:
                                    meta_content = f.read()
                                # Parse basic mod info from meta.ini
                                for line in meta_content.splitlines():
                                    if line.startswith('modid='):
                                        mod_info['nexus_id'] = line.split('=', 1)[1].strip()
                                    elif line.startswith('version='):
                                        mod_info['version'] = line.split('=', 1)[1].strip()
                            except Exception:
                                pass  # Continue without meta info

                        mods.append(mod_info)

        except Exception as e:
            self.logger.error(f"Error detecting mods: {e}")

        return mods

    def detect_launcher(self, game_type: str, modlist_path: Path) -> Optional[str]:
        """Detect the game launcher type (SKSE, F4SE, etc)."""
        launcher_map = {
            'skyrim': 'SKSE',
            'fallout4': 'F4SE',
            'falloutnv': 'NVSE',
            'oblivion': 'OBSE',
            'starfield': 'SFSE',
            'oblivion_remastered': 'OBSE'
        }

        expected_launcher = launcher_map.get(game_type)
        if not expected_launcher:
            return None

        # Check if launcher executable exists
        launcher_exe = f"{expected_launcher.lower()}_loader.exe"
        if (modlist_path / launcher_exe).exists():
            return expected_launcher

        return expected_launcher  # Return expected even if not found

    def get_launcher_path(self, launcher_type: str, modlist_path: Path) -> Optional[Path]:
        """Get the path to the game launcher."""
        launcher_exe = f"{launcher_type.lower()}_loader.exe"
        launcher_path = modlist_path / launcher_exe

        if launcher_path.exists():
            return launcher_path

        return None

    def detect_compatibility_requirements(self, game_type: str) -> List[str]:
        """Detect compatibility requirements for a game type."""
        requirements = {
            'skyrim': ['vcrun2019', 'dotnet48', 'dxvk'],
            'fallout4': ['vcrun2019', 'dotnet48', 'dxvk'],
            'falloutnv': ['vcrun2019', 'dotnet48'],
            'oblivion': ['vcrun2019', 'dotnet48'],
            'starfield': ['vcrun2022', 'dotnet6', 'dotnet7', 'dxvk'],
            'oblivion_remastered': ['vcrun2022', 'dotnet6', 'dotnet7', 'dxvk']
        }

        return requirements.get(game_type, [])

    def validate_game_installation(self, game_type: str, game_path: Path) -> bool:
        """Validate a game installation."""
        if not game_path or not game_path.exists():
            return False

        # Check for game-specific executables
        game_executables = {
            'skyrim': ['SkyrimSE.exe', 'Skyrim.exe'],
            'fallout4': ['Fallout4.exe'],
            'falloutnv': ['FalloutNV.exe'],
            'oblivion': ['Oblivion.exe']
        }

        executables = game_executables.get(game_type, [])
        for exe in executables:
            if (game_path / exe).exists():
                return True

        return False

    def get_game_specific_config(self, game_type: str) -> Dict:
        """Get game-specific configuration requirements."""
        configs = {
            'skyrim': {
                'ini_files': ['Skyrim.ini', 'SkyrimPrefs.ini', 'SkyrimCustom.ini'],
                'config_dirs': ['Data', 'Saves'],
                'registry_keys': ['HKEY_LOCAL_MACHINE\\SOFTWARE\\Bethesda Softworks\\Skyrim Special Edition']
            },
            'fallout4': {
                'ini_files': ['Fallout4.ini', 'Fallout4Prefs.ini', 'Fallout4Custom.ini'],
                'config_dirs': ['Data', 'Saves'],
                'registry_keys': ['HKEY_LOCAL_MACHINE\\SOFTWARE\\Bethesda Softworks\\Fallout 4']
            },
            'falloutnv': {
                'ini_files': ['Fallout.ini', 'FalloutPrefs.ini'],
                'config_dirs': ['Data', 'Saves'],
                'registry_keys': ['HKEY_LOCAL_MACHINE\\SOFTWARE\\Bethesda Softworks\\FalloutNV']
            },
            'oblivion': {
                'ini_files': ['Oblivion.ini'],
                'config_dirs': ['Data', 'Saves'],
                'registry_keys': ['HKEY_LOCAL_MACHINE\\SOFTWARE\\Bethesda Softworks\\Oblivion']
            }
        }

        return configs.get(game_type, {})
994
jackify/backend/handlers/hoolamike_handler.py
Normal file
@@ -0,0 +1,994 @@
import logging
import os
import subprocess
import zipfile
import tarfile
from pathlib import Path
import yaml  # Assuming PyYAML is installed
from typing import Dict, Optional, List
import requests

# Import necessary handlers from the current Jackify structure
from .path_handler import PathHandler
from .vdf_handler import VDFHandler  # Keeping just in case
from .filesystem_handler import FileSystemHandler
from .config_handler import ConfigHandler
# Import color constants needed for print statements in this module
from .ui_colors import COLOR_ERROR, COLOR_SUCCESS, COLOR_WARNING, COLOR_RESET, COLOR_INFO, COLOR_PROMPT, COLOR_SELECTION
# Standard logging (no file handler) - LoggingHandler import removed
from .status_utils import show_status, clear_status
from .subprocess_utils import get_clean_subprocess_env

logger = logging.getLogger(__name__)

# Define default Hoolamike AppIDs for relevant games
TARGET_GAME_APPIDS = {
    'Fallout 3': '22370',  # GOTY Edition
    'Fallout New Vegas': '22380',  # Base game
    'Skyrim Special Edition': '489830',
    'Oblivion': '22330',  # GOTY Edition
    'Fallout 4': '377160'
}

# Define the expected name of the native Hoolamike executable
HOOLAMIKE_EXECUTABLE_NAME = "hoolamike"  # Assuming this is the binary name
# Keep consistent with logs directory - use ~/Jackify/ for user-visible managed components
JACKIFY_BASE_DIR = Path.home() / "Jackify"
# Use Jackify base directory for ALL Hoolamike-related files to centralize management
DEFAULT_HOOLAMIKE_APP_INSTALL_DIR = JACKIFY_BASE_DIR / "Hoolamike"
HOOLAMIKE_CONFIG_DIR = DEFAULT_HOOLAMIKE_APP_INSTALL_DIR
HOOLAMIKE_CONFIG_FILENAME = "hoolamike.yaml"
# Default dirs for other components
DEFAULT_HOOLAMIKE_DOWNLOADS_DIR = JACKIFY_BASE_DIR / "Mod_Downloads"
DEFAULT_MODLIST_INSTALL_BASE_DIR = Path.home() / "ModdedGames"

class HoolamikeHandler:
    """Handles discovery, configuration, and execution of Hoolamike tasks.
    Assumes Hoolamike is a native Linux CLI application.
    """

    def __init__(self, steamdeck: bool, verbose: bool, filesystem_handler: FileSystemHandler, config_handler: ConfigHandler, menu_handler=None):
        """Initialize the handler and perform initial discovery."""
        self.steamdeck = steamdeck
        self.verbose = verbose
        self.path_handler = PathHandler()
        self.filesystem_handler = filesystem_handler
        self.config_handler = config_handler
        self.menu_handler = menu_handler
        # Use standard logging (no file handler)
        self.logger = logging.getLogger(__name__)

        # --- Discovered/Managed State ---
        self.game_install_paths: Dict[str, Path] = {}
        # Allow user override for Hoolamike app install path later
        self.hoolamike_app_install_path: Path = DEFAULT_HOOLAMIKE_APP_INSTALL_DIR
        self.hoolamike_executable_path: Optional[Path] = None  # Path to the binary
        self.hoolamike_installed: bool = False
        self.hoolamike_config_path: Path = HOOLAMIKE_CONFIG_DIR / HOOLAMIKE_CONFIG_FILENAME
        self.hoolamike_config: Optional[Dict] = None

        # Load Hoolamike install path from Jackify config if it exists
        saved_path_str = self.config_handler.get('hoolamike_install_path')
        if saved_path_str and Path(saved_path_str).is_dir():  # Basic check if path exists
            self.hoolamike_app_install_path = Path(saved_path_str)
            self.logger.info(f"Loaded Hoolamike install path from Jackify config: {self.hoolamike_app_install_path}")

        self._load_hoolamike_config()
        self._run_discovery()

    def _ensure_hoolamike_dirs_exist(self):
        """Ensure base directories for Hoolamike exist."""
        try:
            HOOLAMIKE_CONFIG_DIR.mkdir(parents=True, exist_ok=True)  # Separate Hoolamike config
            self.hoolamike_app_install_path.mkdir(parents=True, exist_ok=True)  # Install dir (~/Jackify/Hoolamike)
            # Default downloads dir also needs to exist if we reference it
            DEFAULT_HOOLAMIKE_DOWNLOADS_DIR.mkdir(parents=True, exist_ok=True)
        except OSError as e:
            self.logger.error(f"Error creating Hoolamike directories: {e}", exc_info=True)
            # Decide how to handle this - maybe raise an exception?

    def _check_hoolamike_installation(self):
        """Check if Hoolamike executable exists at the expected location.
        Prioritizes path stored in config if available.
        """
        potential_exe_path = self.hoolamike_app_install_path / HOOLAMIKE_EXECUTABLE_NAME
        check_path = None
        if potential_exe_path.is_file() and os.access(potential_exe_path, os.X_OK):
            check_path = potential_exe_path
            self.logger.info(f"Found Hoolamike at current path: {check_path}")
        else:
            self.logger.info(f"Hoolamike executable ({HOOLAMIKE_EXECUTABLE_NAME}) not found or not executable at current path {self.hoolamike_app_install_path}.")

        # Update state based on whether we found a valid path
        if check_path:
            self.hoolamike_installed = True
            self.hoolamike_executable_path = check_path
        else:
            self.hoolamike_installed = False
            self.hoolamike_executable_path = None

    def _generate_default_config(self) -> Dict:
        """Generates the default configuration dictionary."""
        self.logger.info("Generating default Hoolamike config structure.")
        # Detection is now handled separately after loading config
        detected_paths = self.path_handler.find_game_install_paths(TARGET_GAME_APPIDS)

        config = {
            "downloaders": {
                "downloads_directory": str(DEFAULT_HOOLAMIKE_DOWNLOADS_DIR),
                "nexus": {"api_key": "YOUR_API_KEY_HERE"}
            },
            "installation": {
                "wabbajack_file_path": "",  # Placeholder, set per-run
                "installation_path": ""  # Placeholder, set per-run
            },
            "games": {  # Only include detected games with consistent formatting (no spaces)
                self._format_game_name(game_name): {"root_directory": str(path)}
                for game_name, path in detected_paths.items()
            },
            "fixup": {
                "game_resolution": "1920x1080"
            },
            "extras": {
                "tale_of_two_wastelands": {
                    "path_to_ttw_mpi_file": "",  # Placeholder
                    "variables": {
                        "DESTINATION": ""  # Placeholder
                    }
                }
            }
        }
        # Add comment if no games detected
        if not detected_paths:
            # This won't appear in YAML, logic adjusted below
            pass
        return config

    def _format_game_name(self, game_name: str) -> str:
        """Formats game name for Hoolamike configuration (removes spaces).

        Hoolamike expects game names without spaces like: Fallout3, FalloutNewVegas, SkyrimSpecialEdition
        """
        # Handle specific game name formats that Hoolamike expects
        game_name_map = {
            "Fallout 3": "Fallout3",
            "Fallout New Vegas": "FalloutNewVegas",
            "Skyrim Special Edition": "SkyrimSpecialEdition",
            "Fallout 4": "Fallout4",
            "Oblivion": "Oblivion"  # No change needed
        }

        # Use predefined mapping if available
        if game_name in game_name_map:
            return game_name_map[game_name]

        # Otherwise, just remove spaces as fallback
        return game_name.replace(" ", "")

    def _load_hoolamike_config(self):
        """Load hoolamike.yaml if it exists, or generate a default one."""
        self._ensure_hoolamike_dirs_exist()  # Ensure parent dir exists

        if self.hoolamike_config_path.is_file():
            self.logger.info(f"Found existing hoolamike.yaml at {self.hoolamike_config_path}. Loading...")
            try:
                with open(self.hoolamike_config_path, 'r', encoding='utf-8') as f:
                    self.hoolamike_config = yaml.safe_load(f)
                if not isinstance(self.hoolamike_config, dict):
                    self.logger.warning("Failed to parse hoolamike.yaml as a dictionary. Generating default.")
                    self.hoolamike_config = self._generate_default_config()
                    self.save_hoolamike_config()  # Save the newly generated default
                else:
                    self.logger.info("Successfully loaded hoolamike.yaml configuration.")
                    # Game path merging is handled in _run_discovery now
            except yaml.YAMLError as e:
                self.logger.error(f"Error parsing hoolamike.yaml: {e}. The file may be corrupted.")
                # Don't automatically overwrite - let user decide
                self.hoolamike_config = None
                return False
            except Exception as e:
                self.logger.error(f"Error reading hoolamike.yaml: {e}.", exc_info=True)
                # Don't automatically overwrite - let user decide
                self.hoolamike_config = None
                return False
        else:
            self.logger.info(f"hoolamike.yaml not found at {self.hoolamike_config_path}. Generating default configuration.")
            self.hoolamike_config = self._generate_default_config()
            self.save_hoolamike_config()

        return True

    def save_hoolamike_config(self):
        """Saves the current configuration dictionary to hoolamike.yaml."""
        if self.hoolamike_config is None:
            self.logger.error("Cannot save config, internal config dictionary is None.")
            return False

        self._ensure_hoolamike_dirs_exist()  # Ensure parent dir exists
        self.logger.info(f"Saving configuration to {self.hoolamike_config_path}")
        try:
            with open(self.hoolamike_config_path, 'w', encoding='utf-8') as f:
                # Add comments conditionally
                f.write("# Configuration file created or updated by Jackify\n")
                if not self.hoolamike_config.get("games"):
                    f.write("# No games were detected by Jackify. Add game paths manually if needed.\n")
                # Dump the actual YAML
                yaml.dump(self.hoolamike_config, f, default_flow_style=False, sort_keys=False)
            self.logger.info("Configuration saved successfully.")
            return True
        except Exception as e:
            self.logger.error(f"Error saving hoolamike.yaml: {e}", exc_info=True)
            return False

    def _run_discovery(self):
        """Execute all discovery steps."""
        self.logger.info("Starting Hoolamike feature discovery phase...")

        # Detect game paths and update internal state + config
        self._detect_and_update_game_paths()

        self.logger.info("Hoolamike discovery phase complete.")

    def _detect_and_update_game_paths(self):
        """Detect game install paths and update state and config."""
        self.logger.info("Detecting game install paths...")
        # Always run detection
        detected_paths = self.path_handler.find_game_install_paths(TARGET_GAME_APPIDS)
        self.game_install_paths = detected_paths  # Update internal state
        self.logger.info(f"Detected game paths: {detected_paths}")

        # Update the loaded config if it exists
        if self.hoolamike_config is not None:
            self.logger.debug("Updating loaded hoolamike.yaml with detected game paths.")
            if "games" not in self.hoolamike_config or not isinstance(self.hoolamike_config.get("games"), dict):
                self.hoolamike_config["games"] = {}  # Ensure games section exists

            # Define a unified format for game names in config - no spaces
            # Clear existing entries first to avoid duplicates
            self.hoolamike_config["games"] = {}

            # Add detected paths with proper formatting - no spaces
            for game_name, detected_path in detected_paths.items():
                formatted_name = self._format_game_name(game_name)
                self.hoolamike_config["games"][formatted_name] = {"root_directory": str(detected_path)}

            self.logger.info(f"Updated config with {len(detected_paths)} game paths using correct naming format (no spaces)")
        else:
            self.logger.warning("Cannot update game paths in config because config is not loaded.")

    # --- Methods for Hoolamike Tasks (To be implemented later) ---
    # TODO: Update these methods to accept necessary parameters and update/save config

    def install_update_hoolamike(self, context=None) -> bool:
        """Install or update Hoolamike application.

        Returns:
            bool: True if installation/update was successful or process was properly cancelled,
                False if a critical error occurred.
        """
        self.logger.info("Starting Hoolamike Installation/Update...")
        print("\nStarting Hoolamike Installation/Update...")

        # 1. Prompt user to install/reinstall/update
        try:
            # Check if Hoolamike is already installed at the expected path
            self._check_hoolamike_installation()
            if self.hoolamike_installed:
                self.logger.info(f"Hoolamike appears to be installed at: {self.hoolamike_executable_path}")
                print(f"{COLOR_INFO}Hoolamike is already installed at:{COLOR_RESET}")
                print(f"  {self.hoolamike_executable_path}")
                # Use a menu-style prompt for reinstall/update
                print(f"\n{COLOR_PROMPT}Choose an action for Hoolamike:{COLOR_RESET}")
                print("  1. Reinstall/Update Hoolamike")
                print("  2. Keep existing installation (return to menu)")
                while True:
                    choice = input("Select an option [1-2]: ").strip()
                    if choice == '1':
                        self.logger.info("User chose to reinstall/update Hoolamike.")
                        break
                    elif choice == '2' or choice.lower() == 'q':
                        self.logger.info("User chose to keep existing Hoolamike installation.")
                        print("Skipping Hoolamike installation/update.")
                        return True
                    else:
                        print(f"{COLOR_WARNING}Invalid choice. Please enter 1 or 2.{COLOR_RESET}")
            # 2. Get installation directory from user (allow override)
            self.logger.info(f"Default install path: {self.hoolamike_app_install_path}")
            print("\nHoolamike Installation Directory:")
            print(f"Default: {self.hoolamike_app_install_path}")
            install_dir = self.menu_handler.get_directory_path(
                prompt_message="Specify where to install Hoolamike (or press Enter for default)",
                default_path=self.hoolamike_app_install_path,
                create_if_missing=True,
                no_header=True
            )
            if install_dir is None:
                self.logger.warning("User cancelled Hoolamike installation path selection.")
                print("Installation cancelled.")
                return True
            # Check if hoolamike already exists at this specific path
            potential_existing_exe = install_dir / HOOLAMIKE_EXECUTABLE_NAME
            if potential_existing_exe.is_file() and os.access(potential_existing_exe, os.X_OK):
                self.logger.info(f"Hoolamike executable found at the chosen path: {potential_existing_exe}")
                print(f"{COLOR_INFO}Hoolamike appears to already be installed at:{COLOR_RESET}")
                print(f"  {install_dir}")
                # Use menu-style prompt for overwrite
                print(f"{COLOR_PROMPT}Choose an action for the existing installation:{COLOR_RESET}")
                print("  1. Download and overwrite (update)")
                print("  2. Keep existing installation (return to menu)")
                while True:
                    overwrite_choice = input("Select an option [1-2]: ").strip()
                    if overwrite_choice == '1':
                        self.logger.info("User chose to update (overwrite) existing Hoolamike installation.")
                        break
                    elif overwrite_choice == '2' or overwrite_choice.lower() == 'q':
                        self.logger.info("User chose to keep existing Hoolamike installation at chosen path.")
                        print("Update cancelled. Using existing installation for this session.")
                        self.hoolamike_app_install_path = install_dir
                        self.hoolamike_executable_path = potential_existing_exe
                        self.hoolamike_installed = True
                        return True
                    else:
                        print(f"{COLOR_WARNING}Invalid choice. Please enter 1 or 2.{COLOR_RESET}")
            # Proceed with install/update
            self.logger.info(f"Proceeding with installation to directory: {install_dir}")
            self.hoolamike_app_install_path = install_dir
            # Get latest release info from GitHub
            release_url = "https://api.github.com/repos/Niedzwiedzw/hoolamike/releases/latest"
            download_url = None
            asset_name = None
            try:
                self.logger.info(f"Fetching latest release info from {release_url}")
                show_status("Fetching latest Hoolamike release info...")
                response = requests.get(release_url, timeout=15, verify=True)
                response.raise_for_status()
                release_data = response.json()
                self.logger.debug(f"GitHub Release Data: {release_data}")
                linux_tar_asset = None
                linux_zip_asset = None
                for asset in release_data.get('assets', []):
                    name = asset.get('name', '').lower()
                    self.logger.debug(f"Checking asset: {name}")
                    is_linux = 'linux' in name
                    is_x64 = 'x86_64' in name or 'amd64' in name
                    is_incompatible_arch = 'arm' in name or 'aarch64' in name or 'darwin' in name
                    if is_linux and is_x64 and not is_incompatible_arch:
                        if name.endswith(('.tar.gz', '.tgz')):
                            linux_tar_asset = asset
                            self.logger.debug(f"Found potential tar asset: {name}")
                            break
                        elif name.endswith('.zip') and not linux_tar_asset:
                            linux_zip_asset = asset
                            self.logger.debug(f"Found potential zip asset: {name}")
                chosen_asset = linux_tar_asset or linux_zip_asset
                if not chosen_asset:
                    clear_status()
                    self.logger.error("Could not find a suitable Linux x86_64 download asset (tar.gz/zip) in the latest release.")
                    print(f"{COLOR_ERROR}Error: Could not find a linux x86_64 download asset in the latest Hoolamike release.{COLOR_RESET}")
                    return False
                download_url = chosen_asset.get('browser_download_url')
                asset_name = chosen_asset.get('name')
                if not download_url or not asset_name:
                    clear_status()
                    self.logger.error(f"Chosen asset is missing URL or name: {chosen_asset}")
                    print(f"{COLOR_ERROR}Error: Found asset but could not get download details.{COLOR_RESET}")
                    return False
                self.logger.info(f"Found asset '{asset_name}' for download: {download_url}")
                clear_status()
            except requests.exceptions.RequestException as e:
                clear_status()
                self.logger.error(f"Failed to fetch release info from GitHub: {e}")
                print(f"Error: Failed to contact GitHub to check for Hoolamike updates: {e}")
                return False
            except Exception as e:
                clear_status()
                self.logger.error(f"Error parsing release info: {e}", exc_info=True)
                print("Error: Failed to understand release information from GitHub.")
                return False
            # Download the asset
            show_status(f"Downloading {asset_name}...")
            temp_download_path = self.hoolamike_app_install_path / asset_name
            if not self.filesystem_handler.download_file(download_url, temp_download_path, overwrite=True, quiet=True):
                clear_status()
                self.logger.error(f"Failed to download {asset_name} from {download_url}")
                print(f"{COLOR_ERROR}Error: Failed to download Hoolamike asset.{COLOR_RESET}")
                return False
            clear_status()
            self.logger.info(f"Downloaded {asset_name} successfully to {temp_download_path}")
            show_status("Extracting Hoolamike archive...")
            # Extract the asset
            try:
                if asset_name.lower().endswith(('.tar.gz', '.tgz')):
                    self.logger.debug(f"Extracting tar file: {temp_download_path}")
                    with tarfile.open(temp_download_path, 'r:*') as tar:
                        tar.extractall(path=self.hoolamike_app_install_path)
                    self.logger.info("Extracted tar file successfully.")
                elif asset_name.lower().endswith('.zip'):
                    self.logger.debug(f"Extracting zip file: {temp_download_path}")
                    with zipfile.ZipFile(temp_download_path, 'r') as zip_ref:
                        zip_ref.extractall(self.hoolamike_app_install_path)
                    self.logger.info("Extracted zip file successfully.")
                else:
                    clear_status()
                    self.logger.error(f"Unknown archive format for asset: {asset_name}")
                    print(f"{COLOR_ERROR}Error: Unknown file type '{asset_name}'. Cannot extract.{COLOR_RESET}")
                    return False
                clear_status()
                print("Extraction complete. Setting permissions...")
            except (tarfile.TarError, zipfile.BadZipFile, EOFError) as e:
                clear_status()
                self.logger.error(f"Failed to extract archive {temp_download_path}: {e}", exc_info=True)
                print(f"{COLOR_ERROR}Error: Failed to extract downloaded file: {e}{COLOR_RESET}")
                return False
            except Exception as e:
                clear_status()
                self.logger.error(f"An unexpected error occurred during extraction: {e}", exc_info=True)
                print(f"{COLOR_ERROR}An unexpected error occurred during extraction.{COLOR_RESET}")
                return False
            finally:
                # Clean up downloaded archive
                if temp_download_path.exists():
                    try:
                        temp_download_path.unlink()
                        self.logger.debug(f"Removed temporary download file: {temp_download_path}")
                    except OSError as e:
                        self.logger.warning(f"Could not remove temporary download file {temp_download_path}: {e}")
            # Set execute permissions on the binary
            executable_path = self.hoolamike_app_install_path / HOOLAMIKE_EXECUTABLE_NAME
            if executable_path.is_file():
                try:
                    show_status("Setting permissions on Hoolamike executable...")
                    os.chmod(executable_path, 0o755)
                    self.logger.info(f"Set execute permissions (+x) on {executable_path}")
                    clear_status()
                    print("Permissions set successfully.")
                except OSError as e:
                    clear_status()
                    self.logger.error(f"Failed to set execute permission on {executable_path}: {e}")
                    print(f"{COLOR_ERROR}Error: Could not set execute permission on Hoolamike executable.{COLOR_RESET}")
            else:
                clear_status()
                self.logger.error(f"Hoolamike executable not found after extraction at {executable_path}")
                print(f"{COLOR_ERROR}Error: Hoolamike executable missing after extraction!{COLOR_RESET}")
                return False
            # Update self.hoolamike_installed and self.hoolamike_executable_path state
            self.logger.info("Refreshing Hoolamike installation status...")
            self._check_hoolamike_installation()
            if not self.hoolamike_installed:
                self.logger.error("Hoolamike check failed after apparent successful install/extract.")
                print(f"{COLOR_ERROR}Error: Installation completed, but failed final verification check.{COLOR_RESET}")
                return False
            # Save install path to Jackify config
            self.logger.info(f"Saving Hoolamike install path to Jackify config: {self.hoolamike_app_install_path}")
            self.config_handler.set('hoolamike_install_path', str(self.hoolamike_app_install_path))
            if not self.config_handler.save_config():
                self.logger.warning("Failed to save Jackify config file after updating Hoolamike path.")
                # Non-fatal, but warn user?
                print(f"{COLOR_WARNING}Warning: Could not save installation path to main Jackify config file.{COLOR_RESET}")
            print(f"{COLOR_SUCCESS}Hoolamike installation/update successful!{COLOR_RESET}")
            self.logger.info("Hoolamike install/update process completed successfully.")
            return True
        except Exception as e:
            self.logger.error(f"Error during Hoolamike installation/update: {e}", exc_info=True)
            print(f"{COLOR_ERROR}Error: An unexpected error occurred during Hoolamike installation/update: {e}{COLOR_RESET}")
            return False

def install_modlist(self, wabbajack_path=None, install_path=None, downloads_path=None, premium=False, api_key=None, game_resolution=None, context=None):
|
||||
"""
|
||||
Install a Wabbajack modlist using Hoolamike, following Jackify's Discovery/Configuration/Confirmation pattern.
|
||||
"""
|
||||
self.logger.info("Starting Hoolamike modlist install (Discovery Phase)")
|
||||
self._check_hoolamike_installation()
|
||||
menu = self.menu_handler
|
||||
print(f"\n{'='*60}")
|
||||
print(f"{COLOR_INFO}Hoolamike Modlist Installation{COLOR_RESET}")
|
||||
print(f"{'='*60}\n")
|
||||
|
||||
# --- Discovery Phase ---
|
||||
# 1. Auto-detect games (robust, multi-library)
|
||||
detected_games = self.path_handler.find_vanilla_game_paths()
|
||||
# 2. Prompt for .wabbajack file (custom prompt, only accept .wabbajack, q to exit, with tab-completion)
|
||||
print()
|
||||
while not wabbajack_path:
|
||||
print(f"{COLOR_WARNING}This option requires a Nexus Mods Premium account for automatic downloads.{COLOR_RESET}")
|
||||
print(f"If you don't have a premium account, please use the '{COLOR_SELECTION}Non-Premium Installation{COLOR_RESET}' option from the previous menu instead.\n")
|
||||
print(f"Before continuing, you'll need a .wabbajack file. You can usually find these at:")
|
||||
print(f" 1. {COLOR_INFO}https://build.wabbajack.org/authored_files{COLOR_RESET} - Official Wabbajack modlist repository")
|
||||
print(f" 2. {COLOR_INFO}https://www.nexusmods.com/{COLOR_RESET} - Some modlist authors publish on Nexus Mods")
|
||||
print(f" 3. Various Discord communities for specific modlists\n")
|
||||
print(f"{COLOR_WARNING}NOTE: Download the .wabbajack file first, then continue. Enter 'q' to exit.{COLOR_RESET}\n")
|
||||
# Use menu.get_existing_file_path for tab-completion
|
||||
candidate = menu.get_existing_file_path(
|
||||
prompt_message="Enter the path to your .wabbajack file (or 'q' to cancel):",
|
||||
extension_filter=".wabbajack",
|
||||
no_header=True
|
||||
)
|
||||
if candidate is None:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# If user literally typed 'q', treat as cancel
|
||||
if str(candidate).strip().lower() == 'q':
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
wabbajack_path = candidate
|
||||
# 3. Prompt for install directory
|
||||
print()
|
||||
while True:
|
||||
install_path_result = menu.get_directory_path(
|
||||
prompt_message="Select the directory where the modlist should be installed:",
|
||||
default_path=DEFAULT_MODLIST_INSTALL_BASE_DIR / wabbajack_path.stem,
|
||||
create_if_missing=True,
|
||||
no_header=False
|
||||
)
|
||||
if not install_path_result:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# Handle tuple (path, should_create)
|
||||
if isinstance(install_path_result, tuple):
|
||||
install_path, install_should_create = install_path_result
|
||||
else:
|
||||
install_path, install_should_create = install_path_result, False
|
||||
# Check if directory exists and is not empty
|
||||
if install_path.exists() and any(install_path.iterdir()):
|
||||
print(f"{COLOR_WARNING}Warning: The selected directory '{install_path}' already exists and is not empty. Its contents may be overwritten!{COLOR_RESET}")
|
||||
confirm = input(f"{COLOR_PROMPT}This directory is not empty and may be overwritten. Proceed? (y/N): {COLOR_RESET}").strip().lower()
|
||||
if not confirm.startswith('y'):
|
||||
print(f"{COLOR_INFO}Please select a different directory.\n{COLOR_RESET}")
|
||||
continue
|
||||
break
|
||||
# 4. Prompt for downloads directory
|
||||
print()
|
||||
if not downloads_path:
|
||||
downloads_path_result = menu.get_directory_path(
|
||||
prompt_message="Select the directory for mod downloads:",
|
||||
default_path=DEFAULT_HOOLAMIKE_DOWNLOADS_DIR,
|
||||
create_if_missing=True,
|
||||
no_header=False
|
||||
)
|
||||
if not downloads_path_result:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
# Handle tuple (path, should_create)
|
||||
if isinstance(downloads_path_result, tuple):
|
||||
downloads_path, downloads_should_create = downloads_path_result
|
||||
else:
|
||||
downloads_path, downloads_should_create = downloads_path_result, False
|
||||
else:
|
||||
downloads_should_create = False
|
||||
# 5. Nexus API key
|
||||
print()
|
||||
current_api_key = self.hoolamike_config.get('downloaders', {}).get('nexus', {}).get('api_key') if self.hoolamike_config else None
|
||||
if not current_api_key or current_api_key == 'YOUR_API_KEY_HERE':
|
||||
api_key = menu.get_nexus_api_key(current_api_key)
|
||||
if not api_key:
|
||||
print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
|
||||
return False
|
||||
else:
|
||||
api_key = current_api_key
|
||||
|
||||
# --- Summary & Confirmation ---
|
||||
print(f"\n{'-'*60}")
|
||||
print(f"{COLOR_INFO}Summary of configuration:{COLOR_RESET}")
|
||||
print(f"- Wabbajack file: {wabbajack_path}")
|
||||
print(f"- Install directory: {install_path}")
|
||||
print(f"- Downloads directory: {downloads_path}")
|
||||
print(f"- Nexus API key: [{'Set' if api_key else 'Not Set'}]")
|
||||
print("- Games:")
|
||||
for game in ["Fallout 3", "Fallout New Vegas", "Skyrim Special Edition", "Oblivion", "Fallout 4"]:
|
||||
found = detected_games.get(game)
|
||||
print(f" {game}: {found if found else 'Not Found'}")
|
||||
print(f"{'-'*60}")
|
||||
print(f"{COLOR_WARNING}Proceed with these settings and start Hoolamike install? (Warning: This can take MANY HOURS){COLOR_RESET}")
|
||||
confirm = input(f"{COLOR_PROMPT}[Y/n]: {COLOR_RESET}").strip().lower()
|
||||
        if confirm and not confirm.startswith('y'):
            print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
            return False
        # --- Actually create directories if needed ---
        if install_should_create and not install_path.exists():
            try:
                install_path.mkdir(parents=True, exist_ok=True)
                print(f"{COLOR_SUCCESS}Install directory created: {install_path}{COLOR_RESET}")
            except Exception as e:
                print(f"{COLOR_ERROR}Failed to create install directory: {e}{COLOR_RESET}")
                return False
        if downloads_should_create and not downloads_path.exists():
            try:
                downloads_path.mkdir(parents=True, exist_ok=True)
                print(f"{COLOR_SUCCESS}Downloads directory created: {downloads_path}{COLOR_RESET}")
            except Exception as e:
                print(f"{COLOR_ERROR}Failed to create downloads directory: {e}{COLOR_RESET}")
                return False

        # --- Configuration Phase ---
        # Prepare config dict
        config = {
            "downloaders": {
                "downloads_directory": str(downloads_path),
                "nexus": {"api_key": api_key}
            },
            "installation": {
                "wabbajack_file_path": str(wabbajack_path),
                "installation_path": str(install_path)
            },
            "games": {
                self._format_game_name(game): {"root_directory": str(path)}
                for game, path in detected_games.items()
            },
            "fixup": {
                "game_resolution": "1920x1080"
            },
            # "extras": {},
            # No 'jackify_managed' key here
        }
        self.hoolamike_config = config
        if not self.save_hoolamike_config():
            print(f"{COLOR_ERROR}Failed to save hoolamike.yaml. Aborting.{COLOR_RESET}")
            return False

        # --- Run Hoolamike ---
        print(f"\n{COLOR_INFO}Starting Hoolamike...{COLOR_RESET}")
        print(f"{COLOR_INFO}Streaming output below. Press Ctrl+C to cancel and return to Jackify menu.{COLOR_RESET}\n")
        # Defensive: Ensure executable path is set and valid
        if not self.hoolamike_executable_path or not Path(self.hoolamike_executable_path).is_file():
            print(f"{COLOR_ERROR}Error: Hoolamike executable not found or not set. Please (re)install Hoolamike from the menu before continuing.{COLOR_RESET}")
            input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
            return False
        try:
            cmd = [str(self.hoolamike_executable_path), "install"]
            ret = subprocess.call(cmd, cwd=str(self.hoolamike_app_install_path), env=get_clean_subprocess_env())
            if ret == 0:
                print(f"\n{COLOR_SUCCESS}Hoolamike completed successfully!{COLOR_RESET}")
                input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
                return True
            else:
                print(f"\n{COLOR_ERROR}Hoolamike process failed with exit code {ret}.{COLOR_RESET}")
                input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
                return False
        except KeyboardInterrupt:
            print(f"\n{COLOR_WARNING}Hoolamike install interrupted by user. Returning to menu.{COLOR_RESET}")
            input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
            return False
        except Exception as e:
            print(f"\n{COLOR_ERROR}Error running Hoolamike: {e}{COLOR_RESET}")
            input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
            return False

    def install_ttw(self, ttw_mpi_path=None, ttw_output_path=None, context=None):
        """Install Tale of Two Wastelands (TTW) using Hoolamike.

        Args:
            ttw_mpi_path: Path to the TTW installer .mpi file
            ttw_output_path: Target installation directory for TTW
            context: Optional execution context supplied by the caller

        Returns:
            bool: True if successful, False otherwise
        """
        self.logger.info("Starting Tale of Two Wastelands installation via Hoolamike")
        self._check_hoolamike_installation()
        menu = self.menu_handler
        print(f"\n{'='*60}")
        print(f"{COLOR_INFO}Hoolamike: Tale of Two Wastelands Installation{COLOR_RESET}")
        print(f"{'='*60}\n")
        print("This feature will install Tale of Two Wastelands (TTW) using Hoolamike.")
        print("Requirements:")
        print("  • Fallout 3 and Fallout New Vegas must be installed and detected.")
        print("  • You must provide the path to your TTW .mpi installer file.")
        print("  • You must select an output directory for the TTW install.\n")

        # Ensure config is loaded
        if self.hoolamike_config is None:
            loaded = self._load_hoolamike_config()
            if not loaded or self.hoolamike_config is None:
                self.logger.error("Failed to load or generate hoolamike.yaml configuration.")
                print(f"{COLOR_ERROR}Error: Could not load or generate Hoolamike configuration. Aborting TTW install.{COLOR_RESET}")
                return False

        # Verify required games are in configuration
        required_games = ['Fallout 3', 'Fallout New Vegas']
        detected_games = self.path_handler.find_vanilla_game_paths()
        missing_games = [game for game in required_games if game not in detected_games]
        if missing_games:
            self.logger.error(f"Missing required games for TTW installation: {', '.join(missing_games)}")
            print(f"{COLOR_ERROR}Error: The following required games were not found: {', '.join(missing_games)}{COLOR_RESET}")
            print("TTW requires both Fallout 3 and Fallout New Vegas to be installed.")
            return False

        # Prompt for TTW .mpi file
        print(f"{COLOR_INFO}Please provide the path to your TTW .mpi installer file.{COLOR_RESET}")
        print(f"You can download this from: {COLOR_INFO}https://mod.pub/ttw/133/files{COLOR_RESET}")
        print("(Extract the .mpi file from the downloaded archive.)\n")
        while not ttw_mpi_path:
            candidate = menu.get_existing_file_path(
                prompt_message="Enter the path to your TTW .mpi file (or 'q' to cancel):",
                extension_filter=".mpi",
                no_header=True
            )
            if candidate is None:
                print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
                return False
            if str(candidate).strip().lower() == 'q':
                print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
                return False
            ttw_mpi_path = candidate

        # Prompt for output directory
        print(f"\n{COLOR_INFO}Please select the output directory where TTW will be installed.{COLOR_RESET}")
        print("(This should be an empty or new directory.)\n")
        while not ttw_output_path:
            ttw_output_path = menu.get_directory_path(
                prompt_message="Select the TTW output directory:",
                default_path=self.hoolamike_app_install_path / "TTW_Output",
                create_if_missing=True,
                no_header=False
            )
            if not ttw_output_path:
                print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
                return False
            if ttw_output_path.exists() and any(ttw_output_path.iterdir()):
                print(f"{COLOR_WARNING}Warning: The selected directory '{ttw_output_path}' already exists and is not empty. Its contents may be overwritten!{COLOR_RESET}")
                confirm = input(f"{COLOR_PROMPT}This directory is not empty and may be overwritten. Proceed? (y/N): {COLOR_RESET}").strip().lower()
                if not confirm.startswith('y'):
                    print(f"{COLOR_INFO}Please select a different directory.\n{COLOR_RESET}")
                    ttw_output_path = None
                    continue

        # --- Summary & Confirmation ---
        print(f"\n{'-'*60}")
        print(f"{COLOR_INFO}Summary of configuration:{COLOR_RESET}")
        print(f"- TTW .mpi file: {ttw_mpi_path}")
        print(f"- Output directory: {ttw_output_path}")
        print("- Games:")
        for game in required_games:
            found = detected_games.get(game)
            print(f"  {game}: {found if found else 'Not Found'}")
        print(f"{'-'*60}")
        print(f"{COLOR_WARNING}Proceed with these settings and start TTW installation? (This can take MANY HOURS){COLOR_RESET}")
        confirm = input(f"{COLOR_PROMPT}[Y/n]: {COLOR_RESET}").strip().lower()
        if confirm and not confirm.startswith('y'):
            print(f"{COLOR_WARNING}Cancelled by user.{COLOR_RESET}")
            return False

        # --- Always re-detect games before updating config ---
        detected_games = self.path_handler.find_vanilla_game_paths()
        if not detected_games:
            print(f"{COLOR_ERROR}No supported games were detected on your system. TTW requires Fallout 3 and Fallout New Vegas to be installed.{COLOR_RESET}")
            return False
        # Update the games section with correct keys
        if self.hoolamike_config is None:
            self.hoolamike_config = {}
        self.hoolamike_config['games'] = {
            self._format_game_name(game): {"root_directory": str(path)}
            for game, path in detected_games.items()
        }

        # Update TTW configuration
        self._update_hoolamike_config_for_ttw(ttw_mpi_path, ttw_output_path)
        if not self.save_hoolamike_config():
            self.logger.error("Failed to save hoolamike.yaml configuration.")
            print(f"{COLOR_ERROR}Error: Failed to save Hoolamike configuration.{COLOR_RESET}")
            print("Attempting to continue anyway...")

        # Construct command to execute
        cmd = [
            str(self.hoolamike_executable_path),
            "tale-of-two-wastelands"
        ]
        self.logger.info(f"Executing Hoolamike command: {' '.join(cmd)}")
        print(f"\n{COLOR_INFO}Executing Hoolamike for TTW Installation...{COLOR_RESET}")
        print(f"Command: {' '.join(cmd)}")
        print(f"{COLOR_INFO}Streaming output below. Press Ctrl+C to cancel and return to Jackify menu.{COLOR_RESET}\n")
        try:
            ret = subprocess.call(cmd, cwd=str(self.hoolamike_app_install_path), env=get_clean_subprocess_env())
            if ret == 0:
                self.logger.info("TTW installation completed successfully.")
                print(f"\n{COLOR_SUCCESS}TTW installation completed successfully!{COLOR_RESET}")
                input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
                return True
            else:
                self.logger.error(f"TTW installation process returned non-zero exit code: {ret}")
                print(f"\n{COLOR_ERROR}Error: TTW installation failed with exit code {ret}.{COLOR_RESET}")
                input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
                return False
        except Exception as e:
            self.logger.error(f"Error executing Hoolamike TTW installation: {e}", exc_info=True)
            print(f"\n{COLOR_ERROR}Error executing Hoolamike TTW installation: {e}{COLOR_RESET}")
            input(f"{COLOR_PROMPT}Press Enter to return to the Hoolamike menu...{COLOR_RESET}")
            return False

    def _update_hoolamike_config_for_ttw(self, ttw_mpi_path: Path, ttw_output_path: Path):
        """Update the Hoolamike configuration with settings for TTW installation."""
        # Ensure extras and TTW sections exist
        if "extras" not in self.hoolamike_config:
            self.hoolamike_config["extras"] = {}

        if "tale_of_two_wastelands" not in self.hoolamike_config["extras"]:
            self.hoolamike_config["extras"]["tale_of_two_wastelands"] = {
                "variables": {}
            }

        # Update TTW configuration
        ttw_config = self.hoolamike_config["extras"]["tale_of_two_wastelands"]
        ttw_config["path_to_ttw_mpi_file"] = str(ttw_mpi_path)

        # Ensure variables section exists
        if "variables" not in ttw_config:
            ttw_config["variables"] = {}

        # Set destination variable
        ttw_config["variables"]["DESTINATION"] = str(ttw_output_path)

        # Set USERPROFILE to a Jackify-managed directory for TTW
        userprofile_path = str(self.hoolamike_app_install_path / "USERPROFILE")
        ttw_config["variables"]["USERPROFILE"] = userprofile_path

        # Make sure game paths are set correctly
        for game in ['Fallout 3', 'Fallout New Vegas']:
            if game in self.game_install_paths:
                if "games" not in self.hoolamike_config:
                    self.hoolamike_config["games"] = {}

                if game not in self.hoolamike_config["games"]:
                    self.hoolamike_config["games"][game] = {}

                self.hoolamike_config["games"][game]["root_directory"] = str(self.game_install_paths[game])

        self.logger.info("Updated Hoolamike configuration with TTW settings.")

    def reset_config(self):
        """Resets the hoolamike.yaml to default settings, backing up any existing file."""
        if self.hoolamike_config_path.is_file():
            # Create a backup with timestamp
            import datetime
            timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
            backup_path = self.hoolamike_config_path.with_suffix(f".{timestamp}.bak")
            try:
                import shutil
                shutil.copy2(self.hoolamike_config_path, backup_path)
                self.logger.info(f"Created backup of existing config at {backup_path}")
                print(f"{COLOR_INFO}Created backup of existing config at {backup_path}{COLOR_RESET}")
            except Exception as e:
                self.logger.error(f"Failed to create backup of config: {e}")
                print(f"{COLOR_WARNING}Warning: Failed to create backup of config: {e}{COLOR_RESET}")

        # Generate and save a fresh default config
        self.logger.info("Generating new default configuration")
        self.hoolamike_config = self._generate_default_config()
        if self.save_hoolamike_config():
            self.logger.info("Successfully reset config to defaults")
            print(f"{COLOR_SUCCESS}Successfully reset configuration to defaults.{COLOR_RESET}")
            return True
        else:
            self.logger.error("Failed to save new default config")
            print(f"{COLOR_ERROR}Failed to save new default configuration.{COLOR_RESET}")
            return False

    def edit_hoolamike_config(self):
        """Opens the hoolamike.yaml file in a chosen editor, with a 0 option to return to menu."""
        self.logger.info("Task: Edit Hoolamike Config started...")
        self._check_hoolamike_installation()
        if not self.hoolamike_installed:
            self.logger.warning("Cannot edit config - Hoolamike not installed")
            print(f"\n{COLOR_WARNING}Hoolamike is not installed through Jackify yet.{COLOR_RESET}")
            print("Please use option 1 from the Hoolamike menu to install Hoolamike first.")
            print("This will ensure that Jackify can properly manage the Hoolamike configuration.")
            return False
        if self.hoolamike_config is None:
            self.logger.warning("Config is not loaded properly. Will attempt to fix or create.")
            print(f"\n{COLOR_WARNING}Configuration file may be corrupted or not accessible.{COLOR_RESET}")
            print("Options:")
            print("1. Reset to default configuration (backup will be created)")
            print("2. Try to edit the file anyway (may be corrupted)")
            print("0. Cancel and return to menu")
            choice = input("\nEnter your choice (0-2): ").strip()
            if choice == "1":
                if not self.reset_config():
                    self.logger.error("Failed to reset configuration")
                    print(f"{COLOR_ERROR}Failed to reset configuration. See logs for details.{COLOR_RESET}")
                    return
            elif choice == "2":
                self.logger.warning("User chose to edit potentially corrupted config")
                # Continue to editing
            elif choice == "0":
                self.logger.info("User cancelled editing corrupted config")
                print("Edit cancelled.")
                return
            else:
                self.logger.info("User cancelled editing corrupted config")
                print("Edit cancelled.")
                return
        if not self.hoolamike_config_path.exists():
            self.logger.warning(f"Hoolamike config file does not exist at {self.hoolamike_config_path}. Generating default before editing.")
            self.hoolamike_config = self._generate_default_config()
            self.save_hoolamike_config()
            if not self.hoolamike_config_path.exists():
                self.logger.error("Failed to create config file for editing.")
                print("Error: Could not create configuration file.")
                return
        available_editors = ["nano", "vim", "vi", "gedit", "kate", "micro"]
        preferred_editor = os.environ.get("EDITOR")
        found_editors = {}
        import shutil
        for editor_name in available_editors:
            editor_path = shutil.which(editor_name)
            if editor_path and editor_path not in found_editors.values():
                found_editors[editor_name] = editor_path
        if preferred_editor:
            preferred_editor_path = shutil.which(preferred_editor)
            if preferred_editor_path and preferred_editor_path not in found_editors.values():
                display_name = os.path.basename(preferred_editor) if '/' in preferred_editor else preferred_editor
                if display_name not in found_editors:
                    found_editors[display_name] = preferred_editor_path
        if not found_editors:
            self.logger.error("No suitable text editors found on the system.")
            print(f"{COLOR_ERROR}Error: No common text editors (nano, vim, gedit, kate, micro) found.{COLOR_RESET}")
            return
        sorted_editor_names = sorted(found_editors.keys())
        print("\nSelect an editor to open the configuration file:")
        print(f"(System default EDITOR is: {preferred_editor if preferred_editor else 'Not set'})")
        for i, name in enumerate(sorted_editor_names):
            print(f"  {i + 1}. {name}")
        print("  0. Return to Hoolamike Menu")
        while True:
            try:
                choice = input(f"Enter choice (0-{len(sorted_editor_names)}): ").strip()
                if choice == "0":
                    print("Edit cancelled.")
                    return
                choice_index = int(choice) - 1
                if 0 <= choice_index < len(sorted_editor_names):
                    chosen_name = sorted_editor_names[choice_index]
                    editor_to_use_path = found_editors[chosen_name]
                    break
                else:
                    print("Invalid choice.")
            except ValueError:
                print("Invalid input. Please enter a number.")
            except KeyboardInterrupt:
                print("\nEdit cancelled.")
                return
        if editor_to_use_path:
            self.logger.info(f"Launching editor '{editor_to_use_path}' for {self.hoolamike_config_path}")
            try:
                process = subprocess.Popen([editor_to_use_path, str(self.hoolamike_config_path)])
                process.wait()
                self.logger.info(f"Editor '{editor_to_use_path}' closed. Reloading config...")
                if not self._load_hoolamike_config():
                    self.logger.error("Failed to load config after editing. It may still be corrupted.")
                    print(f"{COLOR_ERROR}Warning: The configuration file could not be parsed after editing.{COLOR_RESET}")
                    print("You may need to fix it manually or reset it to defaults.")
                    return False
                else:
                    self.logger.info("Successfully reloaded config after editing.")
                    print(f"{COLOR_SUCCESS}Configuration file successfully updated.{COLOR_RESET}")
                    return True
            except FileNotFoundError:
                self.logger.error(f"Editor '{editor_to_use_path}' not found unexpectedly.")
                print(f"{COLOR_ERROR}Error: Editor command '{editor_to_use_path}' not found.{COLOR_RESET}")
            except Exception as e:
                self.logger.error(f"Error launching or waiting for editor: {e}")
                print(f"{COLOR_ERROR}An error occurred while launching the editor: {e}{COLOR_RESET}")


# Example usage (for testing, remove later)
if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    print("Running HoolamikeHandler discovery...")
    handler = HoolamikeHandler(steamdeck=False, verbose=True)
    print("\n--- Discovery Results ---")
    print(f"Game Paths: {handler.game_install_paths}")
    print(f"Hoolamike App Install Path: {handler.hoolamike_app_install_path}")
    print(f"Hoolamike Executable: {handler.hoolamike_executable_path}")
    print(f"Hoolamike Installed: {handler.hoolamike_installed}")
    print(f"Hoolamike Config Path: {handler.hoolamike_config_path}")
    config_loaded = isinstance(handler.hoolamike_config, dict)
    print(f"Hoolamike Config Loaded: {config_loaded}")
    if config_loaded:
        print(f"  Downloads Dir: {handler.hoolamike_config.get('downloaders', {}).get('downloads_directory')}")
        print(f"  API Key Set: {'Yes' if handler.hoolamike_config.get('downloaders', {}).get('nexus', {}).get('api_key') != 'YOUR_API_KEY_HERE' else 'No'}")
    print("-------------------------")
    # Test edit config (example)
    # handler.edit_hoolamike_config()
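For reference, the configuration block assembled by the handler above maps onto `hoolamike.yaml` roughly as sketched here. All paths, the API key, and the detected-game entry are placeholder values for illustration only; in Jackify these come from user prompts and game detection.

```python
# Hypothetical stand-in values; Jackify fills these in at runtime.
downloads_path = "/home/user/Jackify/Downloads"
install_path = "/home/user/Modlists/MyList"
wabbajack_path = "/home/user/MyList.wabbajack"
api_key = "YOUR_API_KEY_HERE"
detected_games = {
    "SkyrimSpecialEdition": "/home/user/.steam/steam/steamapps/common/Skyrim Special Edition",
}

config = {
    "downloaders": {
        "downloads_directory": downloads_path,
        "nexus": {"api_key": api_key},
    },
    "installation": {
        "wabbajack_file_path": wabbajack_path,
        "installation_path": install_path,
    },
    # One entry per detected game, keyed by the formatted game name
    "games": {name: {"root_directory": path} for name, path in detected_games.items()},
    "fixup": {"game_resolution": "1920x1080"},
}
```

This dict is what `save_hoolamike_config()` serialises to disk; the exact key names follow Hoolamike's expected schema.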
1663
jackify/backend/handlers/install_wabbajack_handler.py
Normal file
File diff suppressed because it is too large
189
jackify/backend/handlers/logging_handler.py
Normal file
@@ -0,0 +1,189 @@
"""
LoggingHandler module for managing logging operations.
This module handles log file creation, rotation, and management.
"""

import os
import logging
import logging.handlers
from pathlib import Path
from typing import Optional, Dict, List
from datetime import datetime
import shutil

class LoggingHandler:
    """
    Central logging handler for Jackify.
    - Uses ~/Jackify/logs/ as the log directory.
    - Supports per-function log files (e.g., jackify-install-wabbajack.log).
    - Handles log rotation and log directory creation.
    Usage:
        logger = LoggingHandler().setup_logger('install_wabbajack', 'jackify-install-wabbajack.log')
    """
    def __init__(self):
        self.log_dir = Path.home() / "Jackify" / "logs"
        self.ensure_log_directory()

    def ensure_log_directory(self) -> None:
        """Ensure the log directory exists."""
        try:
            self.log_dir.mkdir(parents=True, exist_ok=True)
        except Exception as e:
            print(f"Failed to create log directory: {e}")

    def rotate_log_file_per_run(self, log_file_path: Path, backup_count: int = 5):
        """Rotate the log file on every run, keeping up to backup_count backups."""
        if log_file_path.exists():
            # Remove the oldest backup if it exists
            oldest = log_file_path.with_suffix(log_file_path.suffix + f'.{backup_count}')
            if oldest.exists():
                oldest.unlink()
            # Shift backups
            for i in range(backup_count - 1, 0, -1):
                src = log_file_path.with_suffix(log_file_path.suffix + f'.{i}')
                dst = log_file_path.with_suffix(log_file_path.suffix + f'.{i+1}')
                if src.exists():
                    src.rename(dst)
            # Move current log to .1
            log_file_path.rename(log_file_path.with_suffix(log_file_path.suffix + '.1'))

    def rotate_log_for_logger(self, name: str, log_file: Optional[str] = None, backup_count: int = 5):
        """
        Rotate the log file for a logger before any logging occurs.
        Must be called BEFORE any log is written or file handler is attached.
        """
        file_path = self.log_dir / (log_file if log_file else "jackify-cli.log")
        self.rotate_log_file_per_run(file_path, backup_count=backup_count)

    def setup_logger(self, name: str, log_file: Optional[str] = None, is_general: bool = False) -> logging.Logger:
        """Set up a logger with file and console handlers. Call rotate_log_for_logger before this if you want per-run rotation."""
        logger = logging.getLogger(name)
        logger.setLevel(logging.DEBUG)
        logger.propagate = False

        # Create formatters
        file_formatter = logging.Formatter(
            '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
        )
        console_formatter = logging.Formatter(
            '%(levelname)s: %(message)s'
        )

        # Add console handler (ERROR and above only)
        console_handler = logging.StreamHandler()
        console_handler.setLevel(logging.ERROR)
        console_handler.setFormatter(console_formatter)
        if not any(isinstance(h, logging.StreamHandler) for h in logger.handlers):
            logger.addHandler(console_handler)

        # Add file handler if log_file is specified, or use default for general
        if log_file or is_general:
            file_path = self.log_dir / (log_file if log_file else "jackify-cli.log")
            file_handler = logging.handlers.RotatingFileHandler(
                file_path, mode='a', encoding='utf-8', maxBytes=1024*1024, backupCount=5
            )
            file_handler.setLevel(logging.DEBUG)
            file_handler.setFormatter(file_formatter)
            if not any(isinstance(h, logging.handlers.RotatingFileHandler) and getattr(h, 'baseFilename', None) == str(file_path) for h in logger.handlers):
                logger.addHandler(file_handler)

        return logger

    def rotate_logs(self, max_bytes: int = 1024 * 1024, backup_count: int = 5) -> None:
        """Rotate log files based on size."""
        for log_file in self.get_log_files():
            try:
                if log_file.stat().st_size > max_bytes:
                    # Create backup
                    backup_path = log_file.with_suffix(f'.{datetime.now().strftime("%Y%m%d_%H%M%S")}.log')
                    log_file.rename(backup_path)

                    # Clean up old backups
                    backups = sorted(log_file.parent.glob(f"{log_file.stem}.*.log"))
                    if len(backups) > backup_count:
                        for old_backup in backups[:-backup_count]:
                            old_backup.unlink()
            except Exception as e:
                print(f"Failed to rotate log file {log_file}: {e}")

    def cleanup_old_logs(self, days: int = 30) -> None:
        """Clean up log files older than specified days."""
        cutoff = datetime.now().timestamp() - (days * 24 * 60 * 60)
        for log_file in self.get_log_files():
            try:
                if log_file.stat().st_mtime < cutoff:
                    log_file.unlink()
            except Exception as e:
                print(f"Failed to clean up log file {log_file}: {e}")

    def get_log_files(self) -> List[Path]:
        """Get a list of all log files."""
        return list(self.log_dir.glob("*.log"))

    def get_log_content(self, log_file: Path, lines: int = 100) -> List[str]:
        """Get the last N lines of a log file."""
        try:
            with open(log_file, 'r') as f:
                return f.readlines()[-lines:]
        except Exception as e:
            print(f"Failed to read log file {log_file}: {e}")
            return []

    def search_logs(self, pattern: str) -> Dict[Path, List[str]]:
        """Search all log files for a pattern."""
        results = {}
        for log_file in self.get_log_files():
            try:
                with open(log_file, 'r') as f:
                    matches = [line for line in f if pattern in line]
                    if matches:
                        results[log_file] = matches
            except Exception as e:
                print(f"Failed to search log file {log_file}: {e}")
        return results

    def export_logs(self, output_dir: Path) -> bool:
        """Export all logs to a directory."""
        try:
            output_dir.mkdir(parents=True, exist_ok=True)
            for log_file in self.get_log_files():
                shutil.copy2(log_file, output_dir / log_file.name)
            return True
        except Exception as e:
            print(f"Failed to export logs: {e}")
            return False

    def set_log_level(self, level: int) -> None:
        """Set the logging level for all loggers."""
        for logger_name in logging.root.manager.loggerDict:
            logger = logging.getLogger(logger_name)
            logger.setLevel(level)

    def get_log_stats(self) -> Dict:
        """Get statistics about log files."""
        stats = {
            'total_files': 0,
            'total_size': 0,
            'largest_file': None,
            'oldest_file': None,
            'newest_file': None
        }

        try:
            log_files = self.get_log_files()
            stats['total_files'] = len(log_files)

            if log_files:
                stats['total_size'] = sum(f.stat().st_size for f in log_files)
                stats['largest_file'] = max(log_files, key=lambda x: x.stat().st_size)
                stats['oldest_file'] = min(log_files, key=lambda x: x.stat().st_mtime)
                stats['newest_file'] = max(log_files, key=lambda x: x.stat().st_mtime)

        except Exception as e:
            print(f"Failed to get log stats: {e}")

        return stats

    def get_general_logger(self):
        """Get the general CLI logger (~/Jackify/logs/jackify-cli.log)."""
        return self.setup_logger('jackify_cli', is_general=True)
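The per-run rotation scheme in `rotate_log_file_per_run` can be exercised in isolation. This is a minimal standalone sketch of the same `.1`…`.N` shifting logic, run against a throwaway temporary directory (the file name is arbitrary):

```python
import tempfile
from pathlib import Path

def rotate_per_run(log_path: Path, backup_count: int = 5) -> None:
    """Shift existing backups up (.1 -> .2, ...) and move the live log to '.1'."""
    if not log_path.exists():
        return
    oldest = log_path.with_suffix(log_path.suffix + f".{backup_count}")
    if oldest.exists():
        oldest.unlink()  # the oldest backup falls off the end of the chain
    for i in range(backup_count - 1, 0, -1):
        src = log_path.with_suffix(log_path.suffix + f".{i}")
        if src.exists():
            src.rename(log_path.with_suffix(log_path.suffix + f".{i + 1}"))
    log_path.rename(log_path.with_suffix(log_path.suffix + ".1"))

# Two simulated runs against a scratch directory:
log_dir = Path(tempfile.mkdtemp())
log_file = log_dir / "jackify-cli.log"
for run in ("run1", "run2"):
    rotate_per_run(log_file, backup_count=2)
    log_file.write_text(run)
```

After the second run, the previous run's log survives as `jackify-cli.log.1`. Note that `Path.with_suffix` happily accepts a multi-dot suffix like `.log.1`, which is what the handler relies on.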
1106
jackify/backend/handlers/menu_handler.py
Normal file
File diff suppressed because it is too large
184
jackify/backend/handlers/mo2_handler.py
Normal file
@@ -0,0 +1,184 @@
import shutil
import subprocess
import requests
from pathlib import Path
import re
import time
import os
from .ui_colors import COLOR_PROMPT, COLOR_SELECTION, COLOR_RESET, COLOR_INFO, COLOR_ERROR, COLOR_SUCCESS, COLOR_WARNING
from .status_utils import show_status, clear_status
from jackify.shared.ui_utils import print_section_header, print_subsection_header

class MO2Handler:
    """
    Handles downloading and installing Mod Organizer 2 (MO2) using system 7z.
    """
    def __init__(self, menu_handler):
        self.menu_handler = menu_handler
        # Import shortcut handler from menu_handler if available
        self.shortcut_handler = getattr(menu_handler, 'shortcut_handler', None)

    def _is_dangerous_path(self, path: Path) -> bool:
        # Block /, /home, /root, and the user's home directory
        home = Path.home().resolve()
        dangerous = [Path('/'), Path('/home'), Path('/root'), home]
        return any(path.resolve() == d for d in dangerous)

    def install_mo2(self):
        os.system('cls' if os.name == 'nt' else 'clear')
        # Banner display handled by frontend
        print_section_header('Mod Organizer 2 Installation')
        # 1. Check for 7z
        if not shutil.which('7z'):
            print(f"{COLOR_ERROR}[ERROR] 7z is not installed. Please install it (e.g., sudo apt install p7zip-full).{COLOR_RESET}\n")
            return False
        # 2. Prompt for install location
        default_dir = Path.home() / "ModOrganizer2"
        prompt = f"Enter the full path where Mod Organizer 2 should be installed (default: {default_dir}, enter 'q' to cancel)"
        install_dir = self.menu_handler.get_directory_path(
            prompt_message=prompt,
            default_path=default_dir,
            create_if_missing=False,
            no_header=True
        )
        if not install_dir:
            print(f"\n{COLOR_INFO}Installation cancelled by user.{COLOR_RESET}\n")
            return False
        # Safety: Block dangerous paths
        if self._is_dangerous_path(install_dir):
            print(f"\n{COLOR_ERROR}Refusing to install to a dangerous directory: {install_dir}{COLOR_RESET}\n")
            return False
        # 3. Ask if user wants to add MO2 to Steam
        add_to_steam = input("Add Mod Organizer 2 as a custom Steam shortcut for Proton? (Y/n): ").strip().lower()
        add_to_steam = (add_to_steam == '' or add_to_steam.startswith('y'))
        shortcut_name = None
        if add_to_steam:
            shortcut_name = input("Enter a name for your new Steam shortcut (default: Mod Organizer 2): ").strip()
            if not shortcut_name:
                shortcut_name = "Mod Organizer 2"
        print_subsection_header('Configuration Phase')
        time.sleep(0.5)
        # 4. Create directory if needed, handle existing contents
        if not install_dir.exists():
            try:
                install_dir.mkdir(parents=True, exist_ok=True)
                show_status(f"Created directory: {install_dir}")
            except Exception as e:
                print(f"{COLOR_ERROR}[ERROR] Could not create directory: {e}{COLOR_RESET}\n")
                return False
        else:
            files = list(install_dir.iterdir())
            if files:
                print(f"Warning: The directory '{install_dir}' is not empty.")
                print("Warning: This will permanently delete all files in the folder. Type 'DELETE' to confirm:")
                confirm = input("").strip()
                if confirm != 'DELETE':
                    print(f"{COLOR_INFO}Cancelled by user. Please choose a different directory if you want to keep existing files.{COLOR_RESET}\n")
                    return False
                for f in files:
                    try:
                        if f.is_dir():
                            shutil.rmtree(f)
                        else:
                            f.unlink()
                    except Exception as e:
                        print(f"{COLOR_ERROR}Failed to delete {f}: {e}{COLOR_RESET}")
                show_status(f"Deleted all contents of {install_dir}")

        # 5. Fetch latest MO2 release info from GitHub
        show_status("Fetching latest Mod Organizer 2 release info...")
        try:
            response = requests.get("https://api.github.com/repos/ModOrganizer2/modorganizer/releases/latest", timeout=15, verify=True)
            response.raise_for_status()
            release = response.json()
        except Exception as e:
            print(f"{COLOR_ERROR}[ERROR] Failed to fetch MO2 release info: {e}{COLOR_RESET}\n")
            return False

        # 6. Find the correct .7z asset (exclude -pdbs, -src, etc)
        asset = None
        for a in release.get('assets', []):
            name = a['name']
            if re.match(r"Mod\.Organizer-\d+\.\d+(\.\d+)?\.7z$", name):
                asset = a
                break
        if not asset:
            print(f"{COLOR_ERROR}[ERROR] Could not find main MO2 .7z asset in latest release.{COLOR_RESET}\n")
            return False

        # 7. Download the archive
        show_status(f"Downloading {asset['name']}...")
        archive_path = install_dir / asset['name']
        try:
            with requests.get(asset['browser_download_url'], stream=True, timeout=60, verify=True) as r:
                r.raise_for_status()
                with open(archive_path, 'wb') as f:
                    for chunk in r.iter_content(chunk_size=8192):
                        f.write(chunk)
        except Exception as e:
            print(f"{COLOR_ERROR}[ERROR] Failed to download MO2 archive: {e}{COLOR_RESET}\n")
            return False

        # 8. Extract using 7z (suppress noisy output)
        show_status(f"Extracting to {install_dir}...")
        try:
            result = subprocess.run(['7z', 'x', str(archive_path), f'-o{install_dir}'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            if result.returncode != 0:
                print(f"{COLOR_ERROR}[ERROR] Extraction failed: {result.stderr.decode(errors='ignore')}{COLOR_RESET}\n")
                return False
        except Exception as e:
            print(f"{COLOR_ERROR}[ERROR] Extraction failed: {e}{COLOR_RESET}\n")
            return False

        # 9. Validate extraction
        mo2_exe = next(install_dir.glob('**/ModOrganizer.exe'), None)
        if not mo2_exe:
            print(f"{COLOR_ERROR}[ERROR] ModOrganizer.exe not found after extraction. Please check extraction.{COLOR_RESET}\n")
            return False
        else:
            show_status(f"MO2 installed at: {mo2_exe.parent}")

        # 10. Add to Steam if requested
        if add_to_steam and self.shortcut_handler:
            show_status("Creating Steam shortcut...")
            try:
                from ..services.native_steam_service import NativeSteamService
                steam_service = NativeSteamService()

                success, app_id = steam_service.create_shortcut_with_proton(
                    app_name=shortcut_name,
                    exe_path=str(mo2_exe),
                    start_dir=str(mo2_exe.parent),
                    launch_options="%command%",
                    tags=["Jackify"],
                    proton_version="proton_experimental"
                )
                if not success or not app_id:
                    print(f"{COLOR_ERROR}[ERROR] Failed to create Steam shortcut.{COLOR_RESET}\n")
|
||||
else:
|
||||
show_status(f"Steam shortcut created for '{COLOR_INFO}{shortcut_name}{COLOR_RESET}'.")
|
||||
# Restart Steam and show manual steps (reuse logic from Configure Modlist)
|
||||
print("\n───────────────────────────────────────────────────────────────────")
|
||||
print(f"{COLOR_INFO}Important:{COLOR_RESET} Steam needs to restart to detect the new shortcut.")
|
||||
print("This process involves several manual steps after the restart.")
|
||||
restart_choice = input(f"\n{COLOR_PROMPT}Restart Steam automatically now? (Y/n): {COLOR_RESET}").strip().lower()
|
||||
if restart_choice != 'n':
|
||||
if hasattr(self.shortcut_handler, 'secure_steam_restart'):
|
||||
print("Restarting Steam...")
|
||||
self.shortcut_handler.secure_steam_restart()
|
||||
print("\nAfter restarting, you MUST perform the manual Proton setup steps:")
|
||||
print(f" 1. Locate '{COLOR_INFO}{shortcut_name}{COLOR_RESET}' in your Steam Library")
|
||||
print(" 2. Right-click and select 'Properties'")
|
||||
print(" 3. Switch to the 'Compatibility' tab")
|
||||
print(" 4. Check 'Force the use of a specific Steam Play compatibility tool'")
|
||||
print(" 5. Select 'Proton - Experimental' from the dropdown menu")
|
||||
print(" 6. Close the Properties window")
|
||||
print(f" 7. Launch '{COLOR_INFO}{shortcut_name}{COLOR_RESET}' from your Steam Library")
|
||||
print(" 8. If Mod Organizer opens or produces any error message, that's normal")
|
||||
print(" 9. CLOSE Mod Organizer completely and return here")
|
||||
print("───────────────────────────────────────────────────────────────────\n")
|
||||
except Exception as e:
|
||||
print(f"{COLOR_ERROR}[ERROR] Failed to create Steam shortcut: {e}{COLOR_RESET}\n")
|
||||
|
||||
print(f"{COLOR_SUCCESS}Mod Organizer 2 has been installed successfully!{COLOR_RESET}\n")
|
||||
return True
|
||||
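The asset-selection regex in step 6 above matches only the main `Mod.Organizer-<version>.7z` archive and rejects the `-pdbs`/`-src` companion bundles. A minimal standalone sketch of that filter (the asset names below are illustrative, not taken from a real release):

```python
import re

# Same pattern used by the release-asset filter above.
MO2_ASSET_PATTERN = r"Mod\.Organizer-\d+\.\d+(\.\d+)?\.7z$"

def is_main_mo2_asset(name: str) -> bool:
    """Return True only for the main MO2 archive, not pdbs/src bundles."""
    return re.match(MO2_ASSET_PATTERN, name) is not None

# Hypothetical asset names for illustration:
assert is_main_mo2_asset("Mod.Organizer-2.5.2.7z")
assert is_main_mo2_asset("Mod.Organizer-2.5.7z")          # two-part versions also match
assert not is_main_mo2_asset("Mod.Organizer-2.5.2-pdbs.7z")
assert not is_main_mo2_asset("Mod.Organizer-2.5.2-src.7z")
```

The trailing `$` anchor is what excludes the suffixed variants, since `re.match` already anchors at the start.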
1277	jackify/backend/handlers/modlist_handler.py	Normal file
	File diff suppressed because it is too large	Load Diff
1100	jackify/backend/handlers/modlist_install_cli.py	Normal file
	File diff suppressed because it is too large	Load Diff
963	jackify/backend/handlers/path_handler.py	Normal file
@@ -0,0 +1,963 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Path Handler Module
Handles path-related operations for ModOrganizer.ini and other configuration files
"""

import os
import re
import logging
import shutil
from pathlib import Path
from typing import Optional, Union, Dict, Any, List, Tuple
from datetime import datetime

# Initialize logger
logger = logging.getLogger(__name__)

# --- Configuration (Adapted from Proposal) ---
# Define known script extender executables (lowercase for comparisons)
TARGET_EXECUTABLES_LOWER = ["skse64_loader.exe", "f4se_loader.exe", "nvse_loader.exe", "obse_loader.exe", "sfse_loader.exe", "obse64_loader.exe", "falloutnv.exe"]
# Define known stock game folder names (case-sensitive, as they appear on disk)
STOCK_GAME_FOLDERS = ["Stock Game", "Game Root", "Stock Folder", "Skyrim Stock"]
# Define the SD card path prefix on Steam Deck/Linux
SDCARD_PREFIX = '/run/media/mmcblk0p1/'


class PathHandler:
    """
    Handles path-related operations for ModOrganizer.ini and other configuration files
    """

    @staticmethod
    def _strip_sdcard_path_prefix(path_obj: Path) -> str:
        """
        Removes the '/run/media/mmcblk0p1/' prefix if present.
        Returns the path as a POSIX-style string (using /).
        """
        path_str = path_obj.as_posix()  # Work with consistent forward slashes
        if path_str.lower().startswith(SDCARD_PREFIX.lower()):
            # Return the part *after* the prefix, ensuring no leading slash remains unless root
            relative_part = path_str[len(SDCARD_PREFIX):]
            return relative_part if relative_part else "."  # Return '.' if it was exactly the prefix
        return path_str
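The prefix stripping above can be exercised in isolation. This sketch duplicates the helper's logic so it runs without the class (`strip_sdcard_prefix` is a hypothetical standalone name, not part of Jackify's API):

```python
from pathlib import Path

SDCARD_PREFIX = '/run/media/mmcblk0p1/'

def strip_sdcard_prefix(path_obj: Path) -> str:
    # Mirrors PathHandler._strip_sdcard_path_prefix above.
    path_str = path_obj.as_posix()
    if path_str.lower().startswith(SDCARD_PREFIX.lower()):
        relative_part = path_str[len(SDCARD_PREFIX):]
        return relative_part if relative_part else "."
    return path_str

# SD card paths lose the mount prefix; other paths pass through unchanged.
assert strip_sdcard_prefix(Path('/run/media/mmcblk0p1/Games/Skyrim')) == 'Games/Skyrim'
assert strip_sdcard_prefix(Path('/home/deck/Games')) == '/home/deck/Games'
```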
    @staticmethod
    def update_mo2_ini_paths(
        modlist_ini_path: Path,
        modlist_dir_path: Path,
        modlist_sdcard: bool,
        steam_library_common_path: Optional[Path] = None,
        basegame_dir_name: Optional[str] = None,
        basegame_sdcard: bool = False  # Default to False if not provided
    ) -> bool:
        logger.info(f"[DEBUG] update_mo2_ini_paths called with: modlist_ini_path={modlist_ini_path}, modlist_dir_path={modlist_dir_path}, modlist_sdcard={modlist_sdcard}, steam_library_common_path={steam_library_common_path}, basegame_dir_name={basegame_dir_name}, basegame_sdcard={basegame_sdcard}")
        if not modlist_ini_path.is_file():
            logger.error(f"ModOrganizer.ini not found at specified path: {modlist_ini_path}")
            # Attempt to create a minimal INI
            try:
                logger.warning("Creating minimal ModOrganizer.ini with [General] section.")
                with open(modlist_ini_path, 'w', encoding='utf-8') as f:
                    f.write('[General]\n')
                # Continue as if file existed
            except Exception as e:
                logger.critical(f"Failed to create minimal ModOrganizer.ini: {e}")
                return False
        if not modlist_dir_path.is_dir():
            # Warn but continue
            logger.error(f"Modlist directory not found or not a directory: {modlist_dir_path}")

        # --- Bulletproof game directory detection ---
        # 1. Get all Steam libraries and log them
        all_steam_libraries = PathHandler.get_all_steam_library_paths()
        logger.info(f"[DEBUG] Detected Steam libraries: {all_steam_libraries}")
        import sys
        if hasattr(sys, 'argv') and any(arg in ('--debug', '-d') for arg in sys.argv):
            # This is a @staticmethod, so there is no `self`; use the module logger.
            logger.debug(f"Detected Steam libraries: {all_steam_libraries}")

        # 2. For each library, check for the canonical vanilla game directory
        GAME_DIR_NAMES = {
            "Skyrim Special Edition": "Skyrim Special Edition",
            "Fallout 4": "Fallout 4",
            "Fallout New Vegas": "Fallout New Vegas",
            "Oblivion": "Oblivion"
        }
        canonical_name = None
        if basegame_dir_name and basegame_dir_name in GAME_DIR_NAMES:
            canonical_name = GAME_DIR_NAMES[basegame_dir_name]
        elif basegame_dir_name:
            canonical_name = basegame_dir_name  # Fallback, but should match above
        gamepath_target_dir = None
        gamepath_target_is_sdcard = modlist_sdcard
        checked_candidates = []
        if canonical_name:
            for lib in all_steam_libraries:
                candidate = lib / "steamapps" / "common" / canonical_name
                checked_candidates.append(str(candidate))
                logger.info(f"[DEBUG] Checking for vanilla game directory: {candidate}")
                if candidate.is_dir():
                    gamepath_target_dir = candidate
                    logger.info(f"Found vanilla game directory: {candidate}")
                    break
        if not gamepath_target_dir:
            logger.error(f"Could not find vanilla game directory '{canonical_name}' in any Steam library. Checked: {checked_candidates}")
            # 4. Prompt the user for the path
            print("\nCould not automatically detect a Stock Game or vanilla game directory.")
            print("Please enter the full path to your vanilla game directory (e.g., /path/to/Skyrim Special Edition):")
            while True:
                user_input = input("Game directory path: ").strip()
                user_path = Path(user_input)
                logger.info(f"[DEBUG] User entered: {user_input}")
                if user_path.is_dir():
                    exe_candidates = list(user_path.glob('*.exe'))
                    logger.info(f"[DEBUG] .exe files in user path: {exe_candidates}")
                    if exe_candidates:
                        gamepath_target_dir = user_path
                        logger.info(f"User provided valid vanilla game directory: {gamepath_target_dir}")
                        break
                    else:
                        print("Directory exists but does not appear to contain the game executable. Please check and try again.")
                        logger.warning("User path exists but no .exe files found.")
                else:
                    print("Directory not found. Please enter a valid path.")
                    logger.warning("User path does not exist.")
        if not gamepath_target_dir:
            logger.critical("[FATAL] Could not determine a valid target directory for gamePath. Check configuration and paths. Aborting update.")
            return False

        # 3. Update gamePath, binary, and workingDirectory entries in the INI
        logger.debug(f"Determined gamePath target directory: {gamepath_target_dir}")
        logger.debug(f"gamePath target is on SD card: {gamepath_target_is_sdcard}")
        try:
            logger.debug(f"Reading original INI file: {modlist_ini_path}")
            with open(modlist_ini_path, 'r', encoding='utf-8', errors='ignore') as f:
                original_lines = f.readlines()

            # --- Find and robustly update gamePath line ---
            gamepath_line_num = -1
            general_section_line = -1
            for i, line in enumerate(original_lines):
                if re.match(r'^\s*\[General\]\s*$', line, re.IGNORECASE):
                    general_section_line = i
                if re.match(r'^\s*gamepath\s*=\s*', line, re.IGNORECASE):
                    gamepath_line_num = i
                    break
            processed_str = PathHandler._strip_sdcard_path_prefix(gamepath_target_dir)
            windows_style_single = processed_str.replace('/', '\\')
            gamepath_drive_letter = "D:" if gamepath_target_is_sdcard else "Z:"
            # Use robust formatter
            formatted_gamepath = PathHandler._format_gamepath_for_mo2(f'{gamepath_drive_letter}{windows_style_single}')
            new_gamepath_line = f'gamePath = @ByteArray({formatted_gamepath})\n'
            if gamepath_line_num != -1:
                logger.info(f"Updating existing gamePath line: {original_lines[gamepath_line_num].strip()} -> {new_gamepath_line.strip()}")
                original_lines[gamepath_line_num] = new_gamepath_line
            else:
                insert_at = general_section_line + 1 if general_section_line != -1 else 0
                logger.info(f"Adding missing gamePath line at line {insert_at+1}: {new_gamepath_line.strip()}")
                original_lines.insert(insert_at, new_gamepath_line)

            # --- Update customExecutables binaries and workingDirectories ---
            # Note: this local list shadows the module-level TARGET_EXECUTABLES_LOWER
            # and omits sfse_loader.exe / obse64_loader.exe.
            TARGET_EXECUTABLES_LOWER = [
                "skse64_loader.exe", "f4se_loader.exe", "nvse_loader.exe", "obse_loader.exe", "falloutnv.exe"
            ]
            in_custom_exec = False
            for i, line in enumerate(original_lines):
                if re.match(r'^\s*\[customExecutables\]\s*$', line, re.IGNORECASE):
                    in_custom_exec = True
                    continue
                if in_custom_exec and re.match(r'^\s*\[.*\]\s*$', line):
                    in_custom_exec = False
                if in_custom_exec:
                    m = re.match(r'^(\d+)\\binary\s*=\s*(.*)$', line.strip(), re.IGNORECASE)
                    if m:
                        idx, old_path = m.group(1), m.group(2)
                        exe_name = os.path.basename(old_path).lower()
                        if exe_name in TARGET_EXECUTABLES_LOWER:
                            new_path = f'{gamepath_drive_letter}/{PathHandler._strip_sdcard_path_prefix(gamepath_target_dir)}/{exe_name}'
                            # Use robust formatter
                            new_path = PathHandler._format_binary_for_mo2(new_path)
                            logger.info(f"Updating binary for entry {idx}: {old_path} -> {new_path}")
                            original_lines[i] = f'{idx}\\binary = {new_path}\n'
                    m_wd = re.match(r'^(\d+)\\workingDirectory\s*=\s*(.*)$', line.strip(), re.IGNORECASE)
                    if m_wd:
                        idx, old_wd = m_wd.group(1), m_wd.group(2)
                        new_wd = f'{gamepath_drive_letter}{windows_style_single}'
                        # Use robust formatter
                        new_wd = PathHandler._format_workingdir_for_mo2(new_wd)
                        logger.info(f"Updating workingDirectory for entry {idx}: {old_wd} -> {new_wd}")
                        original_lines[i] = f'{idx}\\workingDirectory = {new_wd}\n'

            # --- Backup and Write New INI ---
            backup_path = modlist_ini_path.with_suffix(f".{datetime.now().strftime('%Y%m%d_%H%M%S')}.bak")
            try:
                shutil.copy2(modlist_ini_path, backup_path)
                logger.info(f"Backed up original INI to: {backup_path}")
            except Exception as bak_err:
                logger.error(f"Failed to backup original INI file: {bak_err}")
                return False
            try:
                with open(modlist_ini_path, 'w', encoding='utf-8') as f:
                    f.writelines(original_lines)
                logger.info(f"Successfully wrote updated paths to {modlist_ini_path}")
                return True
            except Exception as write_err:
                logger.error(f"Failed to write updated INI file {modlist_ini_path}: {write_err}", exc_info=True)
                logger.error("Attempting to restore from backup...")
                try:
                    shutil.move(backup_path, modlist_ini_path)
                    logger.info("Successfully restored original INI from backup.")
                except Exception as restore_err:
                    logger.critical(f"CRITICAL FAILURE: Could not write new INI and failed to restore backup {backup_path}. Manual intervention required at {modlist_ini_path}! Error: {restore_err}")
                return False
        except Exception as e:
            logger.error(f"An unexpected error occurred during INI path update: {e}", exc_info=True)
            return False
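The drive-letter mapping above rewrites a Linux path into the Windows-style path Proton exposes to MO2: SD card paths are rooted at `D:`, everything else at `Z:`. A sketch of just that mapping (the `_format_gamepath_for_mo2` escaping step for `@ByteArray` is defined elsewhere in this file and is not reproduced; `windows_style_gamepath` is a hypothetical standalone name):

```python
from pathlib import Path

SDCARD_PREFIX = '/run/media/mmcblk0p1/'

def windows_style_gamepath(target_dir: Path, on_sdcard: bool) -> str:
    """Sketch of the mapping above: strip the SD card mount prefix,
    pick a drive letter, and flip separators to backslashes."""
    path_str = target_dir.as_posix()
    if path_str.lower().startswith(SDCARD_PREFIX.lower()):
        path_str = path_str[len(SDCARD_PREFIX):]
    drive = "D:" if on_sdcard else "Z:"
    return drive + path_str.replace('/', '\\')

# A host path keeps its leading slash, so Z: is followed by \home\...
assert windows_style_gamepath(Path('/home/deck/Modlist/Stock Game'), False) == 'Z:\\home\\deck\\Modlist\\Stock Game'
```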
    @staticmethod
    def edit_resolution(modlist_ini, resolution):
        """
        Edit resolution settings in ModOrganizer.ini

        Args:
            modlist_ini (str): Path to ModOrganizer.ini
            resolution (str): Resolution in the format "1920x1080"

        Returns:
            bool: True on success, False on failure
        """
        try:
            logger.info(f"Editing resolution settings to {resolution}...")

            # Parse resolution
            width, height = resolution.split('x')

            # Read the current ModOrganizer.ini
            with open(modlist_ini, 'r') as f:
                content = f.read()

            # Replace width and height settings
            content = re.sub(r'^width\s*=\s*\d+$', f'width = {width}', content, flags=re.MULTILINE)
            content = re.sub(r'^height\s*=\s*\d+$', f'height = {height}', content, flags=re.MULTILINE)

            # Write the updated content back to the file
            with open(modlist_ini, 'w') as f:
                f.write(content)

            logger.info("Resolution settings edited successfully")
            return True

        except Exception as e:
            logger.error(f"Error editing resolution settings: {e}")
            return False
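The `re.MULTILINE` flag above makes `^` and `$` match at every line boundary, so only whole `width = …` / `height = …` lines are rewritten. The substitution can be checked on a string without touching a file:

```python
import re

def set_resolution(content: str, resolution: str) -> str:
    # Mirrors edit_resolution above, operating on the INI text directly.
    width, height = resolution.split('x')
    content = re.sub(r'^width\s*=\s*\d+$', f'width = {width}', content, flags=re.MULTILINE)
    content = re.sub(r'^height\s*=\s*\d+$', f'height = {height}', content, flags=re.MULTILINE)
    return content

ini = "width = 1280\nheight = 720\n"
assert set_resolution(ini, "1920x1080") == "width = 1920\nheight = 1080\n"
```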
    @staticmethod
    def create_dxvk_conf(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full):
        """
        Create dxvk.conf file in the appropriate location

        Args:
            modlist_dir (str): Path to the modlist directory
            modlist_sdcard (bool): Whether the modlist is on an SD card
            steam_library (str): Path to the Steam library
            basegame_sdcard (bool): Whether the base game is on an SD card
            game_var_full (str): Full name of the game (e.g., "Skyrim Special Edition")

        Returns:
            bool: True on success, False on failure
        """
        try:
            logger.info("Creating dxvk.conf file...")

            # Determine the location for dxvk.conf
            dxvk_conf_path = None

            # Check for common stock game directories
            stock_game_paths = [
                os.path.join(modlist_dir, "Stock Game"),
                os.path.join(modlist_dir, "STOCK GAME"),
                os.path.join(modlist_dir, "Game Root"),
                os.path.join(modlist_dir, "Stock Folder"),
                os.path.join(modlist_dir, "Skyrim Stock"),
                os.path.join(modlist_dir, "root", "Skyrim Special Edition"),
                os.path.join(steam_library, game_var_full)
            ]

            for path in stock_game_paths:
                if os.path.exists(path):
                    dxvk_conf_path = os.path.join(path, "dxvk.conf")
                    break

            if not dxvk_conf_path:
                logger.error("Could not determine location for dxvk.conf")
                return False

            # Create simple dxvk.conf content - just one line
            dxvk_conf_content = "dxvk.enableGraphicsPipelineLibrary = False\n"

            # Write dxvk.conf to the appropriate location
            with open(dxvk_conf_path, 'w') as f:
                f.write(dxvk_conf_content)

            logger.info(f"dxvk.conf created successfully at {dxvk_conf_path}")
            return True

        except Exception as e:
            logger.error(f"Error creating dxvk.conf: {e}")
            return False
    @staticmethod
    def find_steam_config_vdf() -> Optional[Path]:
        """Finds the active Steam config.vdf file."""
        logger.debug("Searching for Steam config.vdf...")
        possible_steam_paths = [
            Path.home() / ".steam/steam",
            Path.home() / ".local/share/Steam",
            Path.home() / ".steam/root"
        ]
        for steam_path in possible_steam_paths:
            potential_path = steam_path / "config/config.vdf"
            if potential_path.is_file():
                logger.info(f"Found config.vdf at: {potential_path}")
                return potential_path  # Return Path object

        logger.warning("Could not locate Steam's config.vdf file in standard locations.")
        return None
    @staticmethod
    def find_steam_library() -> Optional[Path]:
        """Find the primary Steam library common directory containing games."""
        logger.debug("Attempting to find Steam library...")

        # Potential locations for libraryfolders.vdf
        libraryfolders_vdf_paths = [
            os.path.expanduser("~/.steam/steam/config/libraryfolders.vdf"),
            os.path.expanduser("~/.local/share/Steam/config/libraryfolders.vdf"),
            # Add other potential standard locations if necessary
        ]

        # Simple backup mechanism (optional but good practice)
        for path in libraryfolders_vdf_paths:
            if os.path.exists(path):
                backup_dir = os.path.join(os.path.dirname(path), "backups")
                if not os.path.exists(backup_dir):
                    try:
                        os.makedirs(backup_dir)
                    except OSError as e:
                        logger.warning(f"Could not create backup directory {backup_dir}: {e}")

                # Create timestamped backup if it doesn't exist for today
                timestamp = datetime.now().strftime("%Y%m%d")
                backup_filename = f"libraryfolders_{timestamp}.vdf.bak"
                backup_path = os.path.join(backup_dir, backup_filename)

                if not os.path.exists(backup_path):
                    try:
                        shutil.copy2(path, backup_path)
                        logger.debug(f"Created backup of libraryfolders.vdf at {backup_path}")
                    except Exception as e:
                        # Continue anyway, as we're only reading the file
                        logger.error(f"Failed to create backup of libraryfolders.vdf: {e}")

        libraryfolders_vdf_path_obj = None  # Will hold the Path object
        found_path_str = None
        for path_str in libraryfolders_vdf_paths:
            if os.path.exists(path_str):
                found_path_str = path_str  # Keep the string path for logging/opening
                libraryfolders_vdf_path_obj = Path(path_str)  # Convert to Path object here
                logger.debug(f"Found libraryfolders.vdf at: {path_str}")
                break

        # Check using the Path object's is_file() method
        if not libraryfolders_vdf_path_obj or not libraryfolders_vdf_path_obj.is_file():
            logger.warning("libraryfolders.vdf not found or is not a file. Cannot automatically detect Steam Library.")
            return None

        # Parse the VDF file to extract library paths
        library_paths = []
        try:
            # Open using the original string path is fine, or use the Path object
            with open(found_path_str, 'r') as f:
                content = f.read()

            # Use regex to find all path entries
            path_matches = re.finditer(r'"path"\s*"([^"]+)"', content)
            for match in path_matches:
                library_path_str = match.group(1).replace('\\\\', '\\')  # Fix potential double escapes
                common_path = os.path.join(library_path_str, "steamapps", "common")
                if os.path.isdir(common_path):  # Verify the common path exists
                    library_paths.append(Path(common_path))
                    logger.debug(f"Found potential common path: {common_path}")
                else:
                    logger.debug(f"Skipping non-existent common path derived from VDF: {common_path}")

            logger.debug(f"Found {len(library_paths)} valid library common paths from VDF.")

            # Return the first valid path found
            if library_paths:
                logger.info(f"Using Steam library common path: {library_paths[0]}")
                return library_paths[0]

            # If no valid paths found in VDF, try the default structure
            logger.debug("No valid common paths found in VDF, checking default location...")
            default_common_path = Path.home() / ".steam/steam/steamapps/common"
            if default_common_path.is_dir():
                logger.info(f"Using default Steam library common path: {default_common_path}")
                return default_common_path

            default_common_path_local = Path.home() / ".local/share/Steam/steamapps/common"
            if default_common_path_local.is_dir():
                logger.info(f"Using default local Steam library common path: {default_common_path_local}")
                return default_common_path_local

            logger.error("No valid Steam library common path found in VDF or default locations.")
            return None

        except Exception as e:
            logger.error(f"Error parsing libraryfolders.vdf or finding Steam library: {e}", exc_info=True)
            return None
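The `"path"` regex used above pulls every library root out of `libraryfolders.vdf` without a full VDF parser. A sketch against a hand-written snippet (the file contents below are illustrative, not a real Steam config):

```python
import re

# Illustrative libraryfolders.vdf excerpt; real files carry more fields per entry.
VDF_SNIPPET = '''
"libraryfolders"
{
    "0"
    {
        "path"        "/home/deck/.local/share/Steam"
    }
    "1"
    {
        "path"        "/run/media/mmcblk0p1"
    }
}
'''

# Same regex as the library detection above.
paths = [m.group(1) for m in re.finditer(r'"path"\s*"([^"]+)"', VDF_SNIPPET)]
assert paths == ["/home/deck/.local/share/Steam", "/run/media/mmcblk0p1"]
```

Each extracted root is then suffixed with `steamapps/common` and checked on disk before use.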
    @staticmethod
    def find_compat_data(appid: str) -> Optional[Path]:
        """Find the compatdata directory for a given AppID."""
        if not appid:
            logger.error(f"Invalid AppID provided for compatdata search: {appid}")
            return None

        # Handle negative AppIDs (remove minus sign for validation)
        appid_clean = appid.lstrip('-')
        if not appid_clean.isdigit():
            logger.error(f"Invalid AppID provided for compatdata search: {appid}")
            return None

        logger.debug(f"Searching for compatdata directory for AppID: {appid}")

        # Use libraryfolders.vdf to find all Steam library paths
        library_paths = PathHandler.get_all_steam_library_paths()
        if not library_paths:
            logger.error("Could not find any Steam library paths from libraryfolders.vdf")
            return None

        logger.debug(f"Checking compatdata in {len(library_paths)} Steam libraries")

        # Check each Steam library's compatdata directory
        for library_path in library_paths:
            compatdata_base = library_path / "steamapps" / "compatdata"
            if not compatdata_base.is_dir():
                logger.debug(f"Compatdata directory does not exist: {compatdata_base}")
                continue

            potential_path = compatdata_base / appid
            if potential_path.is_dir():
                logger.info(f"Found compatdata directory: {potential_path}")
                return potential_path
            logger.debug(f"Compatdata for AppID {appid} not found in {compatdata_base}")

        # Fallback: Broad search (can be slow, consider if needed)
        # try:
        #     logger.debug("Compatdata not found in standard locations, attempting wider search...")
        #     # This can be very slow and resource-intensive
        #     # find_output = subprocess.check_output(['find', '/', '-type', 'd', '-name', appid, '-path', '*/compatdata/*', '-print', '-quit', '2>/dev/null'], text=True).strip()
        #     # if find_output:
        #     #     logger.info(f"Found compatdata via find command: {find_output}")
        #     #     return Path(find_output)
        # except Exception as e:
        #     logger.warning(f"Error during 'find' command for compatdata: {e}")

        logger.warning(f"Compatdata directory for AppID {appid} not found.")
        return None
    @staticmethod
    def detect_stock_game_path(game_type: str, steam_library: Path) -> Optional[Path]:
        """
        Detect the stock game path for a given game type and Steam library.
        Returns the path if found, None otherwise.
        """
        try:
            # Map of game types to their Steam App IDs
            game_app_ids = {
                'skyrim': '489830',    # Skyrim Special Edition
                'fallout4': '377160',  # Fallout 4
                'fnv': '22380',        # Fallout: New Vegas
                'oblivion': '22330'    # The Elder Scrolls IV: Oblivion
            }

            if game_type not in game_app_ids:
                return None

            app_id = game_app_ids[game_type]
            game_path = steam_library / 'steamapps' / 'common'

            # List of possible game directory names
            possible_names = {
                'skyrim': ['Skyrim Special Edition', 'Skyrim'],
                'fallout4': ['Fallout 4'],
                'fnv': ['Fallout New Vegas', 'FalloutNV'],
                'oblivion': ['Oblivion']
            }

            if game_type not in possible_names:
                return None

            # Check each possible directory name
            for name in possible_names[game_type]:
                potential_path = game_path / name
                if potential_path.exists():
                    return potential_path

            return None

        except Exception as e:
            # Use the module logger rather than the root logger for consistency.
            logger.error(f"Error detecting stock game path: {e}")
            return None
    @staticmethod
    def get_steam_library_path(steam_path: str) -> Optional[str]:
        """Get the Steam library path from libraryfolders.vdf."""
        try:
            libraryfolders_path = os.path.join(steam_path, 'steamapps', 'libraryfolders.vdf')
            if not os.path.exists(libraryfolders_path):
                return None

            with open(libraryfolders_path, 'r', encoding='utf-8') as f:
                content = f.read()

            # Parse the VDF content
            libraries = {}
            current_library = None
            for line in content.split('\n'):
                line = line.strip()
                if line.startswith('"path"'):
                    current_library = line.split('"')[3].replace('\\\\', '\\')
                elif line.startswith('"apps"') and current_library:
                    libraries[current_library] = True

            # Return the first library path that exists
            for library_path in libraries:
                if os.path.exists(library_path):
                    return library_path

            return None
        except Exception as e:
            logger.error(f"Error getting Steam library path: {str(e)}")
            return None
    @staticmethod
    def get_all_steam_library_paths() -> List[Path]:
        """Finds all Steam library paths listed in all known libraryfolders.vdf files (including Flatpak)."""
        logger.info("[DEBUG] Searching for all Steam libraryfolders.vdf files...")
        vdf_paths = [
            Path.home() / ".steam/steam/config/libraryfolders.vdf",
            Path.home() / ".local/share/Steam/config/libraryfolders.vdf",
            Path.home() / ".steam/root/config/libraryfolders.vdf",
            Path.home() / ".var/app/com.valvesoftware.Steam/.local/share/Steam/config/libraryfolders.vdf",  # Flatpak
        ]
        library_paths = set()
        for vdf_path in vdf_paths:
            if vdf_path.is_file():
                logger.info(f"[DEBUG] Parsing libraryfolders.vdf: {vdf_path}")
                try:
                    with open(vdf_path) as f:
                        for line in f:
                            m = re.search(r'"path"\s*"([^"]+)"', line)
                            if m:
                                library_paths.add(Path(m.group(1)))
                except Exception as e:
                    logger.error(f"[DEBUG] Failed to parse {vdf_path}: {e}")
        logger.info(f"[DEBUG] All detected Steam libraries: {library_paths}")
        return list(library_paths)
    # Moved _find_shortcuts_vdf here from ShortcutHandler
    def _find_shortcuts_vdf(self) -> Optional[str]:
        """Helper to find the active shortcuts.vdf file for a user.

        Iterates through userdata directories and returns the path to the
        first found shortcuts.vdf file.

        Returns:
            Optional[str]: The full path to the shortcuts.vdf file, or None if not found.
        """
        # This implementation was moved from ShortcutHandler
        userdata_base_paths = [
            os.path.expanduser("~/.steam/steam/userdata"),
            os.path.expanduser("~/.local/share/Steam/userdata"),
            os.path.expanduser("~/.var/app/com.valvesoftware.Steam/.local/share/Steam/userdata")
        ]
        found_vdf_path = None
        for base_path in userdata_base_paths:
            if not os.path.isdir(base_path):
                logger.debug(f"Userdata base path not found or not a directory: {base_path}")
                continue
            logger.debug(f"Searching for user IDs in: {base_path}")
            try:
                for item in os.listdir(base_path):
                    user_path = os.path.join(base_path, item)
                    if os.path.isdir(user_path) and item.isdigit():
                        logger.debug(f"Checking user directory: {user_path}")
                        config_path = os.path.join(user_path, "config")
                        shortcuts_file = os.path.join(config_path, "shortcuts.vdf")
                        if os.path.isfile(shortcuts_file):
                            logger.info(f"Found shortcuts.vdf at: {shortcuts_file}")
                            found_vdf_path = shortcuts_file
                            break  # Found it in this user directory
                        logger.debug(f"shortcuts.vdf not found in {config_path}")
            except OSError as e:
                logger.warning(f"Could not access directory {base_path}: {e}")
                continue  # Try next base path
            if found_vdf_path:
                break  # Found it in this base path
        if not found_vdf_path:
            logger.error("Could not find any shortcuts.vdf file in common Steam locations.")
        return found_vdf_path
    @staticmethod
    def find_game_install_paths(target_appids: Dict[str, str]) -> Dict[str, Path]:
        """
        Find installation paths for multiple specified games using Steam app IDs.

        Args:
            target_appids: Dictionary mapping game names to app IDs

        Returns:
            Dictionary mapping game names to their installation paths
        """
        # Get all Steam library paths
        library_paths = PathHandler.get_all_steam_library_paths()
        if not library_paths:
            logger.warning("Failed to find any Steam library paths")
            return {}

        results = {}

        # For each library path, look for each target game
        for library_path in library_paths:
            # Check if the common directory exists
            common_dir = library_path / "common"
            if not common_dir.is_dir():
                logger.debug(f"No 'common' directory in library: {library_path}")
                continue

            # Get subdirectories in common dir
            try:
                game_dirs = [d for d in common_dir.iterdir() if d.is_dir()]
            except (PermissionError, OSError) as e:
                logger.warning(f"Cannot access directory {common_dir}: {e}")
                continue

            # For each app ID, check if we find its directory
            for game_name, app_id in target_appids.items():
                if game_name in results:
                    continue  # Already found this game

                # Try to find by appmanifest
                appmanifest_path = library_path / f"appmanifest_{app_id}.acf"
                if appmanifest_path.is_file():
                    # Find the installdir value
                    try:
                        with open(appmanifest_path, 'r', encoding='utf-8') as f:
                            content = f.read()
                        match = re.search(r'"installdir"\s+"([^"]+)"', content)
                        if match:
                            install_dir_name = match.group(1)
                            install_path = common_dir / install_dir_name
                            if install_path.is_dir():
                                results[game_name] = install_path
                                logger.info(f"Found {game_name} at {install_path}")
                                continue
                    except Exception as e:
                        logger.warning(f"Error reading appmanifest for {game_name}: {e}")

        return results
def replace_gamepath(self, modlist_ini_path: Path, new_game_path: Path, modlist_sdcard: bool = False) -> bool:
|
||||
"""
|
||||
Updates the gamePath value in ModOrganizer.ini to the specified path.
|
||||
Strictly matches the bash script: only replaces an existing gamePath line.
|
||||
If the file or line does not exist, logs error and aborts.
|
||||
"""
|
||||
logger.info(f"Replacing gamePath in {modlist_ini_path} with {new_game_path}")
|
||||
if not modlist_ini_path.is_file():
|
||||
logger.error(f"ModOrganizer.ini not found at: {modlist_ini_path}")
|
||||
return False
|
||||
try:
|
||||
with open(modlist_ini_path, 'r', encoding='utf-8', errors='ignore') as f:
|
||||
lines = f.readlines()
|
||||
drive_letter = "D:" if modlist_sdcard else "Z:"
|
||||
processed_path = self._strip_sdcard_path_prefix(new_game_path)
|
||||
windows_style = processed_path.replace('/', '\\')
|
||||
windows_style_double = windows_style.replace('\\', '\\\\')
|
||||
new_gamepath_line = f'gamePath=@ByteArray({drive_letter}{windows_style_double})\n'
|
||||
gamepath_found = False
|
||||
for i, line in enumerate(lines):
|
||||
# Make the check case-insensitive and robust to whitespace
|
||||
if re.match(r'^\s*gamepath\s*=.*$', line, re.IGNORECASE):
|
||||
lines[i] = new_gamepath_line
|
||||
gamepath_found = True
|
||||
break
|
||||
if not gamepath_found:
|
||||
logger.error("No gamePath line found in ModOrganizer.ini")
|
||||
return False
|
||||
with open(modlist_ini_path, 'w', encoding='utf-8') as f:
|
||||
f.writelines(lines)
|
||||
logger.info(f"Successfully updated gamePath to {new_game_path}")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Error replacing gamePath: {e}", exc_info=True)
|
||||
return False
|
||||
|
||||
# =====================================================================================
|
||||
# CRITICAL: DO NOT CHANGE THIS FUNCTION WITHOUT UPDATING TESTS AND CONSULTING PROJECT LEAD
|
||||
# This function implements the exact path rewriting logic required for ModOrganizer.ini
|
||||
# to match the original, robust bash script. Any change here risks breaking modlist
|
||||
# configuration for users. If you must change this, update all relevant tests and
|
||||
# consult the Project Lead for Jackify. See also omni-guides.sh for reference logic.
|
||||
# =====================================================================================
|
||||
def edit_binary_working_paths(self, modlist_ini_path: Path, modlist_dir_path: Path, modlist_sdcard: bool, steam_libraries: Optional[List[Path]] = None) -> bool:
|
||||
"""
|
||||
Update all binary paths and working directories in a ModOrganizer.ini file.
|
||||
Handles various ModOrganizer.ini formats (single or double backslashes in keys).
|
||||
When updating gamePath, binary, and workingDirectory, retain the original stock folder (Stock Game, Game Root, etc) if present in the current value.
|
||||
steam_libraries: Optional[List[Path]] - already-discovered Steam library paths to use for vanilla detection.
|
||||
|
||||
# DO NOT CHANGE THIS LOGIC WITHOUT UPDATING TESTS AND CONSULTING THE PROJECT LEAD
|
||||
# This is a critical, regression-prone area. See omni-guides.sh for reference.
|
||||
"""
|
||||
try:
|
||||
logger.debug(f"Updating binary paths and working directories in {modlist_ini_path} to use root: {modlist_dir_path}")
|
||||
if not modlist_ini_path.is_file():
|
||||
logger.error(f"INI file {modlist_ini_path} does not exist")
|
||||
return False
|
||||
with open(modlist_ini_path, 'r', encoding='utf-8') as f:
|
||||
lines = f.readlines()
|
||||
game_path_updated = False
|
||||
binary_paths_updated = 0
|
||||
working_dirs_updated = 0
|
||||
binary_lines = []
|
||||
working_dir_lines = []
|
||||
for i, line in enumerate(lines):
|
||||
stripped = line.strip()
|
||||
binary_match = re.match(r'^(\d+)(\\+)\s*binary\s*=.*$', stripped, re.IGNORECASE)
|
||||
if binary_match:
|
||||
index = binary_match.group(1)
|
||||
backslash_style = binary_match.group(2)
|
||||
binary_lines.append((i, stripped, index, backslash_style))
|
||||
wd_match = re.match(r'^(\d+)(\\+)\s*workingDirectory\s*=.*$', stripped, re.IGNORECASE)
|
||||
if wd_match:
|
||||
index = wd_match.group(1)
|
||||
backslash_style = wd_match.group(2)
|
||||
working_dir_lines.append((i, stripped, index, backslash_style))
|
||||
binary_paths_by_index = {}
|
||||
# Use provided steam_libraries if available, else detect
|
||||
if steam_libraries is None or not steam_libraries:
|
||||
steam_libraries = PathHandler.get_all_steam_library_paths()
|
||||
for i, line, index, backslash_style in binary_lines:
|
||||
parts = line.split('=', 1)
|
||||
if len(parts) != 2:
|
||||
logger.error(f"Malformed binary line: {line}")
|
||||
continue
|
||||
key_part, value_part = parts
|
||||
exe_name = os.path.basename(value_part).lower()
|
||||
|
||||
# SELECTIVE FILTERING: Only process target executables (script extenders, etc.)
|
||||
if exe_name not in TARGET_EXECUTABLES_LOWER:
|
||||
logger.debug(f"Skipping non-target executable: {exe_name}")
|
||||
continue
|
||||
|
||||
drive_prefix = "D:" if modlist_sdcard else "Z:"
|
||||
rel_path = None
|
||||
# --- BEGIN: FULL PARITY LOGIC ---
|
||||
if 'steamapps' in value_part:
|
||||
idx = value_part.index('steamapps')
|
||||
subpath = value_part[idx:].lstrip('/')
|
||||
correct_steam_lib = None
|
||||
for lib in steam_libraries:
|
||||
if (lib / subpath.split('/')[2]).exists():
|
||||
correct_steam_lib = lib.parent
|
||||
break
|
||||
if not correct_steam_lib and steam_libraries:
|
||||
correct_steam_lib = steam_libraries[0].parent
|
||||
if correct_steam_lib:
|
||||
new_binary_path = f"{drive_prefix}/{correct_steam_lib}/{subpath}".replace('\\', '/').replace('//', '/')
|
||||
else:
|
||||
logger.error("Could not determine correct Steam library for vanilla game path.")
|
||||
continue
|
||||
else:
|
||||
found_stock = None
|
||||
for folder in STOCK_GAME_FOLDERS:
|
||||
folder_pattern = f"/{folder.replace(' ', '')}".lower()
|
||||
value_part_lower = value_part.replace(' ', '').lower()
|
||||
if folder_pattern in value_part_lower:
|
||||
idx = value_part_lower.index(folder_pattern)
|
||||
rel_path = value_part[idx:].lstrip('/')
|
||||
found_stock = folder
|
||||
break
|
||||
if not rel_path:
|
||||
mods_pattern = "/mods/"
|
||||
if mods_pattern in value_part:
|
||||
idx = value_part.index(mods_pattern)
|
||||
rel_path = value_part[idx:].lstrip('/')
|
||||
else:
|
||||
rel_path = exe_name
|
||||
new_binary_path = f"{drive_prefix}/{modlist_dir_path}/{rel_path}".replace('\\', '/').replace('//', '/')
|
||||
formatted_binary_path = PathHandler._format_binary_for_mo2(new_binary_path)
|
||||
new_binary_line = f"{index}{backslash_style}binary={formatted_binary_path}"
|
||||
logger.debug(f"Updating binary path: {line.strip()} -> {new_binary_line}")
|
||||
lines[i] = new_binary_line + "\n"
|
||||
binary_paths_updated += 1
|
||||
binary_paths_by_index[index] = formatted_binary_path
|
||||
for j, wd_line, index, backslash_style in working_dir_lines:
|
||||
if index in binary_paths_by_index:
|
||||
binary_path = binary_paths_by_index[index]
|
||||
wd_path = os.path.dirname(binary_path)
|
||||
drive_prefix = "D:" if modlist_sdcard else "Z:"
|
||||
if wd_path.startswith("D:") or wd_path.startswith("Z:"):
|
||||
wd_path = wd_path[2:]
|
||||
wd_path = drive_prefix + wd_path
|
||||
formatted_wd_path = PathHandler._format_workingdir_for_mo2(wd_path)
|
||||
key_part = f"{index}{backslash_style}workingDirectory"
|
||||
new_wd_line = f"{key_part}={formatted_wd_path}"
|
||||
logger.debug(f"Updating working directory: {wd_line.strip()} -> {new_wd_line}")
|
||||
lines[j] = new_wd_line + "\n"
|
||||
working_dirs_updated += 1
|
||||
with open(modlist_ini_path, 'w', encoding='utf-8') as f:
|
||||
f.writelines(lines)
|
||||
logger.info(f"edit_binary_working_paths completed: Game path updated: {game_path_updated}, Binary paths updated: {binary_paths_updated}, Working directories updated: {working_dirs_updated}")
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Error updating binary paths in {modlist_ini_path}: {str(e)}")
|
||||
return False
|
||||
|
||||
def _format_path_for_mo2(self, path: str) -> str:
|
||||
"""Format a path for MO2's ModOrganizer.ini file (working directories)."""
|
||||
# Replace forward slashes with double backslashes
|
||||
formatted = path.replace('/', '\\')
|
||||
# Ensure we have a Windows drive letter format
|
||||
if not re.match(r'^[A-Za-z]:', formatted):
|
||||
formatted = 'D:' + formatted
|
||||
# Double the backslashes for the INI file format
|
||||
formatted = formatted.replace('\\', '\\\\')
|
||||
return formatted
|
||||
|
||||
def _format_binary_path_for_mo2(self, path_str):
|
||||
"""Format a binary path for MO2 config file.
|
||||
|
||||
Binary paths need forward slashes (/) in the path portion.
|
||||
"""
|
||||
# Replace backslashes with forward slashes
|
||||
return path_str.replace('\\', '/')
|
||||
|
||||
def _format_working_dir_for_mo2(self, path_str):
|
||||
"""
|
||||
Format a working directory path for MO2 config file.
|
||||
Ensures double backslashes throughout, as required by ModOrganizer.ini.
|
||||
"""
|
||||
import re
|
||||
path = path_str.replace('/', '\\')
|
||||
path = path.replace('\\', '\\\\') # Double all backslashes
|
||||
# Ensure only one double backslash after drive letter
|
||||
path = re.sub(r'^([A-Z]:)\\\\+', r'\1\\\\', path)
|
||||
return path
|
||||
|
||||
@staticmethod
|
||||
def find_vanilla_game_paths(game_names=None) -> Dict[str, Path]:
|
||||
"""
|
||||
For each known game, iterate all Steam libraries and look for the canonical game directory name in steamapps/common.
|
||||
Returns a dict of found games and their paths.
|
||||
Args:
|
||||
game_names: Optional list of game names to check. If None, uses default supported games.
|
||||
Returns:
|
||||
Dict[str, Path]: Mapping of game name to found install Path.
|
||||
"""
|
||||
# Canonical game directory names (allow list for Fallout 3)
|
||||
GAME_DIR_NAMES = {
|
||||
"Skyrim Special Edition": ["Skyrim Special Edition"],
|
||||
"Fallout 4": ["Fallout 4"],
|
||||
"Fallout New Vegas": ["Fallout New Vegas"],
|
||||
"Oblivion": ["Oblivion"],
|
||||
"Fallout 3": ["Fallout 3", "Fallout 3 goty"]
|
||||
}
|
||||
if game_names is None:
|
||||
game_names = list(GAME_DIR_NAMES.keys())
|
||||
all_steam_libraries = PathHandler.get_all_steam_library_paths()
|
||||
logger.info(f"[DEBUG] Detected Steam libraries: {all_steam_libraries}")
|
||||
found_games = {}
|
||||
for game in game_names:
|
||||
possible_names = GAME_DIR_NAMES.get(game, [game])
|
||||
for lib in all_steam_libraries:
|
||||
for name in possible_names:
|
||||
candidate = lib / "steamapps" / "common" / name
|
||||
logger.info(f"[DEBUG] Checking for vanilla game directory: {candidate}")
|
||||
if candidate.is_dir():
|
||||
found_games[game] = candidate
|
||||
logger.info(f"Found vanilla game directory for {game}: {candidate}")
|
||||
break # Stop after first found location
|
||||
if game in found_games:
|
||||
break
|
||||
return found_games
|
||||
|
||||
def _detect_stock_game_path(self):
|
||||
"""Detects common 'Stock Game' or 'Game Root' directories within the modlist path."""
|
||||
self.logger.info("Step 7a: Detecting Stock Game/Game Root directory...")
|
||||
if not self.modlist_dir:
|
||||
self.logger.error("Modlist directory not set, cannot detect stock game path.")
|
||||
return False
|
||||
|
||||
modlist_path = Path(self.modlist_dir)
|
||||
# Always prefer 'Stock Game' if it exists, then fallback to others
|
||||
preferred_order = [
|
||||
"Stock Game",
|
||||
"STOCK GAME",
|
||||
"Skyrim Stock",
|
||||
"Stock Game Folder",
|
||||
"Stock Folder",
|
||||
Path("root/Skyrim Special Edition"),
|
||||
"Game Root" # 'Game Root' is now last
|
||||
]
|
||||
|
||||
found_path = None
|
||||
for name in preferred_order:
|
||||
potential_path = modlist_path / name
|
||||
if potential_path.is_dir():
|
||||
found_path = str(potential_path)
|
||||
self.logger.info(f"Found potential stock game directory: {found_path}")
|
||||
break # Found the first match
|
||||
if found_path:
|
||||
self.stock_game_path = found_path
|
||||
return True
|
||||
else:
|
||||
self.stock_game_path = None
|
||||
self.logger.info("No common Stock Game/Game Root directory found. Will assume vanilla game path is needed for some operations.")
|
||||
return True
|
||||
|
||||
# --- Add robust path formatters for INI fields ---
|
||||
@staticmethod
|
||||
def _format_gamepath_for_mo2(path: str) -> str:
|
||||
import re
|
||||
path = path.replace('/', '\\')
|
||||
path = re.sub(r'\\+', r'\\', path) # Collapse multiple backslashes
|
||||
# Ensure only one double backslash after drive letter
|
||||
path = re.sub(r'^([A-Z]:)\\+', r'\1\\', path)
|
||||
return path
|
||||
|
||||
@staticmethod
|
||||
def _format_binary_for_mo2(path: str) -> str:
|
||||
import re
|
||||
path = path.replace('\\', '/')
|
||||
# Collapse multiple forward slashes after drive letter
|
||||
path = re.sub(r'^([A-Z]:)//+', r'\1/', path)
|
||||
return path
|
||||
|
||||
@staticmethod
|
||||
def _format_workingdir_for_mo2(path: str) -> str:
|
||||
import re
|
||||
path = path.replace('/', '\\')
|
||||
path = path.replace('\\', '\\\\') # Double all backslashes
|
||||
# Ensure only one double backslash after drive letter
|
||||
path = re.sub(r'^([A-Z]:)\\\\+', r'\1\\\\', path)
|
||||
return path
|
||||
|
||||
# --- End of PathHandler ---
|
||||
255
jackify/backend/handlers/progress_aggregator.py
Normal file
@@ -0,0 +1,255 @@
"""
Progress Aggregator

Handles aggregation and cleanup of download progress messages to provide
a cleaner, less disorienting user experience when multiple downloads are running.
"""

import re
import time
from typing import Dict, Optional, List, NamedTuple
from collections import defaultdict, deque
from dataclasses import dataclass


@dataclass
class DownloadProgress:
    """Represents progress for a single download."""
    file_name: str
    current_size: int
    total_size: int
    speed: float
    percentage: float
    last_update: float


class ProgressStats(NamedTuple):
    """Aggregated progress statistics."""
    total_files: int
    completed_files: int
    active_files: int
    total_bytes: int
    downloaded_bytes: int
    overall_percentage: float
    average_speed: float


class ProgressAggregator:
    """
    Aggregates download progress from multiple concurrent downloads and provides
    cleaner progress reporting to avoid UI spam.
    """

    def __init__(self, update_interval: float = 2.0, max_displayed_downloads: int = 3):
        self.update_interval = update_interval
        self.max_displayed_downloads = max_displayed_downloads

        # Track individual download progress
        self._downloads: Dict[str, DownloadProgress] = {}
        self._completed_downloads: set = set()

        # Track overall statistics
        self._last_update_time = 0.0
        self._recent_speeds = deque(maxlen=10)  # For speed averaging

        # Pattern matching for different progress formats
        self._progress_patterns = [
            # Common download progress patterns
            r'(?:Downloading|Download)\s+(.+?):\s*(\d+)%',
            r'(?:Downloading|Download)\s+(.+?)\s+\[([^\]]+)\]',
            r'\[(\d+)/(\d+)\]\s*(.+?)\s*(\d+)%',
            # Extraction progress patterns
            r'(?:Extracting|Extract)\s+(.+?):\s*(\d+)%',
            r'(?:Extracting|Extract)\s+(.+?)\s+\[([^\]]+)\]',
        ]

    def update_progress(self, message: str) -> Optional[str]:
        """
        Update progress with a new message and return aggregated progress if it's time to update.

        Args:
            message: Raw progress message from jackify-engine

        Returns:
            Cleaned progress message if update interval has passed, None otherwise
        """
        current_time = time.time()

        # Parse the progress message
        parsed = self._parse_progress_message(message)
        if parsed:
            self._downloads[parsed.file_name] = parsed

        # Check if it's time for an update
        if current_time - self._last_update_time >= self.update_interval:
            self._last_update_time = current_time
            return self._generate_aggregated_message()

        return None

    def mark_completed(self, file_name: str):
        """Mark a download as completed."""
        self._completed_downloads.add(file_name)
        if file_name in self._downloads:
            del self._downloads[file_name]

    def get_stats(self) -> ProgressStats:
        """Get current aggregated statistics."""
        active_downloads = list(self._downloads.values())

        if not active_downloads:
            return ProgressStats(0, len(self._completed_downloads), 0, 0, 0, 0.0, 0.0)

        total_files = len(active_downloads) + len(self._completed_downloads)
        total_bytes = sum(d.total_size for d in active_downloads)
        downloaded_bytes = sum(d.current_size for d in active_downloads)

        # Calculate overall percentage
        if total_bytes > 0:
            overall_percentage = (downloaded_bytes / total_bytes) * 100
        else:
            overall_percentage = 0.0

        # Calculate average speed
        speeds = [d.speed for d in active_downloads if d.speed > 0]
        average_speed = sum(speeds) / len(speeds) if speeds else 0.0

        return ProgressStats(
            total_files=total_files,
            completed_files=len(self._completed_downloads),
            active_files=len(active_downloads),
            total_bytes=total_bytes,
            downloaded_bytes=downloaded_bytes,
            overall_percentage=overall_percentage,
            average_speed=average_speed
        )

    def _parse_progress_message(self, message: str) -> Optional[DownloadProgress]:
        """Parse a progress message into structured data."""
        # Clean up the message
        clean_message = message.strip()

        # Try each pattern
        for pattern in self._progress_patterns:
            match = re.search(pattern, clean_message, re.IGNORECASE)
            if match:
                try:
                    if len(match.groups()) >= 2:
                        file_name = match.group(1).strip()

                        # Extract percentage or progress info
                        progress_str = match.group(2)

                        # Handle different progress formats
                        if progress_str.endswith('%'):
                            percentage = float(progress_str[:-1])
                            # Estimate size based on percentage (we don't have exact sizes)
                            current_size = int(percentage * 1000)  # Arbitrary scaling
                            total_size = 100000
                            speed = 0.0
                        else:
                            # Try to parse size/speed format like "45.2MB/s"
                            percentage = 0.0
                            current_size = 0
                            total_size = 1
                            speed = self._parse_speed(progress_str)

                        return DownloadProgress(
                            file_name=file_name,
                            current_size=current_size,
                            total_size=total_size,
                            speed=speed,
                            percentage=percentage,
                            last_update=time.time()
                        )
                except (ValueError, IndexError):
                    continue

        return None

    def _parse_speed(self, speed_str: str) -> float:
        """Parse speed string like '45.2MB/s' into bytes per second."""
        try:
            # Remove '/s' suffix
            speed_str = speed_str.replace('/s', '').strip()

            # Extract number and unit
            match = re.match(r'([\d.]+)\s*([KMGT]?B)', speed_str, re.IGNORECASE)
            if not match:
                return 0.0

            value = float(match.group(1))
            unit = match.group(2).upper()

            # Convert to bytes per second
            multipliers = {
                'B': 1,
                'KB': 1024,
                'MB': 1024 * 1024,
                'GB': 1024 * 1024 * 1024,
                'TB': 1024 * 1024 * 1024 * 1024
            }

            return value * multipliers.get(unit, 1)

        except (ValueError, AttributeError):
            return 0.0

    def _generate_aggregated_message(self) -> str:
        """Generate a clean, aggregated progress message."""
        stats = self.get_stats()

        if stats.total_files == 0:
            return "Processing..."

        # Get most recent active downloads to display
        recent_downloads = sorted(
            self._downloads.values(),
            key=lambda d: d.last_update,
            reverse=True
        )[:self.max_displayed_downloads]

        # Build message components
        components = []

        # Overall progress
        if stats.total_files > 1:
            components.append(f"Progress: {stats.completed_files}/{stats.total_files} files")
            if stats.overall_percentage > 0:
                components.append(f"({stats.overall_percentage:.1f}%)")

        # Current active downloads
        if recent_downloads:
            if len(recent_downloads) == 1:
                download = recent_downloads[0]
                if download.percentage > 0:
                    components.append(f"Downloading: {download.file_name} ({download.percentage:.1f}%)")
                else:
                    components.append(f"Downloading: {download.file_name}")
            else:
                components.append(f"Downloading {len(recent_downloads)} files")

        # Speed info
        if stats.average_speed > 0:
            speed_str = self._format_speed(stats.average_speed)
            components.append(f"@ {speed_str}")

        return " - ".join(components) if components else "Processing..."

    def _format_speed(self, speed_bytes: float) -> str:
        """Format speed in bytes/sec to human readable format."""
        if speed_bytes < 1024:
            return f"{speed_bytes:.1f} B/s"
        elif speed_bytes < 1024 * 1024:
            return f"{speed_bytes / 1024:.1f} KB/s"
        elif speed_bytes < 1024 * 1024 * 1024:
            return f"{speed_bytes / (1024 * 1024):.1f} MB/s"
        else:
            return f"{speed_bytes / (1024 * 1024 * 1024):.1f} GB/s"

    def reset(self):
        """Reset all progress tracking."""
        self._downloads.clear()
        self._completed_downloads.clear()
        self._recent_speeds.clear()
        self._last_update_time = 0.0
708
jackify/backend/handlers/protontricks_handler.py
Normal file
@@ -0,0 +1,708 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
Protontricks Handler Module
|
||||
Handles detection and operation of Protontricks
|
||||
"""
|
||||
|
||||
import os
|
||||
import re
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
import shutil
|
||||
import logging
|
||||
from typing import Dict, Optional, List
|
||||
import sys
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class ProtontricksHandler:
|
||||
"""
|
||||
Handles operations related to Protontricks detection and usage
|
||||
"""
|
||||
|
||||
def __init__(self, steamdeck: bool, logger=None):
|
||||
self.logger = logger or logging.getLogger(__name__)
|
||||
self.which_protontricks = None # 'flatpak' or 'native'
|
||||
self.protontricks_version = None
|
||||
self.protontricks_path = None
|
||||
self.steamdeck = steamdeck # Store steamdeck status
|
||||
|
||||
def _get_clean_subprocess_env(self):
|
||||
"""
|
||||
Create a clean environment for subprocess calls by removing PyInstaller-specific
|
||||
environment variables that can interfere with external program execution.
|
||||
|
||||
Returns:
|
||||
dict: Cleaned environment dictionary
|
||||
"""
|
||||
env = os.environ.copy()
|
||||
|
||||
# Remove PyInstaller-specific environment variables
|
||||
env.pop('_MEIPASS', None)
|
||||
env.pop('_MEIPASS2', None)
|
||||
|
||||
# Clean library path variables that PyInstaller modifies (Linux/Unix)
|
||||
if 'LD_LIBRARY_PATH_ORIG' in env:
|
||||
# Restore original LD_LIBRARY_PATH if it was backed up by PyInstaller
|
||||
env['LD_LIBRARY_PATH'] = env['LD_LIBRARY_PATH_ORIG']
|
||||
else:
|
||||
# Remove PyInstaller-modified LD_LIBRARY_PATH
|
||||
env.pop('LD_LIBRARY_PATH', None)
|
||||
|
||||
# Clean PATH of PyInstaller-specific entries
|
||||
if 'PATH' in env and hasattr(sys, '_MEIPASS'):
|
||||
path_entries = env['PATH'].split(os.pathsep)
|
||||
# Remove any PATH entries that point to PyInstaller temp directory
|
||||
cleaned_path = [p for p in path_entries if not p.startswith(sys._MEIPASS)]
|
||||
env['PATH'] = os.pathsep.join(cleaned_path)
|
||||
|
||||
# Clean macOS library path (if present)
|
||||
if 'DYLD_LIBRARY_PATH' in env and hasattr(sys, '_MEIPASS'):
|
||||
dyld_entries = env['DYLD_LIBRARY_PATH'].split(os.pathsep)
|
||||
cleaned_dyld = [p for p in dyld_entries if not p.startswith(sys._MEIPASS)]
|
||||
if cleaned_dyld:
|
||||
env['DYLD_LIBRARY_PATH'] = os.pathsep.join(cleaned_dyld)
|
||||
else:
|
||||
env.pop('DYLD_LIBRARY_PATH', None)
|
||||
|
||||
return env
|
||||
|
||||
def detect_protontricks(self):
|
||||
"""
|
||||
Detect if protontricks is installed and whether it's flatpak or native.
|
||||
If not found, prompts the user to install the Flatpak version.
|
||||
|
||||
Returns True if protontricks is found or successfully installed, False otherwise
|
||||
"""
|
||||
logger.debug("Detecting if protontricks is installed...")
|
||||
|
||||
# Check if protontricks exists as a command
|
||||
protontricks_path_which = shutil.which("protontricks")
|
||||
self.flatpak_path = shutil.which("flatpak") # Store for later use
|
||||
|
||||
if protontricks_path_which:
|
||||
# Check if it's a flatpak wrapper
|
||||
try:
|
||||
with open(protontricks_path_which, 'r') as f:
|
||||
content = f.read()
|
||||
if "flatpak run" in content:
|
||||
logger.debug(f"Detected Protontricks is a Flatpak wrapper at {protontricks_path_which}")
|
||||
self.which_protontricks = 'flatpak'
|
||||
# Continue to check flatpak list just to be sure
|
||||
else:
|
||||
logger.info(f"Native Protontricks found at {protontricks_path_which}")
|
||||
self.which_protontricks = 'native'
|
||||
self.protontricks_path = protontricks_path_which
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Error reading protontricks executable: {e}")
|
||||
|
||||
# Check if flatpak protontricks is installed (or if wrapper check indicated flatpak)
|
||||
flatpak_installed = False
|
||||
try:
|
||||
# PyInstaller fix: Comprehensive environment cleaning for subprocess calls
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
result = subprocess.run(
|
||||
["flatpak", "list"],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
check=True,
|
||||
env=env # Use comprehensively cleaned environment
|
||||
)
|
||||
if "com.github.Matoking.protontricks" in result.stdout:
|
||||
logger.info("Flatpak Protontricks is installed")
|
||||
self.which_protontricks = 'flatpak'
|
||||
flatpak_installed = True
|
||||
return True
|
||||
except FileNotFoundError:
|
||||
logger.warning("'flatpak' command not found. Cannot check for Flatpak Protontricks.")
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.warning(f"Error checking flatpak list: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error checking flatpak: {e}")
|
||||
|
||||
# If neither native nor flatpak found, prompt for installation
|
||||
if not self.which_protontricks:
|
||||
logger.warning("Protontricks not found (native or flatpak).")
|
||||
|
||||
should_install = False
|
||||
if self.steamdeck:
|
||||
logger.info("Running on Steam Deck, attempting automatic Flatpak installation.")
|
||||
# Maybe add a brief pause or message?
|
||||
print("Protontricks not found. Attempting automatic installation via Flatpak...")
|
||||
should_install = True
|
||||
else:
|
||||
try:
|
||||
response = input("Protontricks not found. Install the Flatpak version? (Y/n): ").lower()
|
||||
if response == 'y' or response == '':
|
||||
should_install = True
|
||||
except KeyboardInterrupt:
|
||||
print("\nInstallation cancelled.")
|
||||
return False
|
||||
|
||||
if should_install:
|
||||
try:
|
||||
logger.info("Attempting to install Flatpak Protontricks...")
|
||||
# Use --noninteractive for automatic install where applicable
|
||||
install_cmd = ["flatpak", "install", "-u", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]
|
||||
|
||||
# PyInstaller fix: Comprehensive environment cleaning for subprocess calls
|
||||
env = self._get_clean_subprocess_env()
|
||||
|
||||
# Run with output visible to user
|
||||
process = subprocess.run(install_cmd, check=True, text=True, env=env)
|
||||
logger.info("Flatpak Protontricks installation successful.")
|
||||
print("Flatpak Protontricks installed successfully.")
|
||||
self.which_protontricks = 'flatpak'
|
||||
return True
|
||||
except FileNotFoundError:
|
||||
logger.error("'flatpak' command not found. Cannot install.")
|
||||
print("Error: 'flatpak' command not found. Please install Flatpak first.")
|
||||
return False
|
||||
except subprocess.CalledProcessError as e:
|
||||
logger.error(f"Flatpak installation failed: {e}")
|
||||
print(f"Error: Flatpak installation failed (Command: {' '.join(e.cmd)}). Please try installing manually.")
|
||||
return False
|
||||
except Exception as e:
|
||||
logger.error(f"Unexpected error during Flatpak installation: {e}")
|
||||
print("An unexpected error occurred during installation.")
|
||||
return False
|
||||
else:
|
||||
logger.error("User chose not to install Protontricks or installation skipped.")
|
||||
print("Protontricks installation skipped. Cannot continue without Protontricks.")
|
||||
return False
|
||||
|
||||
# Should not reach here if logic is correct, but acts as a fallback
|
||||
logger.error("Protontricks detection failed unexpectedly.")
|
||||
return False
|
||||
|
||||
    def check_protontricks_version(self):
        """
        Check whether the installed protontricks version is sufficient.

        Returns True if the version is sufficient, False otherwise.
        """
        try:
            if self.which_protontricks == 'flatpak':
                cmd = ["flatpak", "run", "com.github.Matoking.protontricks", "-V"]
            else:
                cmd = ["protontricks", "-V"]

            result = subprocess.run(cmd, capture_output=True, text=True)
            version_str = result.stdout.split(' ')[1].strip('()')

            # Clean the version string down to digits and dots
            cleaned_version = re.sub(r'[^0-9.]', '', version_str)
            self.protontricks_version = cleaned_version

            # Parse version components; 1.12.0 or newer is required
            version_parts = cleaned_version.split('.')
            if len(version_parts) >= 2:
                major, minor = int(version_parts[0]), int(version_parts[1])
                if major < 1 or (major == 1 and minor < 12):
                    logger.error(f"Protontricks version {cleaned_version} is too old. Version 1.12.0 or newer is required.")
                    return False
                return True
            else:
                logger.error(f"Could not parse protontricks version: {cleaned_version}")
                return False

        except Exception as e:
            logger.error(f"Error checking protontricks version: {e}")
            return False

    def run_protontricks(self, *args, **kwargs):
        """
        Run protontricks with the given arguments and keyword arguments.
        kwargs are passed directly to subprocess.run (e.g., stderr=subprocess.DEVNULL).
        Use stdout=subprocess.PIPE, stderr=subprocess.PIPE/DEVNULL instead of capture_output=True.
        Returns a subprocess.CompletedProcess object, or None on failure.
        """
        # Ensure protontricks is detected first
        if self.which_protontricks is None:
            if not self.detect_protontricks():
                logger.error("Could not detect protontricks installation")
                return None

        if self.which_protontricks == 'flatpak':
            cmd = ["flatpak", "run", "com.github.Matoking.protontricks"]
        else:
            cmd = ["protontricks"]

        cmd.extend(args)

        # Default to capturing stdout/stderr unless specified otherwise in kwargs
        run_kwargs = {
            'stdout': subprocess.PIPE,
            'stderr': subprocess.PIPE,
            'text': True,
            **kwargs  # Allow overriding defaults (like stderr=DEVNULL)
        }
        # PyInstaller fix: use cleaned environment for all protontricks calls.
        # Note: this replaces any env passed in via kwargs.
        env = self._get_clean_subprocess_env()
        # Suppress Wine debug output
        env['WINEDEBUG'] = '-all'
        run_kwargs['env'] = env
        try:
            return subprocess.run(cmd, **run_kwargs)
        except Exception as e:
            logger.error(f"Error running protontricks: {e}")
            # TODO: consider returning a mock CompletedProcess with an error code
            return None

    def set_protontricks_permissions(self, modlist_dir, steamdeck=False):
        """
        Set permissions for Protontricks to access the modlist directory.
        Returns True on success, False on failure.
        """
        if self.which_protontricks != 'flatpak':
            logger.debug("Using native protontricks, skipping permission setup")
            return True

        logger.info("Setting Protontricks permissions...")
        try:
            # PyInstaller fix: use cleaned environment
            env = self._get_clean_subprocess_env()

            subprocess.run(["flatpak", "override", "--user", "com.github.Matoking.protontricks",
                            f"--filesystem={modlist_dir}"], check=True, env=env)

            if steamdeck:
                logger.warning("Checking for SD card and setting permissions appropriately...")
                # Find the SD card mount path
                result = subprocess.run(["df", "-h"], capture_output=True, text=True, env=env)
                for line in result.stdout.splitlines():
                    if "/run/media" in line:
                        sdcard_path = line.split()[-1]
                        logger.debug(f"SD card path: {sdcard_path}")
                        subprocess.run(["flatpak", "override", "--user", f"--filesystem={sdcard_path}",
                                        "com.github.Matoking.protontricks"], check=True, env=env)
                # Add the standard Steam Deck SD card path as a fallback
                subprocess.run(["flatpak", "override", "--user", "--filesystem=/run/media/mmcblk0p1",
                                "com.github.Matoking.protontricks"], check=True, env=env)
            logger.debug("Permissions set successfully")
            return True
        except Exception as e:
            logger.error(f"Failed to set Protontricks permissions: {e}")
            return False

    def create_protontricks_alias(self):
        """
        Create aliases for protontricks in ~/.bashrc if using flatpak.
        Returns True if created or already present, False on failure.
        """
        if self.which_protontricks != 'flatpak':
            logger.debug("Not using flatpak, skipping alias creation")
            return True

        try:
            bashrc_path = os.path.expanduser("~/.bashrc")

            # Check that the file exists and read its content
            if os.path.exists(bashrc_path):
                with open(bashrc_path, 'r') as f:
                    content = f.read()

                # Check if the aliases already exist
                protontricks_alias_exists = "alias protontricks=" in content
                launch_alias_exists = "alias protontricks-launch" in content

                # Add any missing aliases
                with open(bashrc_path, 'a') as f:
                    if not protontricks_alias_exists:
                        logger.info("Adding protontricks alias to ~/.bashrc")
                        f.write("\nalias protontricks='flatpak run com.github.Matoking.protontricks'\n")

                    if not launch_alias_exists:
                        logger.info("Adding protontricks-launch alias to ~/.bashrc")
                        f.write("\nalias protontricks-launch='flatpak run --command=protontricks-launch com.github.Matoking.protontricks'\n")

                return True
            else:
                logger.error("~/.bashrc not found, skipping alias creation")
                return False

        except Exception as e:
            logger.error(f"Failed to create protontricks aliases: {e}")
            return False

    # def get_modlists(self):  # Old method, kept commented out for reference
    #     """
    #     Get a list of Skyrim, Fallout, Oblivion modlists from Steam via protontricks
    #     Returns a list of modlist names
    #     """
    #     ... (old implementation with filtering) ...

    # Renamed from list_non_steam_games for clarity and purpose
    def list_non_steam_shortcuts(self) -> Dict[str, str]:
        """List ALL non-Steam shortcuts recognized by Protontricks.

        Runs 'protontricks -l' and parses the output for lines matching
        "Non-Steam shortcut: [Name] ([AppID])".

        Returns:
            A dictionary mapping the shortcut name (AppName) to its AppID.
            Returns an empty dictionary if none are found or an error occurs.
        """
        logger.info("Listing ALL non-Steam shortcuts via protontricks...")
        non_steam_shortcuts = {}
        # --- Ensure protontricks is detected before proceeding ---
        if not self.which_protontricks:
            self.logger.info("Protontricks type/path not yet determined. Running detection...")
            if not self.detect_protontricks():
                self.logger.error("Protontricks detection failed. Cannot list shortcuts.")
                return {}
            self.logger.info(f"Protontricks detection successful: {self.which_protontricks}")
        # --- End detection check ---
        try:
            cmd = []  # Initialize cmd list
            if self.which_protontricks == 'flatpak':
                cmd = ["flatpak", "run", "com.github.Matoking.protontricks", "-l"]
            elif self.protontricks_path:
                cmd = [self.protontricks_path, "-l"]
            else:
                logger.error("Protontricks path not determined, cannot list shortcuts.")
                return {}
            self.logger.debug(f"Running command: {' '.join(cmd)}")
            # PyInstaller fix: use cleaned environment
            env = self._get_clean_subprocess_env()
            result = subprocess.run(cmd, capture_output=True, text=True, check=True, encoding='utf-8', errors='ignore', env=env)
            # Regex to capture the shortcut name and AppID
            pattern = re.compile(r"Non-Steam shortcut:\s+(.+)\s+\((\d+)\)")
            for line in result.stdout.splitlines():
                line = line.strip()
                match = pattern.match(line)
                if match:
                    app_name = match.group(1).strip()  # Get the name
                    app_id = match.group(2).strip()  # Get the AppID
                    non_steam_shortcuts[app_name] = app_id
                    logger.debug(f"Found non-Steam shortcut: '{app_name}' with AppID {app_id}")
            if not non_steam_shortcuts:
                logger.warning("No non-Steam shortcuts found in protontricks output.")
        except FileNotFoundError:
            logger.error(f"Protontricks command not found. Path: {cmd[0] if cmd else 'N/A'}")
            return {}
        except subprocess.CalledProcessError as e:
            # Log the error but don't necessarily stop; we might have partial output
            logger.error(f"Error running protontricks -l (Exit code: {e.returncode}): {e}")
            logger.error(f"Stderr (truncated): {e.stderr[:500] if e.stderr else ''}")
            # Return what we have; it might still be useful
        except Exception as e:
            logger.error(f"Unexpected error listing non-Steam shortcuts: {e}", exc_info=True)
            return {}
        return non_steam_shortcuts

    def enable_dotfiles(self, appid):
        """
        Enable visibility of (.)dot files in the Wine prefix.

        Args:
            appid (str): The app ID to use

        Returns:
            bool: True on success, False on failure
        """
        logger.debug(f"APPID={appid}")
        logger.info("Enabling visibility of (.)dot files...")

        try:
            # Check the current setting
            result = self.run_protontricks(
                "-c", "WINEDEBUG=-all wine reg query \"HKEY_CURRENT_USER\\Software\\Wine\" /v ShowDotFiles",
                appid,
                stderr=subprocess.DEVNULL  # Suppress stderr for this query
            )

            # Check whether the initial query ran successfully and contained the expected output
            if result and result.returncode == 0 and "ShowDotFiles" in result.stdout and "Y" in result.stdout:
                logger.info("DotFiles already enabled via registry... skipping")
                return True
            elif result and result.returncode != 0:
                # Log at info level, since a non-zero exit is expected if the key doesn't exist yet
                logger.info(f"Initial query for ShowDotFiles likely failed because the key doesn't exist yet (Exit Code: {result.returncode}). Proceeding to set it. Stderr: {result.stderr}")
            elif not result:
                logger.error("Failed to execute initial dotfile query command.")
                # Proceed cautiously

            # --- Try to set the value ---
            dotfiles_set_success = False

            # Method 1: Set the registry key (primary method)
            logger.debug("Attempting to set ShowDotFiles registry key...")
            result_add = self.run_protontricks(
                "-c", "WINEDEBUG=-all wine reg add \"HKEY_CURRENT_USER\\Software\\Wine\" /v ShowDotFiles /d Y /f",
                appid,
                # Keep stderr for this one to log potential errors from reg add
            )
            if result_add and result_add.returncode == 0:
                logger.info("'wine reg add' command executed successfully.")
                dotfiles_set_success = True  # Tentative success
            elif result_add:
                logger.warning(f"'wine reg add' command failed (Exit Code: {result_add.returncode}). Stderr: {result_add.stderr}")
            else:
                logger.error("Failed to execute 'wine reg add' command.")

            # Method 2: Create a user.reg entry (backup method)
            # This is useful if registry commands fail but direct file access works
            logger.debug("Ensuring user.reg has the correct entry...")
            prefix_path = self.get_wine_prefix_path(appid)
            if prefix_path:
                user_reg_path = Path(prefix_path) / "user.reg"
                try:
                    if user_reg_path.exists():
                        content = user_reg_path.read_text(encoding='utf-8', errors='ignore')
                        if "ShowDotFiles" not in content:
                            logger.debug(f"Adding ShowDotFiles entry to {user_reg_path}")
                            with open(user_reg_path, 'a', encoding='utf-8') as f:
                                f.write('\n[Software\\Wine] 1603891765\n')
                                f.write('"ShowDotFiles"="Y"\n')
                            dotfiles_set_success = True  # Count the file write as success too
                        else:
                            logger.debug("ShowDotFiles already present in user.reg")
                            dotfiles_set_success = True  # Already there counts as success
                    else:
                        logger.warning(f"user.reg not found at {user_reg_path}, creating it.")
                        with open(user_reg_path, 'w', encoding='utf-8') as f:
                            f.write('[Software\\Wine] 1603891765\n')
                            f.write('"ShowDotFiles"="Y"\n')
                        dotfiles_set_success = True  # Creating the file counts as success
                except Exception as e:
                    logger.warning(f"Error reading/writing user.reg: {e}")
            else:
                logger.warning("Could not get WINEPREFIX path, skipping user.reg modification.")

            # --- Verification step ---
            logger.debug("Verifying dotfile setting after attempts...")
            verify_result = self.run_protontricks(
                "-c", "WINEDEBUG=-all wine reg query \"HKEY_CURRENT_USER\\Software\\Wine\" /v ShowDotFiles",
                appid,
                stderr=subprocess.DEVNULL  # Suppress stderr for the verification query
            )

            query_verified = False
            if verify_result and verify_result.returncode == 0 and "ShowDotFiles" in verify_result.stdout and "Y" in verify_result.stdout:
                logger.debug("Verification query successful and key is set.")
                query_verified = True
            elif verify_result:
                # Info level only - verification failing right after setting the key is common
                logger.info(f"Verification query failed or key not found (Exit Code: {verify_result.returncode}). Stderr: {verify_result.stderr}")
            else:
                logger.error("Failed to execute verification query command.")

            # --- Final decision ---
            if dotfiles_set_success:
                # If the add command or file write succeeded, report overall success,
                # even if the verification query failed, and log the query status.
                if query_verified:
                    logger.info("Dotfiles enabled and verified successfully!")
                else:
                    logger.info("Dotfiles potentially enabled (reg add/user.reg succeeded), but verification query failed.")
                return True  # Report success based on the setting action
            else:
                # Both the reg add and user.reg steps failed
                logger.error("Failed to enable dotfiles using registry and user.reg methods.")
                return False

        except Exception as e:
            logger.error(f"Unexpected error enabling dotfiles: {e}", exc_info=True)
            return False

    def set_win10_prefix(self, appid):
        """
        Set the Windows 10 version in the Proton prefix.
        Returns True on success, False on failure.
        """
        try:
            # PyInstaller fix: use cleaned environment
            env = self._get_clean_subprocess_env()
            env["WINEDEBUG"] = "-all"

            if self.which_protontricks == 'flatpak':
                cmd = ["flatpak", "run", "com.github.Matoking.protontricks", "--no-bwrap", appid, "win10"]
            else:
                cmd = ["protontricks", "--no-bwrap", appid, "win10"]

            subprocess.run(cmd, env=env, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
            return True
        except Exception as e:
            logger.error(f"Error setting Windows 10 prefix: {e}")
            return False

    def protontricks_alias(self):
        """
        Create protontricks aliases in ~/.bashrc.
        Returns True on success (or when not applicable), False on failure.
        """
        logger.info("Creating protontricks alias in ~/.bashrc...")

        try:
            if self.which_protontricks == 'flatpak':
                # Check if the aliases already exist
                bashrc_path = os.path.expanduser("~/.bashrc")
                protontricks_alias_exists = False
                launch_alias_exists = False

                if os.path.exists(bashrc_path):
                    with open(bashrc_path, 'r') as f:
                        content = f.read()
                        protontricks_alias_exists = "alias protontricks='flatpak run com.github.Matoking.protontricks'" in content
                        launch_alias_exists = "alias protontricks-launch='flatpak run --command=protontricks-launch com.github.Matoking.protontricks'" in content

                # Add the aliases if they don't exist
                with open(bashrc_path, 'a') as f:
                    if not protontricks_alias_exists:
                        f.write("\n# Jackify: Protontricks alias\n")
                        f.write("alias protontricks='flatpak run com.github.Matoking.protontricks'\n")
                        logger.debug("Added protontricks alias to ~/.bashrc")

                    if not launch_alias_exists:
                        f.write("\n# Jackify: Protontricks-launch alias\n")
                        f.write("alias protontricks-launch='flatpak run --command=protontricks-launch com.github.Matoking.protontricks'\n")
                        logger.debug("Added protontricks-launch alias to ~/.bashrc")

                logger.info("Protontricks aliases created successfully")
                return True
            else:
                logger.info("Protontricks is not installed via flatpak, skipping alias creation")
                return True
        except Exception as e:
            logger.error(f"Error creating protontricks alias: {e}")
            return False

    def get_wine_prefix_path(self, appid) -> Optional[str]:
        """Get the WINEPREFIX path for a given AppID.

        Args:
            appid (str): The Steam AppID.

        Returns:
            The WINEPREFIX path as a string, or None if detection fails.
        """
        logger.debug(f"Getting WINEPREFIX for AppID {appid}")
        result = self.run_protontricks("-c", "echo $WINEPREFIX", appid)
        if result and result.returncode == 0 and result.stdout.strip():
            prefix_path = result.stdout.strip()
            logger.debug(f"Detected WINEPREFIX: {prefix_path}")
            return prefix_path
        else:
            logger.error(f"Failed to get WINEPREFIX for AppID {appid}. Stderr: {result.stderr if result else 'N/A'}")
            return None

    def run_protontricks_launch(self, appid, installer_path, *extra_args):
        """
        Run protontricks-launch (for WebView or similar installers) using the correct
        method for flatpak or native installs.
        Returns a subprocess.CompletedProcess object, or None on failure.
        """
        if self.which_protontricks is None:
            if not self.detect_protontricks():
                self.logger.error("Could not detect protontricks installation")
                return None
        if self.which_protontricks == 'flatpak':
            cmd = ["flatpak", "run", "--command=protontricks-launch", "com.github.Matoking.protontricks", "--appid", appid, str(installer_path)]
        else:
            launch_path = shutil.which("protontricks-launch")
            if not launch_path:
                self.logger.error("protontricks-launch command not found in PATH.")
                return None
            cmd = [launch_path, "--appid", appid, str(installer_path)]
        if extra_args:
            cmd.extend(extra_args)
        self.logger.debug(f"Running protontricks-launch: {' '.join(map(str, cmd))}")
        try:
            # PyInstaller fix: use cleaned environment
            env = self._get_clean_subprocess_env()
            return subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True, env=env)
        except Exception as e:
            self.logger.error(f"Error running protontricks-launch: {e}")
            return None

    def install_wine_components(self, appid, game_var, specific_components: Optional[List[str]] = None):
        """
        Install the specified Wine components into the given prefix using protontricks.
        If specific_components is None, use the default set
        (fontsmooth=rgb, xact, xact_x64, vcrun2022).
        """
        env = self._get_clean_subprocess_env()
        env["WINEDEBUG"] = "-all"
        if specific_components is not None:
            components_to_install = specific_components
            self.logger.info(f"Installing specific components: {components_to_install}")
        else:
            components_to_install = ["fontsmooth=rgb", "xact", "xact_x64", "vcrun2022"]
            self.logger.info(f"Installing default components: {components_to_install}")
        if not components_to_install:
            self.logger.info("No Wine components to install.")
            return True
        self.logger.info(f"AppID: {appid}, Game: {game_var}, Components: {components_to_install}")
        # print(f"\n[Jackify] Installing Wine components for AppID {appid} ({game_var}):\n  {', '.join(components_to_install)}\n")  # Suppressed per user request
        max_attempts = 3
        for attempt in range(1, max_attempts + 1):
            if attempt > 1:
                self.logger.warning(f"Retrying component installation (attempt {attempt}/{max_attempts})...")
                self._cleanup_wine_processes()
            try:
                result = self.run_protontricks("--no-bwrap", appid, "-q", *components_to_install, env=env, timeout=600)
                self.logger.debug(f"Protontricks output: {result.stdout if result else ''}")
                if result and result.returncode == 0:
                    self.logger.info("Wine component installation command completed successfully.")
                    return True
                else:
                    self.logger.error(f"Protontricks command failed (Attempt {attempt}/{max_attempts}). Return Code: {result.returncode if result else 'N/A'}")
                    self.logger.error(f"Stdout: {result.stdout.strip() if result else ''}")
                    self.logger.error(f"Stderr: {result.stderr.strip() if result else ''}")
            except Exception as e:
                self.logger.error(f"Error during protontricks run (Attempt {attempt}/{max_attempts}): {e}", exc_info=True)
        self.logger.error(f"Failed to install Wine components after {max_attempts} attempts.")
        return False

    def _cleanup_wine_processes(self):
        """
        Internal method to clean up stray Wine processes during component installation.
        """
        try:
            subprocess.run("pgrep -f 'win7|win10|ShowDotFiles|protontricks' | xargs -r kill -9",
                           shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
            subprocess.run("pkill -9 winetricks",
                           shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        except Exception as e:
            logger.error(f"Error cleaning up wine processes: {e}")

    def check_and_setup_protontricks(self) -> bool:
        """
        Run all necessary checks and setup steps for Protontricks:
        - Detect it (and prompt for install if missing)
        - Check its version
        - Create aliases if using Flatpak

        Returns:
            bool: True if Protontricks is ready to use, False otherwise.
        """
        logger.info("Checking and setting up Protontricks...")

        logger.info("Checking Protontricks installation...")
        if not self.detect_protontricks():
            # Error message already printed by detect_protontricks if install fails/is skipped
            return False
        logger.info(f"Protontricks detected: {self.which_protontricks}")

        logger.info("Checking Protontricks version...")
        if not self.check_protontricks_version():
            # Error message already logged by check_protontricks_version
            print(f"Error: Protontricks version {self.protontricks_version} is too old or could not be checked.")
            return False
        logger.info(f"Protontricks version {self.protontricks_version} is sufficient.")

        # Aliases are non-critical; warn but don't fail if creation fails
        if self.which_protontricks == 'flatpak':
            logger.info("Ensuring Flatpak aliases exist in ~/.bashrc...")
            if not self.protontricks_alias():
                print("Warning: Failed to create/verify protontricks aliases in ~/.bashrc")
                # Don't fail the whole setup for this

        logger.info("Protontricks check and setup completed successfully.")
        return True
503 jackify/backend/handlers/resolution_handler.py Normal file
@@ -0,0 +1,503 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Resolution Handler Module
Handles setting the display resolution in various INI files
"""

import os
import re
import glob
import logging
import subprocess
from pathlib import Path
from typing import Optional, List, Dict
# Import colors from the new central location
from .ui_colors import COLOR_PROMPT, COLOR_RESET, COLOR_ERROR, COLOR_INFO

# Initialize logger
logger = logging.getLogger(__name__)

class ResolutionHandler:
    """
    Handles resolution selection and configuration for games
    """

    def __init__(self, modlist_dir=None, game_var=None, resolution=None):
        self.modlist_dir = modlist_dir
        self.game_var = game_var  # Short name (e.g., "Skyrim")
        self.game_var_full = None  # Full name (e.g., "Skyrim Special Edition")
        self.resolution = resolution
        self.logger = logging.getLogger(__name__)

        # Set the full game name based on the short version
        if self.game_var:
            game_lookup = {
                "Skyrim": "Skyrim Special Edition",
                "Fallout": "Fallout 4",
                "Fallout 4": "Fallout 4",
                "Fallout New Vegas": "Fallout New Vegas",
                "FNV": "Fallout New Vegas",
                "Oblivion": "Oblivion"
            }
            self.game_var_full = game_lookup.get(self.game_var, self.game_var)

    def set_resolution(self, resolution):
        """
        Set the target resolution, e.g. "1280x800".
        """
        self.resolution = resolution
        logger.debug(f"Resolution set to: {self.resolution}")
        return True

    def get_resolution_components(self):
        """
        Split the resolution into width and height components.
        """
        if not self.resolution:
            logger.error("Resolution not set")
            return None, None

        try:
            width, height = self.resolution.split('x')
            return width, height
        except ValueError:
            logger.error(f"Invalid resolution format: {self.resolution}")
            return None, None

    def detect_steamdeck_resolution(self):
        """
        Set the resolution to Steam Deck native if running on a Steam Deck.
        """
        try:
            if os.path.exists("/etc/os-release"):
                with open("/etc/os-release", "r") as f:
                    if "steamdeck" in f.read():
                        self.resolution = "1280x800"
                        logger.debug("Steam Deck detected, setting resolution to 1280x800")
                        return True

            return False
        except Exception as e:
            logger.error(f"Error detecting Steam Deck resolution: {e}")
            return False

    def select_resolution(self, steamdeck=False) -> Optional[str]:
        """
        Ask the user whether they want to set the resolution, then prompt and validate.
        Returns the selected resolution string (e.g., "1920x1080") or None if skipped/cancelled.
        """
        if steamdeck:
            logger.info("Steam Deck detected - Setting resolution to 1280x800")
            return "1280x800"

        # Ask the user whether they want to set the resolution
        response = input(f"{COLOR_PROMPT}Do you wish to set the display resolution now? (y/N): {COLOR_RESET}").lower()

        if response == 'y':
            while True:
                user_res = input(f"{COLOR_PROMPT}Enter desired resolution (e.g., 1920x1080): {COLOR_RESET}").strip()
                if self._validate_resolution_format(user_res):
                    # Optional: add a confirmation step here if desired
                    # confirm = input(f"{COLOR_PROMPT}Use resolution {user_res}? (Y/n): {COLOR_RESET}").lower()
                    # if confirm != 'n':
                    #     return user_res
                    return user_res  # Return the validated resolution
                else:
                    print(f"{COLOR_ERROR}Invalid format. Please use format WxH (e.g., 1920x1080){COLOR_RESET}")
        else:
            self.logger.info("Resolution setup skipped by user.")
            return None

    def _validate_resolution_format(self, resolution: str) -> bool:
        """Validate the resolution format WxH (e.g., 1920x1080)."""
        if not resolution:
            return False
        # Match one or more digits, 'x', one or more digits
        if re.match(r"^[0-9]+x[0-9]+$", resolution):
            self.logger.debug(f"Resolution format validated: {resolution}")
            return True
        else:
            self.logger.warning(f"Invalid resolution format provided: {resolution}")
            return False

    @staticmethod
    def get_available_resolutions() -> List[str]:
        """Get available display resolutions using xrandr."""
        resolutions = []
        try:
            result = subprocess.run(["xrandr"], capture_output=True, text=True, check=True)
            # Regex to find lines like '   1920x1080 59.96*+'
            matches = re.finditer(r"^\s*(\d+x\d+)\s", result.stdout, re.MULTILINE)
            for match in matches:
                res = match.group(1)
                if res not in resolutions:
                    resolutions.append(res)
            # Add common resolutions in case xrandr doesn't list them
            common_res = ["1280x720", "1280x800", "1920x1080", "1920x1200", "2560x1440"]
            for res in common_res:
                if res not in resolutions:
                    resolutions.append(res)
            resolutions.sort(key=lambda r: tuple(map(int, r.split('x'))))
            logger.debug(f"Detected resolutions: {resolutions}")
            return resolutions
        except Exception as e:  # Covers FileNotFoundError and CalledProcessError too
            logger.warning(f"Could not detect resolutions via xrandr: {e}. Falling back to common list.")
            # Fall back to a common list if xrandr is not available or fails
            return ["1280x720", "1280x800", "1920x1080", "1920x1200", "2560x1440"]

    @staticmethod
    def update_ini_resolution(modlist_dir: str, game_var: str, set_res: str) -> bool:
        """
        Update the resolution in the relevant INI files for the specified game.

        Args:
            modlist_dir (str): Path to the modlist directory.
            game_var (str): The game identifier (e.g., "Skyrim Special Edition", "Fallout 4").
            set_res (str): The desired resolution (e.g., "1920x1080").

        Returns:
            bool: True if successful or not applicable, False on error.
        """
        logger.info(f"Attempting to set resolution to {set_res} for {game_var} in {modlist_dir}")

        try:
            isize_w, isize_h = set_res.split('x')
            modlist_path = Path(modlist_dir)
            success_count = 0
            files_processed = 0

            # 1. Handle SSEDisplayTweaks.ini (Skyrim SE only)
            if game_var == "Skyrim Special Edition":
                logger.debug("Processing SSEDisplayTweaks.ini...")
                sse_tweaks_files = list(modlist_path.rglob("SSEDisplayTweaks.ini"))
                if sse_tweaks_files:
                    for ini_file in sse_tweaks_files:
                        files_processed += 1
                        logger.debug(f"Updating {ini_file}")
                        if ResolutionHandler._modify_sse_tweaks(ini_file, set_res):
                            success_count += 1
                else:
                    logger.debug("No SSEDisplayTweaks.ini found, skipping.")

            # 1.5. Handle HighFPSPhysicsFix.ini (Fallout 4 only)
            elif game_var == "Fallout 4":
                logger.debug("Processing HighFPSPhysicsFix.ini...")
                highfps_files = list(modlist_path.rglob("HighFPSPhysicsFix.ini"))
                if highfps_files:
                    for ini_file in highfps_files:
                        files_processed += 1
                        logger.debug(f"Updating {ini_file}")
                        if ResolutionHandler._modify_highfps_physics_fix(ini_file, set_res):
                            success_count += 1
                else:
                    logger.debug("No HighFPSPhysicsFix.ini found, skipping.")

            # 2. Handle game-specific Prefs/INI files
            prefs_filenames = []
            if game_var == "Skyrim Special Edition":
                prefs_filenames = ["skyrimprefs.ini"]
            elif game_var == "Fallout 4":
                prefs_filenames = ["Fallout4Prefs.ini"]
            elif game_var == "Fallout New Vegas":
                prefs_filenames = ["falloutprefs.ini"]
            elif game_var == "Oblivion":
                prefs_filenames = ["Oblivion.ini"]
            else:
                logger.warning(f"Resolution setting not implemented for game: {game_var}")
                return True  # Not an error, just not applicable

            logger.debug(f"Processing {prefs_filenames}...")
            prefs_files_found = []
            # Search common locations: profiles/ and stock game directories
            search_dirs = [modlist_path / "profiles"]
            # Add potential stock game directories dynamically (case-insensitive)
            potential_stock_dirs = [d for d in modlist_path.iterdir() if d.is_dir() and
                                    d.name.lower() in ["stock game", "game root", "stock folder", "skyrim stock"]]  # Add more if needed
            search_dirs.extend(potential_stock_dirs)

            for search_dir in search_dirs:
                if search_dir.is_dir():
                    for fname in prefs_filenames:
                        prefs_files_found.extend(list(search_dir.rglob(fname)))

            if not prefs_files_found:
                logger.warning(f"No preference files ({prefs_filenames}) found in standard locations ({search_dirs}). A manual INI edit might be needed.")
                # Treat as success, since the main operation didn't fail
                return True

            for ini_file in prefs_files_found:
                files_processed += 1
                logger.debug(f"Updating {ini_file}")
                if ResolutionHandler._modify_prefs_resolution(ini_file, isize_w, isize_h, game_var == "Oblivion"):
                    success_count += 1

            logger.info(f"Resolution update: processed {files_processed} files, {success_count} successfully updated.")
            # Return True even if some updates failed, as the overall process didn't halt
            return True

        except ValueError:
            logger.error(f"Invalid resolution format: {set_res}. Expected WxH (e.g., 1920x1080).")
            return False
        except Exception as e:
            logger.error(f"Error updating INI resolutions: {e}", exc_info=True)
            return False

    @staticmethod
    def _modify_sse_tweaks(ini_path: Path, resolution: str) -> bool:
        """Helper to modify SSEDisplayTweaks.ini"""
        try:
            with open(ini_path, 'r') as f:
                lines = f.readlines()

            new_lines = []
            modified = False
            for line in lines:
                stripped_line = line.strip()
                # Use regex for flexibility with spacing and comments
                if re.match(r'^\s*(#?)\s*Resolution\s*=.*$', stripped_line, re.IGNORECASE):
                    new_lines.append(f"Resolution={resolution}\n")
                    modified = True
                elif re.match(r'^\s*(#?)\s*Fullscreen\s*=.*$', stripped_line, re.IGNORECASE):
                    new_lines.append("Fullscreen=false\n")
                    modified = True
                elif re.match(r'^\s*(#?)\s*Borderless\s*=.*$', stripped_line, re.IGNORECASE):
                    new_lines.append("Borderless=true\n")
                    modified = True
                else:
                    new_lines.append(line)

            if modified:
                with open(ini_path, 'w') as f:
                    f.writelines(new_lines)
                logger.debug(f"Successfully modified {ini_path} for SSEDisplayTweaks")
            return True
        except Exception as e:
            logger.error(f"Failed to modify {ini_path}: {e}")
            return False

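The regex-based rewrite in `_modify_sse_tweaks` can be exercised in isolation. This is a minimal sketch, not project code; the helper name `apply_sse_tweaks` is made up, and it applies the same three substitutions to an in-memory string instead of a file:

```python
import re

def apply_sse_tweaks(text: str, resolution: str) -> str:
    """Rewrite Resolution/Fullscreen/Borderless lines, commented-out or not."""
    out = []
    for line in text.splitlines(keepends=True):
        stripped = line.strip()
        if re.match(r'^\s*(#?)\s*Resolution\s*=.*$', stripped, re.IGNORECASE):
            out.append(f"Resolution={resolution}\n")
        elif re.match(r'^\s*(#?)\s*Fullscreen\s*=.*$', stripped, re.IGNORECASE):
            out.append("Fullscreen=false\n")
        elif re.match(r'^\s*(#?)\s*Borderless\s*=.*$', stripped, re.IGNORECASE):
            out.append("Borderless=true\n")
        else:
            out.append(line)
    return "".join(out)

# A commented-out "#Resolution=" default is replaced with an active setting.
result = apply_sse_tweaks("#Resolution=1920x1080\nFullscreen=true\n", "2560x1440")
```

Matching on `(#?)` is what lets the handler activate settings that ship commented out by default.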
    @staticmethod
    def _modify_highfps_physics_fix(ini_path: Path, resolution: str) -> bool:
        """Helper to modify HighFPSPhysicsFix.ini for Fallout 4"""
        try:
            with open(ini_path, 'r') as f:
                lines = f.readlines()

            new_lines = []
            modified = False
            for line in lines:
                stripped_line = line.strip()
                # Look for Resolution line (commonly commented out by default)
                if re.match(r'^\s*(#?)\s*Resolution\s*=.*$', stripped_line, re.IGNORECASE):
                    new_lines.append(f"Resolution={resolution}\n")
                    modified = True
                else:
                    new_lines.append(line)

            if modified:
                with open(ini_path, 'w') as f:
                    f.writelines(new_lines)
                logger.debug(f"Successfully modified {ini_path} for HighFPSPhysicsFix")
            return True
        except Exception as e:
            logger.error(f"Failed to modify {ini_path}: {e}")
            return False

    @staticmethod
    def _modify_prefs_resolution(ini_path: Path, width: str, height: str, is_oblivion: bool) -> bool:
        """Helper to modify resolution in skyrimprefs.ini, Fallout4Prefs.ini, etc."""
        try:
            with open(ini_path, 'r') as f:
                lines = f.readlines()

            new_lines = []
            modified = False
            # Prepare the replacement strings for width and height,
            # using the correct spacing for Oblivion vs other games
            equals_operator = "=" if is_oblivion else " = "
            width_replace = f"iSize W{equals_operator}{width}\n"
            height_replace = f"iSize H{equals_operator}{height}\n"

            for line in lines:
                stripped_line = line.strip()
                # Match the setting name at the start of the line ("iSize W = 1920"),
                # not the end
                if stripped_line.lower().startswith("isize w"):
                    new_lines.append(width_replace)
                    modified = True
                elif stripped_line.lower().startswith("isize h"):
                    new_lines.append(height_replace)
                    modified = True
                else:
                    new_lines.append(line)

            if modified:
                with open(ini_path, 'w') as f:
                    f.writelines(new_lines)
                logger.debug(f"Successfully modified {ini_path} for resolution")
            return True
        except Exception as e:
            logger.error(f"Failed to modify {ini_path}: {e}")
            return False

    def edit_resolution(self, modlist_dir, game_var, selected_resolution=None):
        """
        Edit resolution in INI files
        """
        if selected_resolution:
            logger.debug(f"Applying resolution: {selected_resolution}")
            return self.update_ini_resolution(modlist_dir, game_var, selected_resolution)
        else:
            logger.debug("Resolution setup skipped")
            return True

    def update_sse_display_tweaks(self):
        """
        Update SSEDisplayTweaks.ini with the chosen resolution.
        Returns True on success, False on failure.
        """
        if not self.modlist_dir or not self.game_var or not self.resolution:
            logger.error("Missing required parameters")
            return False

        if self.game_var != "Skyrim Special Edition":
            logger.debug("Not Skyrim, skipping SSEDisplayTweaks")
            return False

        try:
            # Find all SSEDisplayTweaks.ini files
            ini_files = glob.glob(f"{self.modlist_dir}/**/SSEDisplayTweaks.ini", recursive=True)

            if not ini_files:
                logger.debug("No SSEDisplayTweaks.ini files found")
                return False

            for ini_file in ini_files:
                # Read the file
                with open(ini_file, 'r', encoding='utf-8', errors='ignore') as f:
                    content = f.readlines()

                # Process and modify the content
                modified_content = []
                for line in content:
                    if line.strip().startswith("Resolution=") or line.strip().startswith("#Resolution="):
                        modified_content.append(f"Resolution={self.resolution}\n")
                    elif line.strip().startswith("Fullscreen=") or line.strip().startswith("#Fullscreen="):
                        modified_content.append("Fullscreen=false\n")
                    elif line.strip().startswith("Borderless=") or line.strip().startswith("#Borderless="):
                        modified_content.append("Borderless=true\n")
                    else:
                        modified_content.append(line)

                # Write the modified content back
                with open(ini_file, 'w', encoding='utf-8') as f:
                    f.writelines(modified_content)

                logger.debug(f"Updated {ini_file} with Resolution={self.resolution}, Fullscreen=false, Borderless=true")

            return True

        except Exception as e:
            logger.error(f"Error updating SSEDisplayTweaks.ini: {e}")
            return False

    def update_game_prefs_ini(self):
        """
        Update game preference INI files with the chosen resolution.
        Returns True on success, False on failure.
        """
        if not self.modlist_dir or not self.game_var or not self.resolution:
            logger.error("Missing required parameters")
            return False

        try:
            # Get resolution components
            width, height = self.get_resolution_components()
            if not width or not height:
                return False

            # Define possible stock game folders to search
            stock_folders = [
                "profiles", "Stock Game", "Game Root", "STOCK GAME",
                "Stock Game Folder", "Stock Folder", "Skyrim Stock"
            ]

            # Define the appropriate INI file based on game type
            if self.game_var == "Skyrim Special Edition":
                ini_filename = "skyrimprefs.ini"
            elif self.game_var == "Fallout 4":
                ini_filename = "Fallout4Prefs.ini"
            elif self.game_var == "Fallout New Vegas":
                ini_filename = "falloutprefs.ini"
            elif self.game_var == "Oblivion":
                ini_filename = "Oblivion.ini"
            else:
                logger.error(f"Unsupported game: {self.game_var}")
                return False

            # Search for INI files in the appropriate directories
            ini_files = []
            for folder in stock_folders:
                path_pattern = os.path.join(self.modlist_dir, folder, f"**/{ini_filename}")
                ini_files.extend(glob.glob(path_pattern, recursive=True))

            if not ini_files:
                logger.warning(f"No {ini_filename} files found in specified directories")
                return False

            for ini_file in ini_files:
                # Read the file
                with open(ini_file, 'r', encoding='utf-8', errors='ignore') as f:
                    content = f.readlines()

                # Process and modify the content
                modified_content = []
                for line in content:
                    line_lower = line.lower()
                    if "isize w" in line_lower:
                        # Preserve the file's existing format ("iSize W=..." vs "iSize W = ...")
                        if "=" in line and " = " not in line:
                            modified_content.append(f"iSize W={width}\n")
                        else:
                            modified_content.append(f"iSize W = {width}\n")
                    elif "isize h" in line_lower:
                        if "=" in line and " = " not in line:
                            modified_content.append(f"iSize H={height}\n")
                        else:
                            modified_content.append(f"iSize H = {height}\n")
                    else:
                        modified_content.append(line)

                # Write the modified content back
                with open(ini_file, 'w', encoding='utf-8') as f:
                    f.writelines(modified_content)

                logger.debug(f"Updated {ini_file} with iSize W={width}, iSize H={height}")

            return True

        except Exception as e:
            logger.error(f"Error updating game prefs INI: {e}")
            return False

    def update_all_resolution_settings(self):
        """
        Update all resolution-related settings in all relevant INI files.
        Returns True if any files were updated, False if none were updated.
        """
        if not self.resolution:
            logger.error("Resolution not set")
            return False

        # Update SSEDisplayTweaks.ini if applicable
        sse_success = self.update_sse_display_tweaks()

        # Update game preferences INI
        prefs_success = self.update_game_prefs_ini()

        return sse_success or prefs_success
141
jackify/backend/handlers/self_update.py
Normal file
@@ -0,0 +1,141 @@
import os
import sys
import json
import requests
import shutil
import tempfile
import time
from pathlib import Path

GITHUB_OWNER = "Omni-guides"
GITHUB_REPO = "Jackify"
ASSET_NAME = "jackify"
CONFIG_DIR = os.path.expanduser("~/.config/jackify")
TOKEN_PATH = os.path.join(CONFIG_DIR, "github_token")
LAST_CHECK_PATH = os.path.join(CONFIG_DIR, "last_update_check.json")

THROTTLE_HOURS = 6

def get_github_token():
    if os.path.exists(TOKEN_PATH):
        with open(TOKEN_PATH, "r") as f:
            return f.read().strip()
    return None

def get_latest_release_info():
    url = f"https://api.github.com/repos/{GITHUB_OWNER}/{GITHUB_REPO}/releases/latest"
    headers = {}
    token = get_github_token()
    if token:
        headers["Authorization"] = f"token {token}"
    resp = requests.get(url, headers=headers, verify=True)
    if resp.status_code == 200:
        return resp.json()
    else:
        raise RuntimeError(f"Failed to fetch release info: {resp.status_code} {resp.text}")

def get_current_version():
    # This should match however Jackify stores its version
    try:
        from src import version
        return version.__version__
    except ImportError:
        return None

def should_check_for_update():
    try:
        if os.path.exists(LAST_CHECK_PATH):
            with open(LAST_CHECK_PATH, "r") as f:
                data = json.load(f)
            last_check = data.get("last_check", 0)
            now = int(time.time())
            if now - last_check < THROTTLE_HOURS * 3600:
                return False
        return True
    except Exception as e:
        print(f"[WARN] Could not read last update check timestamp: {e}")
        return True

def record_update_check():
    try:
        with open(LAST_CHECK_PATH, "w") as f:
            json.dump({"last_check": int(time.time())}, f)
    except Exception as e:
        print(f"[WARN] Could not write last update check timestamp: {e}")

def check_for_update():
    if not should_check_for_update():
        return False, None, None
    try:
        release = get_latest_release_info()
        latest_version = release["tag_name"].lstrip("v")
        current_version = get_current_version()
        if current_version is None:
            print("[WARN] Could not determine current version.")
            record_update_check()
            return False, None, None
        if latest_version > current_version:
            record_update_check()
            return True, latest_version, release
        record_update_check()
        return False, latest_version, release
    except Exception as e:
        print(f"[ERROR] Update check failed: {e}")
        record_update_check()
        return False, None, None

def download_latest_asset(release):
    token = get_github_token()
    headers = {"Accept": "application/octet-stream"}
    if token:
        headers["Authorization"] = f"token {token}"
    for asset in release["assets"]:
        if asset["name"] == ASSET_NAME:
            download_url = asset["url"]
            resp = requests.get(download_url, headers=headers, stream=True, verify=True)
            if resp.status_code == 200:
                return resp.content
            else:
                raise RuntimeError(f"Failed to download asset: {resp.status_code} {resp.text}")
    raise RuntimeError(f"Asset '{ASSET_NAME}' not found in release.")

def replace_current_binary(new_binary_bytes):
    current_exe = os.path.realpath(sys.argv[0])
    backup_path = current_exe + ".bak"
    try:
        # Write to a temp file first
        with tempfile.NamedTemporaryFile(delete=False, dir=os.path.dirname(current_exe)) as tmpf:
            tmpf.write(new_binary_bytes)
            tmp_path = tmpf.name
        # Backup current binary
        shutil.copy2(current_exe, backup_path)
        # Replace atomically
        os.replace(tmp_path, current_exe)
        os.chmod(current_exe, 0o755)
        print(f"[INFO] Updated binary written to {current_exe}. Backup at {backup_path}.")
        return True
    except Exception as e:
        print(f"[ERROR] Failed to replace binary: {e}")
        return False

def main():
    if '--update' in sys.argv:
        print("Checking for updates...")
        update_available, latest_version, release = check_for_update()
        if update_available:
            print(f"A new version (v{latest_version}) is available. Downloading...")
            try:
                new_bin = download_latest_asset(release)
                if replace_current_binary(new_bin):
                    print("Update complete! Please restart Jackify.")
                else:
                    print("Update failed during binary replacement.")
            except Exception as e:
                print(f"[ERROR] Update failed: {e}")
        else:
            print("You are already running the latest version.")
        sys.exit(0)

# For direct CLI testing
if __name__ == "__main__":
    main()
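One caveat worth noting: `check_for_update` compares `latest_version > current_version` as plain strings, which misorders multi-digit components (lexicographically, "0.10.0" sorts before "0.9.0"). A common fix, sketched here as a hypothetical `version_tuple` helper rather than anything Jackify actually ships, is to compare integer tuples:

```python
def version_tuple(v: str) -> tuple:
    """Parse a tag like 'v0.10.0' into an integer tuple for correct ordering."""
    return tuple(int(part) for part in v.lstrip("v").split("."))

# Lexicographic string comparison misorders multi-digit components,
# while tuple comparison orders them numerically.
string_says_older = "0.10.0" < "0.9.0"                       # True, but wrong
newer = version_tuple("0.10.0") > version_tuple("0.9.0")     # numerically correct
```

For tags with pre-release suffixes, a library such as `packaging.version` is the more robust choice.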
1369
jackify/backend/handlers/shortcut_handler.py
Normal file
File diff suppressed because it is too large
10
jackify/backend/handlers/status_utils.py
Normal file
@@ -0,0 +1,10 @@
from .ui_colors import COLOR_INFO, COLOR_RESET

def show_status(message: str):
    """Show a single-line status message, overwriting the current line."""
    status_width = 80  # Pad to clear previous text
    print(f"\r\033[K{COLOR_INFO}{message:<{status_width}}{COLOR_RESET}", end="", flush=True)

def clear_status():
    """Clear the current status line."""
    print("\r\033[K", end="", flush=True)
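The `\r\033[K` prefix is a carriage return plus the ANSI "erase to end of line" sequence, so each status message overwrites the previous one in place. A self-contained sketch of the same pattern (the `render_status` name is illustrative, not part of the module):

```python
import sys

CLEAR = "\r\033[K"  # carriage return + ANSI "erase to end of line"

def render_status(message: str, width: int = 80) -> str:
    """Build an overwrite-in-place status string, left-padded to clear old text."""
    return f"{CLEAR}{message:<{width}}"

# Each write returns to column 0 and erases, so only the last message remains.
sys.stdout.write(render_status("Downloading modlist..."))
sys.stdout.write(render_status("Done"))
sys.stdout.write("\n")
```

The padding to a fixed width guards against a short message leaving tail characters from a longer previous one on terminals that ignore the erase sequence.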
137
jackify/backend/handlers/subprocess_utils.py
Normal file
@@ -0,0 +1,137 @@
import os
import signal
import subprocess
import time
import resource

def get_clean_subprocess_env(extra_env=None):
    """
    Returns a copy of os.environ with PyInstaller and other problematic variables removed.
    Optionally merges in extra_env dict.
    """
    env = os.environ.copy()
    # Remove PyInstaller-specific variables
    for k in list(env):
        if k.startswith('_MEIPASS'):
            del env[k]
    # Optionally restore LD_LIBRARY_PATH to system default if needed
    # (You can add more logic here if you know your system's default)
    if extra_env:
        env.update(extra_env)
    return env

def increase_file_descriptor_limit(target_limit=1048576):
    """
    Temporarily increase the file descriptor limit for the current process.

    Args:
        target_limit (int): Desired file descriptor limit (default: 1048576)

    Returns:
        tuple: (success: bool, old_limit: int, new_limit: int, message: str)
    """
    try:
        # Get current soft and hard limits
        soft_limit, hard_limit = resource.getrlimit(resource.RLIMIT_NOFILE)

        # Don't decrease the limit if it's already higher
        if soft_limit >= target_limit:
            return True, soft_limit, soft_limit, f"Current limit ({soft_limit}) already sufficient"

        # Set new limit (can't exceed hard limit)
        new_limit = min(target_limit, hard_limit)
        resource.setrlimit(resource.RLIMIT_NOFILE, (new_limit, hard_limit))

        return True, soft_limit, new_limit, f"Increased file descriptor limit from {soft_limit} to {new_limit}"

    except (OSError, ValueError) as e:
        # Get current limit for reporting
        try:
            soft_limit, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
        except Exception:
            soft_limit = "unknown"

        return False, soft_limit, soft_limit, f"Failed to increase file descriptor limit: {e}"

class ProcessManager:
    """
    Shared process manager for robust subprocess launching, tracking, and cancellation.
    """
    def __init__(self, cmd, env=None, cwd=None, text=False, bufsize=0):
        self.cmd = cmd
        self.env = env
        self.cwd = cwd
        self.text = text
        self.bufsize = bufsize
        self.proc = None
        self.process_group_pid = None
        self._start_process()

    def _start_process(self):
        self.proc = subprocess.Popen(
            self.cmd,
            stdout=subprocess.PIPE,
            stderr=subprocess.STDOUT,
            env=self.env,
            cwd=self.cwd,
            text=self.text,
            bufsize=self.bufsize,
            start_new_session=True
        )
        self.process_group_pid = os.getpgid(self.proc.pid)

    def cancel(self, timeout_terminate=2, timeout_kill=1, max_cleanup_attempts=3):
        """
        Attempt to robustly terminate the process and its children.
        """
        cleanup_attempts = 0
        if self.proc:
            try:
                self.proc.terminate()
                try:
                    self.proc.wait(timeout=timeout_terminate)
                    return
                except subprocess.TimeoutExpired:
                    pass
            except Exception:
                pass
            try:
                self.proc.kill()
                try:
                    self.proc.wait(timeout=timeout_kill)
                    return
                except subprocess.TimeoutExpired:
                    pass
            except Exception:
                pass
        # Kill process group if possible
        if self.process_group_pid:
            try:
                os.killpg(self.process_group_pid, signal.SIGKILL)
            except Exception:
                pass
        # Last resort: pkill by command name
        while cleanup_attempts < max_cleanup_attempts:
            try:
                subprocess.run(['pkill', '-f', os.path.basename(self.cmd[0])], timeout=5, capture_output=True)
            except Exception:
                pass
            cleanup_attempts += 1

    def is_running(self):
        return self.proc and self.proc.poll() is None

    def wait(self, timeout=None):
        if self.proc:
            return self.proc.wait(timeout=timeout)
        return None

    def read_stdout_line(self):
        if self.proc and self.proc.stdout:
            return self.proc.stdout.readline()
        return None

    def read_stdout_char(self):
        if self.proc and self.proc.stdout:
            return self.proc.stdout.read(1)
        return None
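Because `_start_process` uses `start_new_session=True`, the child becomes the leader of its own process group, which is what lets `cancel` signal the whole tree at once with `os.killpg`. A standalone sketch of that pattern on Linux, using `sleep` as a stand-in child:

```python
import os
import signal
import subprocess

# start_new_session=True puts the child in its own session and process group,
# so signalling the group reaches any grandchildren it spawns as well.
proc = subprocess.Popen(["sleep", "30"], start_new_session=True)
pgid = os.getpgid(proc.pid)

# Terminate the entire group, then reap the child.
os.killpg(pgid, signal.SIGTERM)
proc.wait(timeout=5)
```

Without the new session, `os.killpg` on the parent's group would signal the calling process too.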
16
jackify/backend/handlers/ui_colors.py
Normal file
@@ -0,0 +1,16 @@
# -*- coding: utf-8 -*-
"""
UI Color Constants
"""

COLOR_PROMPT = '\033[93m'    # Yellow
COLOR_SELECTION = '\033[96m'  # Cyan
COLOR_RESET = '\033[0m'
COLOR_INFO = '\033[94m'      # Blue
COLOR_ERROR = '\033[91m'     # Red
COLOR_SUCCESS = '\033[92m'   # Green
COLOR_WARNING = '\033[93m'   # Yellow (reusing prompt color)
COLOR_DISABLED = '\033[90m'  # Grey

COLOR_ACTION = '\033[97m'    # Bright White for action/descriptions
COLOR_INPUT = '\033[97m'     # Bright White for input prompts
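A typical way to use these constants is to wrap a message and always append `COLOR_RESET`, so later output is unaffected. A small sketch (the `colorize` helper is illustrative, not part of the module):

```python
COLOR_SUCCESS = '\033[92m'  # Green, mirroring the constant above
COLOR_RESET = '\033[0m'

def colorize(text: str, color: str) -> str:
    """Wrap text in an ANSI color code, restoring the default afterwards."""
    return f"{color}{text}{COLOR_RESET}"

print(colorize("Modlist installed", COLOR_SUCCESS))
```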
180
jackify/backend/handlers/ui_handler.py
Normal file
@@ -0,0 +1,180 @@
"""
UIHandler module for managing user interface operations.
This module handles menus, prompts, and user interaction.
"""

import os
import logging
from typing import Optional, List, Dict, Tuple, Callable, Any
from pathlib import Path

class UIHandler:
    def __init__(self):
        self.logger = logging.getLogger(__name__)

    def show_menu(self, title: str, options: List[Dict[str, Any]]) -> Optional[str]:
        """Display a menu and get user selection."""
        try:
            print(f"\n{title}")
            print("=" * len(title))

            for i, option in enumerate(options, 1):
                print(f"{i}. {option['label']}")

            while True:
                try:
                    choice = input("\nEnter your choice (or 'q' to quit): ")
                    if choice.lower() == 'q':
                        return None

                    choice = int(choice)
                    if 1 <= choice <= len(options):
                        return options[choice - 1]['value']
                    else:
                        print("Invalid choice. Please try again.")
                except ValueError:
                    print("Please enter a number.")
        except Exception as e:
            self.logger.error(f"Failed to show menu: {e}")
            return None

    def show_progress(self, message: str, total: int = 100) -> None:
        """Display a progress indicator."""
        try:
            print(f"\n{message}")
            print("[" + " " * 50 + "] 0%", end="\r")
        except Exception as e:
            self.logger.error(f"Failed to show progress: {e}")

    def update_progress(self, current: int, message: Optional[str] = None) -> None:
        """Update the progress indicator."""
        try:
            if message:
                print(f"\n{message}")
            progress = int(current / 2)  # Map 0-100% onto a 50-character bar
            print("[" + "=" * progress + " " * (50 - progress) + f"] {current}%", end="\r")
        except Exception as e:
            self.logger.error(f"Failed to update progress: {e}")

    def show_error(self, message: str, details: Optional[str] = None) -> None:
        """Display an error message."""
        try:
            print(f"\nError: {message}")
            if details:
                print(f"Details: {details}")
        except Exception as e:
            self.logger.error(f"Failed to show error: {e}")

    def show_success(self, message: str, details: Optional[str] = None) -> None:
        """Display a success message."""
        try:
            print(f"\n✓ Success: {message}")
            if details:
                print(f"Details: {details}")
        except Exception as e:
            self.logger.error(f"Failed to show success: {e}")

    def show_warning(self, message: str, details: Optional[str] = None) -> None:
        """Display a warning message."""
        try:
            print(f"\nWarning: {message}")
            if details:
                print(f"Details: {details}")
        except Exception as e:
            self.logger.error(f"Failed to show warning: {e}")

    def get_input(self, prompt: str, default: Optional[str] = None) -> str:
        """Get user input with optional default value."""
        try:
            if default:
                user_input = input(f"{prompt} [{default}]: ")
                return user_input if user_input else default
            return input(f"{prompt}: ")
        except Exception as e:
            self.logger.error(f"Failed to get input: {e}")
            return ""

    def get_confirmation(self, message: str, default: bool = True) -> bool:
        """Get user confirmation for an action."""
        try:
            default_str = "Y/n" if default else "y/N"
            while True:
                response = input(f"{message} [{default_str}]: ").lower()
                if not response:
                    return default
                if response in ['y', 'yes']:
                    return True
                if response in ['n', 'no']:
                    return False
                print("Please enter 'y' or 'n'.")
        except Exception as e:
            self.logger.error(f"Failed to get confirmation: {e}")
            return default

    def show_list(self, title: str, items: List[str], selectable: bool = True) -> Optional[str]:
        """Display a list of items, optionally selectable."""
        try:
            print(f"\n{title}")
            print("=" * len(title))

            for i, item in enumerate(items, 1):
                print(f"{i}. {item}")

            if selectable:
                while True:
                    try:
                        choice = input("\nEnter your choice (or 'q' to quit): ")
                        if choice.lower() == 'q':
                            return None

                        choice = int(choice)
                        if 1 <= choice <= len(items):
                            return items[choice - 1]
                        else:
                            print("Invalid choice. Please try again.")
                    except ValueError:
                        print("Please enter a number.")
            return None
        except Exception as e:
            self.logger.error(f"Failed to show list: {e}")
            return None

    def show_table(self, title: str, headers: List[str], rows: List[List[str]]) -> None:
        """Display data in a table format."""
        try:
            print(f"\n{title}")
            print("=" * len(title))

            # Calculate column widths
            widths = [len(h) for h in headers]
            for row in rows:
                for i, cell in enumerate(row):
                    widths[i] = max(widths[i], len(str(cell)))

            # Print headers
            header_str = " | ".join(f"{h:<{w}}" for h, w in zip(headers, widths))
            print(header_str)
            print("-" * len(header_str))

            # Print rows
            for row in rows:
                print(" | ".join(f"{str(cell):<{w}}" for cell, w in zip(row, widths)))
        except Exception as e:
            self.logger.error(f"Failed to show table: {e}")

    def show_help(self, topic: str) -> None:
        """Display help information for a topic."""
        try:
            # This would typically load help content from a file or database
            print(f"\nHelp: {topic}")
            print("=" * (len(topic) + 6))
            print("Help content would be displayed here.")
        except Exception as e:
            self.logger.error(f"Failed to show help: {e}")

    def clear_screen(self) -> None:
        """Clear the terminal screen."""
        try:
            os.system('clear' if os.name == 'posix' else 'cls')
        except Exception as e:
            self.logger.error(f"Failed to clear screen: {e}")
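The column-width logic in `show_table` — widen each column to the longest of its header and cells, then left-pad every cell — can be sketched as a pure function (the `format_table` name is hypothetical; it returns the table instead of printing it):

```python
def format_table(headers, rows):
    """Render rows as 'cell | cell' lines, columns padded to the widest entry."""
    # Start from header widths, then grow to fit the longest cell per column.
    widths = [len(h) for h in headers]
    for row in rows:
        for i, cell in enumerate(row):
            widths[i] = max(widths[i], len(str(cell)))

    header_line = " | ".join(f"{h:<{w}}" for h, w in zip(headers, widths))
    lines = [header_line, "-" * len(header_line)]
    for row in rows:
        lines.append(" | ".join(f"{str(c):<{w}}" for c, w in zip(row, widths)))
    return "\n".join(lines)

table = format_table(["Game", "Status"],
                     [["Skyrim SE", "OK"], ["Fallout 4", "OK"]])
print(table)
```

Two passes are needed because no cell can be padded until every row has been measured.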
318
jackify/backend/handlers/validation_handler.py
Normal file
@@ -0,0 +1,318 @@
"""
ValidationHandler module for managing validation operations.
This module handles input validation, path validation, and configuration validation.
"""

import os
import logging
import re
import shutil
import vdf
from pathlib import Path
from typing import Optional, Dict, List, Tuple, Any

class ValidationHandler:
    def __init__(self):
        self.logger = logging.getLogger(__name__)

    def validate_path(self, path: Path, must_exist: bool = True) -> Tuple[bool, str]:
        """Validate a path."""
        try:
            if not isinstance(path, Path):
                return False, "Path must be a Path object"

            if must_exist and not path.exists():
                return False, f"Path does not exist: {path}"

            if not os.access(path, os.R_OK | os.W_OK):
                return False, f"Path is not accessible: {path}"

            return True, "Path is valid"
        except Exception as e:
            self.logger.error(f"Failed to validate path {path}: {e}")
            return False, str(e)

    def validate_input(self, value: Any, rules: Dict) -> Tuple[bool, str]:
        """Validate user input against rules."""
        try:
            # Check required
            if rules.get('required', False) and not value:
                return False, "Value is required"

            # Check type
            if 'type' in rules and not isinstance(value, rules['type']):
                return False, f"Value must be of type {rules['type'].__name__}"

            # Check min/max length for strings
            if isinstance(value, str):
                if 'min_length' in rules and len(value) < rules['min_length']:
                    return False, f"Value must be at least {rules['min_length']} characters"
                if 'max_length' in rules and len(value) > rules['max_length']:
                    return False, f"Value must be at most {rules['max_length']} characters"

            # Check min/max value for numbers
            if isinstance(value, (int, float)):
                if 'min_value' in rules and value < rules['min_value']:
                    return False, f"Value must be at least {rules['min_value']}"
                if 'max_value' in rules and value > rules['max_value']:
                    return False, f"Value must be at most {rules['max_value']}"

            # Check pattern for strings
            if isinstance(value, str) and 'pattern' in rules:
                if not re.match(rules['pattern'], value):
                    return False, f"Value must match pattern: {rules['pattern']}"

            # Check custom validation function
            if 'validate' in rules and callable(rules['validate']):
                result = rules['validate'](value)
                if isinstance(result, tuple):
                    return result
                elif not result:
                    return False, "Custom validation failed"

            return True, "Input is valid"
        except Exception as e:
            self.logger.error(f"Failed to validate input: {e}")
            return False, str(e)

    def validate_config(self, config: Dict, schema: Dict) -> Tuple[bool, List[str]]:
        """Validate configuration against a schema."""
        try:
            errors = []

            # Check required fields
            for field, rules in schema.items():
                if rules.get('required', False) and field not in config:
                    errors.append(f"Missing required field: {field}")

            # Check field types and values
            for field, value in config.items():
                if field not in schema:
                    errors.append(f"Unknown field: {field}")
                    continue

                rules = schema[field]
                if 'type' in rules and not isinstance(value, rules['type']):
                    errors.append(f"Invalid type for {field}: expected {rules['type'].__name__}")

                if isinstance(value, str):
                    if 'min_length' in rules and len(value) < rules['min_length']:
                        errors.append(f"{field} must be at least {rules['min_length']} characters")
                    if 'max_length' in rules and len(value) > rules['max_length']:
                        errors.append(f"{field} must be at most {rules['max_length']} characters")
                    if 'pattern' in rules and not re.match(rules['pattern'], value):
                        errors.append(f"{field} must match pattern: {rules['pattern']}")

                if isinstance(value, (int, float)):
                    if 'min_value' in rules and value < rules['min_value']:
                        errors.append(f"{field} must be at least {rules['min_value']}")
                    if 'max_value' in rules and value > rules['max_value']:
                        errors.append(f"{field} must be at most {rules['max_value']}")

                if 'validate' in rules and callable(rules['validate']):
                    result = rules['validate'](value)
                    if isinstance(result, tuple):
                        if not result[0]:
                            errors.append(f"{field}: {result[1]}")
                    elif not result:
                        errors.append(f"Custom validation failed for {field}")

            return len(errors) == 0, errors
        except Exception as e:
            self.logger.error(f"Failed to validate config: {e}")
            return False, [str(e)]

    def validate_dependencies(self, dependencies: List[str]) -> Tuple[bool, List[str]]:
        """Validate system dependencies."""
        try:
            missing = []
            for dep in dependencies:
                if not shutil.which(dep):
                    missing.append(dep)
            return len(missing) == 0, missing
        except Exception as e:
            self.logger.error(f"Failed to validate dependencies: {e}")
            return False, [str(e)]

    def validate_game_installation(self, game_type: str, path: Path) -> Tuple[bool, str]:
        """Validate a game installation."""
        try:
            # Check if path exists
            if not path.exists():
                return False, f"Game path does not exist: {path}"

            # Check if path is accessible
|
||||
if not os.access(path, os.R_OK | os.W_OK):
|
||||
return False, f"Game path is not accessible: {path}"
|
||||
|
||||
# Check for game-specific files
|
||||
if game_type == 'skyrim':
|
||||
if not (path / 'SkyrimSE.exe').exists():
|
||||
return False, "SkyrimSE.exe not found"
|
||||
elif game_type == 'fallout4':
|
||||
if not (path / 'Fallout4.exe').exists():
|
||||
return False, "Fallout4.exe not found"
|
||||
elif game_type == 'falloutnv':
|
||||
if not (path / 'FalloutNV.exe').exists():
|
||||
return False, "FalloutNV.exe not found"
|
||||
elif game_type == 'oblivion':
|
||||
if not (path / 'Oblivion.exe').exists():
|
||||
return False, "Oblivion.exe not found"
|
||||
else:
|
||||
return False, f"Unknown game type: {game_type}"
|
||||
|
||||
return True, "Game installation is valid"
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to validate game installation: {e}")
|
||||
return False, str(e)
|
||||
|
||||
def validate_modlist(self, modlist_path: Path) -> Tuple[bool, List[str]]:
|
||||
"""Validate a modlist installation."""
|
||||
try:
|
||||
errors = []
|
||||
|
||||
# Check if path exists
|
||||
if not modlist_path.exists():
|
||||
errors.append(f"Modlist path does not exist: {modlist_path}")
|
||||
return False, errors
|
||||
|
||||
# Check if path is accessible
|
||||
if not os.access(modlist_path, os.R_OK | os.W_OK):
|
||||
errors.append(f"Modlist path is not accessible: {modlist_path}")
|
||||
return False, errors
|
||||
|
||||
# Check for ModOrganizer.ini
|
||||
if not (modlist_path / 'ModOrganizer.ini').exists():
|
||||
errors.append("ModOrganizer.ini not found")
|
||||
|
||||
# Check for mods directory
|
||||
if not (modlist_path / 'mods').exists():
|
||||
errors.append("mods directory not found")
|
||||
|
||||
# Check for profiles directory
|
||||
if not (modlist_path / 'profiles').exists():
|
||||
errors.append("profiles directory not found")
|
||||
|
||||
return len(errors) == 0, errors
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to validate modlist: {e}")
|
||||
return False, [str(e)]
|
||||
|
||||
def validate_wine_prefix(self, app_id: str) -> Tuple[bool, str]:
|
||||
"""Validate a Wine prefix."""
|
||||
try:
|
||||
# Check if prefix exists
|
||||
prefix_path = Path.home() / '.steam' / 'steam' / 'steamapps' / 'compatdata' / app_id / 'pfx'
|
||||
if not prefix_path.exists():
|
||||
return False, f"Wine prefix does not exist: {prefix_path}"
|
||||
|
||||
# Check if prefix is accessible
|
||||
if not os.access(prefix_path, os.R_OK | os.W_OK):
|
||||
return False, f"Wine prefix is not accessible: {prefix_path}"
|
||||
|
||||
# Check for system.reg
|
||||
if not (prefix_path / 'system.reg').exists():
|
||||
return False, "system.reg not found"
|
||||
|
||||
return True, "Wine prefix is valid"
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to validate Wine prefix: {e}")
|
||||
return False, str(e)
|
||||
|
||||
def validate_steam_shortcut(self, app_id: str) -> Tuple[bool, str]:
|
||||
"""Validate a Steam shortcut."""
|
||||
try:
|
||||
# Check if shortcuts.vdf exists
|
||||
shortcuts_path = Path.home() / '.steam' / 'steam' / 'userdata' / '75424832' / 'config' / 'shortcuts.vdf'
|
||||
if not shortcuts_path.exists():
|
||||
return False, "shortcuts.vdf not found"
|
||||
|
||||
# Check if shortcuts.vdf is accessible
|
||||
if not os.access(shortcuts_path, os.R_OK | os.W_OK):
|
||||
return False, "shortcuts.vdf is not accessible"
|
||||
|
||||
# Parse shortcuts.vdf using VDFHandler
|
||||
shortcuts_data = VDFHandler.load(str(shortcuts_path), binary=True)
|
||||
|
||||
# Check if shortcut exists
|
||||
for shortcut in shortcuts_data.get('shortcuts', {}).values():
|
||||
if str(shortcut.get('appid')) == app_id:
|
||||
return True, "Steam shortcut is valid"
|
||||
|
||||
return False, f"Steam shortcut not found: {app_id}"
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to validate Steam shortcut: {e}")
|
||||
return False, str(e)
|
||||
|
||||
def validate_resolution(self, resolution: str) -> Tuple[bool, str]:
|
||||
"""Validate a resolution string."""
|
||||
try:
|
||||
# Check format
|
||||
if not re.match(r'^\d+x\d+$', resolution):
|
||||
return False, "Resolution must be in format WIDTHxHEIGHT"
|
||||
|
||||
# Parse dimensions
|
||||
width, height = map(int, resolution.split('x'))
|
||||
|
||||
# Check minimum dimensions
|
||||
if width < 640 or height < 480:
|
||||
return False, "Resolution must be at least 640x480"
|
||||
|
||||
# Check maximum dimensions
|
||||
if width > 7680 or height > 4320:
|
||||
return False, "Resolution must be at most 7680x4320"
|
||||
|
||||
return True, "Resolution is valid"
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to validate resolution: {e}")
|
||||
return False, str(e)
|
||||
|
||||
def validate_permissions(self, path: Path, required_permissions: int) -> Tuple[bool, str]:
|
||||
"""Validate file or directory permissions."""
|
||||
try:
|
||||
# Get current permissions
|
||||
current_permissions = os.stat(path).st_mode & 0o777
|
||||
|
||||
# Check if current permissions include required permissions
|
||||
if current_permissions & required_permissions != required_permissions:
|
||||
return False, f"Missing required permissions: {required_permissions:o}"
|
||||
|
||||
return True, "Permissions are valid"
|
||||
except Exception as e:
|
||||
self.logger.error(f"Failed to validate permissions: {e}")
|
||||
return False, str(e)
|
||||
|
||||
def is_dangerous_directory(self, path: Path) -> bool:
|
||||
"""Return True if the directory is a dangerous system or user root directory."""
|
||||
dangerous = [
|
||||
Path('/'), Path('/home'), Path('/root'), Path('/etc'), Path('/usr'), Path('/bin'), Path('/lib'),
|
||||
Path('/opt'), Path('/var'), Path('/tmp'), Path.home()
|
||||
]
|
||||
abs_path = path.resolve()
|
||||
return any(abs_path == d.resolve() for d in dangerous)
|
||||
|
||||
def looks_like_modlist_dir(self, path: Path) -> bool:
|
||||
"""Return True if the directory contains files/folders typical of a modlist install."""
|
||||
expected = [
|
||||
'ModOrganizer.exe', 'profiles', 'mods', 'downloads', '.wabbajack', '.jackify_modlist_marker', 'ModOrganizer.ini'
|
||||
]
|
||||
for item in expected:
|
||||
if (path / item).exists():
|
||||
return True
|
||||
return False
|
||||
|
||||
def has_jackify_marker(self, path: Path) -> bool:
|
||||
"""Return True if the directory contains a .jackify_modlist_marker file."""
|
||||
return (path / '.jackify_modlist_marker').exists()
|
||||
|
||||
def is_safe_install_directory(self, path: Path) -> (bool, str):
|
||||
"""Check if the directory is safe for install. Returns (True, reason) or (False, warning)."""
|
||||
if self.is_dangerous_directory(path):
|
||||
return False, f"The directory '{path}' is a system or user root and cannot be used for modlist installation."
|
||||
if not path.exists():
|
||||
return True, "Directory does not exist and will be created."
|
||||
if not any(path.iterdir()):
|
||||
return True, "Directory is empty."
|
||||
if self.looks_like_modlist_dir(path):
|
||||
return True, "Directory looks like a valid modlist install."
|
||||
return False, f"The directory '{path}' is not empty and does not look like a valid modlist install. Please choose an empty directory or a valid modlist directory."
|
||||
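The schema format that `validate_config` consumes can be exercised on its own. The sketch below inlines a subset of that logic (the enclosing class name is not visible in this excerpt, so a standalone function stands in for the method; the `schema` contents are illustrative, not from Jackify):

```python
import re

# Hypothetical schema using the rule keys validate_config understands.
schema = {
    'name': {'required': True, 'type': str, 'pattern': r'^[A-Za-z ]+$'},
    'width': {'type': int, 'min_value': 640, 'max_value': 7680},
}


def check(config, schema):
    """Inlined subset of the validate_config logic above."""
    errors = []
    # Required-field pass
    for field, rules in schema.items():
        if rules.get('required', False) and field not in config:
            errors.append(f"Missing required field: {field}")
    # Per-field type/range/pattern pass
    for field, value in config.items():
        if field not in schema:
            errors.append(f"Unknown field: {field}")
            continue
        rules = schema[field]
        if 'type' in rules and not isinstance(value, rules['type']):
            errors.append(f"Invalid type for {field}")
        if isinstance(value, str) and 'pattern' in rules and not re.match(rules['pattern'], value):
            errors.append(f"{field} must match pattern: {rules['pattern']}")
        if isinstance(value, (int, float)):
            if 'min_value' in rules and value < rules['min_value']:
                errors.append(f"{field} must be at least {rules['min_value']}")
            if 'max_value' in rules and value > rules['max_value']:
                errors.append(f"{field} must be at most {rules['max_value']}")
    return len(errors) == 0, errors


ok, errs = check({'name': 'Jackify', 'width': 1280}, schema)      # valid
bad_ok, bad_errs = check({'width': 320}, schema)                  # missing name, width too small
```

Like the real method, the sketch collects every violation rather than stopping at the first, so a caller can report all problems at once.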
255 jackify/backend/handlers/vdf_handler.py Normal file
@@ -0,0 +1,255 @@
#!/usr/bin/env python3
"""
VDFHandler module for safely handling VDF files.
This module provides wrappers around the VDF library with additional safety checks.
"""

import os
import logging
import vdf
from pathlib import Path
from typing import Dict, Any, Optional

# Initialize logger
logger = logging.getLogger(__name__)

# List of protected VDF files that should never be modified
PROTECTED_VDF_FILES = [
    "libraryfolders.vdf",
    "config.vdf",
    "loginusers.vdf",
    "registry.vdf",
    "localconfig.vdf",
    "remotecache.vdf",
    "sharedconfig.vdf",
    "appinfo.vdf",
    "packageinfo.vdf",
    "appmanifest_*.acf"
]

# Critical Steam directories we should never modify
CRITICAL_STEAM_DIRS = [
    "appcache",
    "controller_base",
    "config",
    "logs",
    "package",
    "public",
    "resource",
    "steam",
    "steamapps",
    "tenfoot"
]


class VDFHandler:
    """
    Safe handler for VDF operations with protection against modifying critical Steam files.
    """

    @staticmethod
    def is_protected_file(file_path: str) -> bool:
        """
        Check if a file is protected from modification.

        Args:
            file_path: Path to the VDF file

        Returns:
            bool: True if the file is protected, False otherwise
        """
        file_name = os.path.basename(file_path)

        # Special exception for shortcuts.vdf - we always want to be able to modify this
        if file_name == "shortcuts.vdf":
            return False

        # Check exact filename match
        if file_name in PROTECTED_VDF_FILES:
            return True

        # Check pattern match (for appmanifest_*.acf)
        for pattern in PROTECTED_VDF_FILES:
            if '*' in pattern and pattern.replace('*', '') in file_name:
                return True

        # Check if file is in critical Steam directories
        for dir_name in CRITICAL_STEAM_DIRS:
            if f"/{dir_name}/" in file_path or f"\\{dir_name}\\" in file_path:
                return True

        return False

    @staticmethod
    def load(file_path: str, binary: bool = True) -> Optional[Dict[str, Any]]:
        """
        Safely load a VDF file.

        Args:
            file_path: Path to the VDF file
            binary: Whether the file is binary VDF format

        Returns:
            Dict: Parsed VDF data, or None if the file could not be loaded
        """
        # Always create a backup before reading critical files
        if VDFHandler.is_protected_file(file_path):
            backup_path = f"{file_path}.bak"
            if not os.path.exists(backup_path):
                try:
                    import shutil
                    shutil.copy2(file_path, backup_path)
                    logger.debug(f"Created backup of {os.path.basename(file_path)} at {backup_path}")
                except Exception as e:
                    logger.error(f"Failed to create backup of {file_path}: {e}")

        # Load the VDF file
        try:
            if binary:
                # Use ValvePython/vdf library for binary files
                logger.debug(f"Attempting to load binary VDF with ValvePython/vdf: {file_path}")
                if not os.path.exists(file_path):
                    logger.error(f"Binary VDF file not found: {file_path}")
                    return None
                with open(file_path, 'rb') as f_vdf:
                    return vdf.binary_loads(f_vdf.read())
            else:
                # Handle text VDF files (e.g., config.vdf)
                logger.debug(f"Attempting to load text VDF with ValvePython/vdf: {file_path}")
                if not os.path.exists(file_path):
                    logger.error(f"Text VDF file not found: {file_path}")
                    return None
                with open(file_path, 'r', encoding='utf-8') as f_text:
                    return vdf.load(f_text)

        except FileNotFoundError:
            # This might be redundant due to os.path.exists checks, but keep for safety
            logger.error(f"VDF file not found during load operation: {file_path}")
            return None
        except PermissionError:
            logger.error(f"Permission denied when trying to read VDF file: {file_path}")
            return None
        except Exception as e:
            # Catch any other unexpected errors (including parsing errors from vdf.binary_loads)
            logger.error(f"Unexpected error loading VDF file {file_path}: {e}", exc_info=True)
            return None  # Return None instead of {}

    @staticmethod
    def save(file_path: str, data: Dict[str, Any], binary: bool = True) -> bool:
        """
        Safely save a VDF file with protection for critical files.

        Args:
            file_path: Path to the VDF file
            data: VDF data to save
            binary: Whether to save in binary VDF format

        Returns:
            bool: True if save was successful, False otherwise

        Raises:
            ValueError: If attempting to modify a protected file
        """
        # Normalize path for consistent checks
        file_path = os.path.normpath(file_path)

        # FIRST LINE OF DEFENSE: Prevent modification of protected files
        if VDFHandler.is_protected_file(file_path):
            error_msg = f"CRITICAL SAFETY ERROR: Attempted to modify protected Steam file: {file_path}"
            logger.error(error_msg)
            raise ValueError(error_msg)

        # SECOND LINE OF DEFENSE: Only allow saving to shortcuts.vdf
        file_name = os.path.basename(file_path)
        if file_name != "shortcuts.vdf":
            error_msg = f"CRITICAL SAFETY ERROR: Only shortcuts.vdf can be modified, attempted: {file_path}"
            logger.error(error_msg)
            raise ValueError(error_msg)

        # THIRD LINE OF DEFENSE: Create backup before saving
        if os.path.exists(file_path):
            # Create timestamped backup
            timestamp = Path(file_path).stat().st_mtime
            backup_path = f"{file_path}.{int(timestamp)}.bak"

            # Also create a simple .bak file if it doesn't exist
            simple_backup = f"{file_path}.bak"

            try:
                import shutil
                # Create timestamped backup
                shutil.copy2(file_path, backup_path)
                logger.info(f"Created timestamped backup of {file_name} at {backup_path}")

                # Create simple backup if it doesn't exist
                if not os.path.exists(simple_backup):
                    shutil.copy2(file_path, simple_backup)
                    logger.info(f"Created backup of {file_name} at {simple_backup}")
            except Exception as e:
                logger.error(f"Failed to create backup before modifying {file_path}: {e}")
                return False

        # Save the file
        try:
            # Additional safety: Verify we're only saving to shortcuts.vdf again
            if file_name != "shortcuts.vdf":
                raise ValueError(f"Final safety check failed: Attempted to save to non-shortcuts file: {file_path}")

            if binary:
                with open(file_path, 'wb') as f:
                    # binary_dump writes to a file object; binary_dumps returns bytes
                    vdf.binary_dump(data, f)
            else:
                with open(file_path, 'w', encoding='utf-8') as f:
                    vdf.dump(data, f, pretty=True)

            logger.info(f"Successfully saved changes to {file_path}")
            return True
        except Exception as e:
            logger.error(f"Error saving VDF file {file_path}: {e}")
            return False

    @staticmethod
    def update_shortcuts(shortcuts_path: str, update_function) -> bool:
        """
        Safely update shortcuts.vdf using a callback function.

        Args:
            shortcuts_path: Path to the shortcuts.vdf file
            update_function: Callback function that takes shortcuts data and returns updated data
                Signature: function(shortcuts_data) -> updated_shortcuts_data

        Returns:
            bool: True if update was successful, False otherwise
        """
        try:
            # Check that we're only operating on shortcuts.vdf
            if os.path.basename(shortcuts_path) != "shortcuts.vdf":
                error_msg = f"Can only update shortcuts.vdf, not: {shortcuts_path}"
                logger.error(error_msg)
                raise ValueError(error_msg)

            # Load the shortcuts file
            logger.info(f"Loading shortcuts from: {shortcuts_path}")
            shortcuts_data = VDFHandler.load(shortcuts_path, binary=True)

            if not shortcuts_data:
                logger.error(f"Failed to load shortcuts data from {shortcuts_path}")
                return False

            # Apply the update function
            logger.info("Applying updates to shortcuts data")
            updated_data = update_function(shortcuts_data)

            if updated_data is None:
                logger.error("Update function returned None")
                return False

            # Save the updated data
            logger.info(f"Saving updated shortcuts to: {shortcuts_path}")
            return VDFHandler.save(shortcuts_path, updated_data, binary=True)

        except Exception as e:
            logger.error(f"Error updating shortcuts: {e}")
            return False
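`update_shortcuts` hands the parsed `shortcuts` dict to a callback and persists whatever comes back, so the callback itself is plain dict manipulation and can be sketched without Steam or the `vdf` library. The field values below are illustrative placeholders, not real Jackify data; the `appid`/`AppName`/`Exe`/`StartDir` keys follow the shortcuts.vdf layout this handler reads elsewhere:

```python
def add_shortcut(shortcuts_data):
    """Callback for VDFHandler.update_shortcuts: append one shortcut entry.

    shortcuts.vdf stores entries under 'shortcuts', keyed by stringified index.
    """
    shortcuts = shortcuts_data.setdefault('shortcuts', {})
    next_index = str(len(shortcuts))
    shortcuts[next_index] = {
        'appid': -1098765432,  # placeholder; real code derives a shortcut app ID
        'AppName': 'My Modlist',
        'Exe': '"/home/user/Modlist/ModOrganizer.exe"',
        'StartDir': '"/home/user/Modlist"',
    }
    return shortcuts_data


# Exercising the callback directly, as update_shortcuts would after loading:
data = {'shortcuts': {'0': {'AppName': 'Existing'}}}
data = add_shortcut(data)
```

A callback that returns `None` (or raises) makes `update_shortcuts` bail out before saving, so the original shortcuts.vdf is left untouched on failure.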
1652 jackify/backend/handlers/wabbajack_handler.py Normal file
File diff suppressed because it is too large.
152 jackify/backend/handlers/wabbajack_parser.py Normal file
@@ -0,0 +1,152 @@
"""
Wabbajack file parser for extracting game type information from .wabbajack files.

This module provides efficient parsing of .wabbajack files (which are ZIP archives)
to extract game type information without loading the entire archive.
"""

import json
import logging
import zipfile
from pathlib import Path
from typing import Optional, Dict, Any, Tuple


class WabbajackParser:
    """Parser for .wabbajack files to extract game type information."""

    def __init__(self):
        self.logger = logging.getLogger(__name__)

        # Mapping from Wabbajack Game enum values to Jackify game types
        self.game_type_mapping = {
            'Starfield': 'starfield',
            'oblivionremastered': 'oblivion_remastered',
            'SkyrimSpecialEdition': 'skyrim',
            'Fallout4': 'fallout4',
            'FalloutNewVegas': 'falloutnv',
            'Oblivion': 'oblivion',
            'Skyrim': 'skyrim',  # Legacy Skyrim
            'Fallout3': 'fallout3',  # For completeness
            'SkyrimVR': 'skyrim',  # Treat as Skyrim
            'Fallout4VR': 'fallout4',  # Treat as Fallout 4
            'Enderal': 'enderal',  # Enderal: Forgotten Stories
            'EnderalSpecialEdition': 'enderal',  # Enderal SE
        }

        # List of supported games in Jackify
        self.supported_games = [
            'skyrim', 'fallout4', 'falloutnv', 'oblivion',
            'starfield', 'oblivion_remastered', 'enderal'
        ]

    def parse_wabbajack_game_type(self, wabbajack_path: Path) -> Optional[Tuple[str, str]]:
        """
        Parse a .wabbajack file to extract the game type.

        Args:
            wabbajack_path: Path to the .wabbajack file

        Returns:
            Tuple of (Jackify game type, e.g. 'skyrim' or 'starfield', raw game type
            string), or None on failure
        """
        try:
            if not wabbajack_path.exists():
                self.logger.error(f"Wabbajack file not found: {wabbajack_path}")
                return None

            if wabbajack_path.suffix.lower() != '.wabbajack':
                self.logger.error(f"File is not a .wabbajack file: {wabbajack_path}")
                return None

            # Open the .wabbajack file as a ZIP archive
            with zipfile.ZipFile(wabbajack_path, 'r') as zip_file:
                # Look for the modlist file (could be 'modlist' or 'modlist.json')
                modlist_files = [f for f in zip_file.namelist() if f in ['modlist', 'modlist.json']]

                if not modlist_files:
                    self.logger.error(f"No modlist file found in {wabbajack_path}")
                    return None

                # Extract and parse the modlist file
                modlist_file = modlist_files[0]
                with zip_file.open(modlist_file) as modlist_stream:
                    modlist_data = json.load(modlist_stream)

                # Extract the game type
                game_type = modlist_data.get('GameType')
                if not game_type:
                    self.logger.error(f"No GameType found in modlist: {wabbajack_path}")
                    return None

                # Map to Jackify game type
                jackify_game_type = self.game_type_mapping.get(game_type)
                if jackify_game_type:
                    self.logger.info(f"Detected game type: {game_type} -> {jackify_game_type}")
                    return jackify_game_type, game_type
                else:
                    self.logger.warning(f"Unknown game type in modlist: {game_type}")
                    return 'unknown', game_type

        except zipfile.BadZipFile:
            self.logger.error(f"Invalid ZIP file: {wabbajack_path}")
            return None
        except json.JSONDecodeError as e:
            self.logger.error(f"Invalid JSON in modlist file: {e}")
            return None
        except Exception as e:
            self.logger.error(f"Error parsing .wabbajack file {wabbajack_path}: {e}")
            return None

    def is_supported_game(self, game_type: str) -> bool:
        """
        Check if a game type is supported by Jackify's post-install configuration.

        Args:
            game_type: Jackify game type string

        Returns:
            True if the game is supported, False otherwise
        """
        return game_type in self.supported_games

    def get_supported_games_list(self) -> list:
        """
        Get the list of games supported by Jackify's post-install configuration.

        Returns:
            List of supported game names
        """
        return self.supported_games.copy()

    def get_supported_games_display_names(self) -> list:
        """
        Get the display names of supported games for user-facing messages.

        Returns:
            List of display names for supported games
        """
        display_names = {
            'skyrim': 'Skyrim Special Edition',
            'fallout4': 'Fallout 4',
            'falloutnv': 'Fallout New Vegas',
            'oblivion': 'Oblivion',
            'starfield': 'Starfield',
            'oblivion_remastered': 'Oblivion Remastered'
        }
        # Games without an entry (e.g. 'enderal') fall back to their internal name
        return [display_names.get(game, game) for game in self.supported_games]


# Convenience function for easy access
def parse_wabbajack_game_type(wabbajack_path: Path) -> Optional[Tuple[str, str]]:
    """
    Convenience function to parse a .wabbajack file and get the game type.

    Args:
        wabbajack_path: Path to the .wabbajack file

    Returns:
        Tuple of (Jackify game type, raw game type string), or None if parsing fails
    """
    parser = WabbajackParser()
    return parser.parse_wabbajack_game_type(wabbajack_path)
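The parser's core trick is that a `.wabbajack` file is just a ZIP archive whose `modlist` member is JSON carrying a `GameType` field. That round trip can be demonstrated with an in-memory archive built purely from the standard library (the sample `GameType` value and the one-entry mapping are taken from the table above):

```python
import io
import json
import zipfile

# Build an in-memory archive shaped like a .wabbajack file: a ZIP whose
# 'modlist' member is JSON containing a 'GameType' field.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('modlist', json.dumps({'GameType': 'SkyrimSpecialEdition'}))

# Read it back the same way parse_wabbajack_game_type does: locate the
# modlist member, stream-parse its JSON, and pull out GameType.
with zipfile.ZipFile(buf, 'r') as zf:
    names = [n for n in zf.namelist() if n in ('modlist', 'modlist.json')]
    with zf.open(names[0]) as stream:
        game_type = json.load(stream).get('GameType')

mapping = {'SkyrimSpecialEdition': 'skyrim'}  # subset of the mapping above
jackify_type = mapping.get(game_type, 'unknown')
```

Because only the small `modlist` member is decompressed, this stays fast even for multi-gigabyte `.wabbajack` archives.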
701 jackify/backend/handlers/wine_utils.py Normal file
@@ -0,0 +1,701 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
Wine Utilities Module
|
||||
Handles wine-related operations and utilities
|
||||
"""
|
||||
|
||||
import os
|
||||
import re
|
||||
import subprocess
|
||||
import logging
|
||||
import shutil
|
||||
import time
|
||||
from pathlib import Path
|
||||
import glob
|
||||
from typing import Optional, Tuple
|
||||
from .subprocess_utils import get_clean_subprocess_env
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class WineUtils:
|
||||
"""
|
||||
Utilities for wine-related operations
|
||||
"""
|
||||
|
||||
@staticmethod
|
||||
def cleanup_wine_processes():
|
||||
"""
|
||||
Clean up wine processes
|
||||
Returns True on success, False on failure
|
||||
"""
|
||||
try:
|
||||
# Find and kill processes containing various process names
|
||||
processes = subprocess.run(
|
||||
"pgrep -f 'win7|win10|ShowDotFiles|protontricks'",
|
||||
shell=True,
|
||||
capture_output=True,
|
||||
text=True,
|
||||
env=get_clean_subprocess_env()
|
||||
).stdout.strip()
|
||||
|
||||
if processes:
|
||||
for pid in processes.split("\n"):
|
||||
try:
|
||||
subprocess.run(f"kill -9 {pid}", shell=True, check=True, env=get_clean_subprocess_env())
|
||||
except subprocess.CalledProcessError:
|
||||
logger.warning(f"Failed to kill process {pid}")
|
||||
logger.debug("Processes killed successfully")
|
||||
else:
|
||||
logger.debug("No matching processes found")
|
||||
|
||||
# Kill winetricks processes
|
||||
subprocess.run("pkill -9 winetricks", shell=True, env=get_clean_subprocess_env())
|
||||
return True
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to cleanup wine processes: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def edit_binary_working_paths(modlist_ini, modlist_dir, modlist_sdcard, steam_library, basegame_sdcard):
|
||||
"""
|
||||
Edit binary and working directory paths in ModOrganizer.ini
|
||||
Returns True on success, False on failure
|
||||
"""
|
||||
if not os.path.isfile(modlist_ini):
|
||||
logger.error(f"ModOrganizer.ini not found at {modlist_ini}")
|
||||
return False
|
||||
|
||||
try:
|
||||
# Read the file
|
||||
with open(modlist_ini, 'r', encoding='utf-8', errors='ignore') as f:
|
||||
content = f.readlines()
|
||||
|
||||
modified_content = []
|
||||
found_skse = False
|
||||
|
||||
# First pass to identify SKSE/F4SE launcher entries
|
||||
skse_lines = []
|
||||
for i, line in enumerate(content):
|
||||
if re.search(r'skse64_loader\.exe|f4se_loader\.exe', line):
|
||||
skse_lines.append((i, line))
|
||||
found_skse = True
|
||||
|
||||
if not found_skse:
|
||||
logger.debug("No SKSE/F4SE launcher entries found")
|
||||
return False
|
||||
|
||||
# Process each SKSE/F4SE entry
|
||||
for line_num, orig_line in skse_lines:
|
||||
# Split the line into key and value
|
||||
if '=' not in orig_line:
|
||||
continue
|
||||
|
||||
binary_num, skse_loc = orig_line.split('=', 1)
|
||||
|
||||
# Set drive letter based on whether using SD card
|
||||
if modlist_sdcard:
|
||||
drive_letter = " = D:"
|
||||
else:
|
||||
drive_letter = " = Z:"
|
||||
|
||||
# Determine the working directory key
|
||||
just_num = binary_num.split('\\')[0]
|
||||
bin_path_start = binary_num.strip().replace('\\', '\\\\')
|
||||
path_start = f"{just_num}\\\\workingDirectory".replace('\\', '\\\\')
|
||||
|
||||
# Process the path based on its type
|
||||
if "mods" in orig_line:
|
||||
# mods path type
|
||||
if modlist_sdcard:
|
||||
path_middle = WineUtils._strip_sdcard_path(modlist_dir)
|
||||
else:
|
||||
path_middle = modlist_dir
|
||||
|
||||
path_end = re.sub(r'.*/mods', '/mods', skse_loc.split('/')[0])
|
||||
bin_path_end = re.sub(r'.*/mods', '/mods', skse_loc)
|
||||
|
||||
elif any(term in orig_line for term in ["Stock Game", "Game Root", "STOCK GAME", "Stock Game Folder", "Stock Folder", "Skyrim Stock", "root/Skyrim Special Edition"]):
|
||||
# Stock Game or Game Root type
|
||||
if modlist_sdcard:
|
||||
path_middle = WineUtils._strip_sdcard_path(modlist_dir)
|
||||
else:
|
||||
path_middle = modlist_dir
|
||||
|
||||
# Determine the specific stock folder type
|
||||
if "Stock Game" in orig_line:
|
||||
dir_type = "stockgame"
|
||||
path_end = re.sub(r'.*/Stock Game', '/Stock Game', os.path.dirname(skse_loc))
|
||||
bin_path_end = re.sub(r'.*/Stock Game', '/Stock Game', skse_loc)
|
||||
elif "Game Root" in orig_line:
|
||||
dir_type = "gameroot"
|
||||
path_end = re.sub(r'.*/Game Root', '/Game Root', os.path.dirname(skse_loc))
|
||||
bin_path_end = re.sub(r'.*/Game Root', '/Game Root', skse_loc)
|
||||
elif "STOCK GAME" in orig_line:
|
||||
dir_type = "STOCKGAME"
|
||||
path_end = re.sub(r'.*/STOCK GAME', '/STOCK GAME', os.path.dirname(skse_loc))
|
||||
bin_path_end = re.sub(r'.*/STOCK GAME', '/STOCK GAME', skse_loc)
|
||||
elif "Stock Folder" in orig_line:
|
||||
dir_type = "stockfolder"
|
||||
path_end = re.sub(r'.*/Stock Folder', '/Stock Folder', os.path.dirname(skse_loc))
|
||||
bin_path_end = re.sub(r'.*/Stock Folder', '/Stock Folder', skse_loc)
|
||||
elif "Skyrim Stock" in orig_line:
|
||||
dir_type = "skyrimstock"
|
||||
path_end = re.sub(r'.*/Skyrim Stock', '/Skyrim Stock', os.path.dirname(skse_loc))
|
||||
bin_path_end = re.sub(r'.*/Skyrim Stock', '/Skyrim Stock', skse_loc)
|
||||
elif "Stock Game Folder" in orig_line:
|
||||
dir_type = "stockgamefolder"
|
||||
path_end = re.sub(r'.*/Stock Game Folder', '/Stock Game Folder', skse_loc)
|
||||
bin_path_end = path_end
|
||||
elif "root/Skyrim Special Edition" in orig_line:
|
||||
dir_type = "rootskyrimse"
|
||||
path_end = '/' + skse_loc.lstrip()
|
||||
bin_path_end = path_end
|
||||
else:
|
||||
logger.error(f"Unknown stock game type in line: {orig_line}")
|
||||
continue
|
||||
|
||||
elif "steamapps" in orig_line:
|
||||
# Steam apps path type
|
||||
if basegame_sdcard:
|
||||
path_middle = WineUtils._strip_sdcard_path(steam_library)
|
||||
drive_letter = " = D:"
|
||||
else:
|
||||
path_middle = steam_library.split('steamapps')[0]
|
||||
|
||||
path_end = re.sub(r'.*/steamapps', '/steamapps', os.path.dirname(skse_loc))
|
||||
bin_path_end = re.sub(r'.*/steamapps', '/steamapps', skse_loc)
|
||||
|
||||
else:
|
||||
logger.warning(f"No matching pattern found in the path: {orig_line}")
|
||||
continue
|
||||
|
||||
# Combine paths
|
||||
full_bin_path = f"{bin_path_start}{drive_letter}{path_middle}{bin_path_end}"
|
||||
full_path = f"{path_start}{drive_letter}{path_middle}{path_end}"
|
||||
|
||||
# Replace forward slashes with double backslashes for Windows paths
|
||||
new_path = full_path.replace('/', '\\\\')
|
||||
|
||||
# Update the content with new paths
|
||||
for i, line in enumerate(content):
|
||||
if line.startswith(bin_path_start):
|
||||
content[i] = f"{full_bin_path}\n"
|
||||
elif line.startswith(path_start):
|
||||
content[i] = f"{new_path}\n"
|
||||
|
||||
# Write back the modified content
|
||||
with open(modlist_ini, 'w', encoding='utf-8') as f:
|
||||
f.writelines(content)
|
||||
|
||||
logger.debug("Updated binary and working directory paths successfully")
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error editing binary working paths: {e}")
|
||||
return False
|
||||
|
||||
@staticmethod
|
||||
def _strip_sdcard_path(path):
|
||||
"""
|
||||
Strip /run/media/deck/UUID from SD card paths
|
||||
Internal helper method
|
||||
"""
|
||||
if path.startswith("/run/media/deck/"):
|
||||
parts = path.split("/", 5)
|
||||
if len(parts) >= 6:
|
||||
return "/" + parts[5]
|
||||
return path
|
||||
|
||||
@staticmethod
|
||||
def all_owned_by_user(path):
|
||||
"""
|
||||
Returns True if all files and directories under 'path' are owned by the current user.
|
||||
"""
|
||||
uid = os.getuid()
|
||||
gid = os.getgid()
|
||||
for root, dirs, files in os.walk(path):
|
||||
for name in dirs + files:
|
||||
full_path = os.path.join(root, name)
|
||||
try:
|
||||
stat = os.stat(full_path)
|
||||
if stat.st_uid != uid or stat.st_gid != gid:
|
||||
return False
|
||||
except Exception:
|
||||
return False
|
||||
return True
|
||||
|
||||
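# A minimal, self-contained reproduction of the ownership walk above, run
# against a temporary directory we create ourselves (so it is guaranteed to be
# owned by the current user). Unix-only, like the method it mirrors.

```python
import os
import tempfile

def all_owned_by_user(path):
    # Walk the tree and compare each entry's uid/gid to the current user's.
    uid, gid = os.getuid(), os.getgid()
    for root, dirs, files in os.walk(path):
        for name in dirs + files:
            try:
                st = os.stat(os.path.join(root, name))
                if st.st_uid != uid or st.st_gid != gid:
                    return False
            except OSError:
                return False
    return True

tmp = tempfile.mkdtemp()
open(os.path.join(tmp, "probe.txt"), "w").close()
print(all_owned_by_user(tmp))  # files we just created are ours
```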
@staticmethod
|
||||
def chown_chmod_modlist_dir(modlist_dir):
|
||||
"""
|
||||
Change ownership and permissions of modlist directory
|
||||
Returns True on success, False on failure
|
||||
"""
|
||||
if WineUtils.all_owned_by_user(modlist_dir):
|
||||
logger.info(f"All files in {modlist_dir} are already owned by the current user. Skipping sudo chown/chmod.")
|
||||
return True
|
||||
logger.warn("Changing Ownership and Permissions of modlist directory (may require sudo password)")
|
||||
|
||||
try:
|
||||
user = subprocess.run("whoami", shell=True, capture_output=True, text=True).stdout.strip()
|
||||
group = subprocess.run("id -gn", shell=True, capture_output=True, text=True).stdout.strip()
|
||||
|
||||
logger.debug(f"User is {user} and Group is {group}")
|
||||
|
||||
# Change ownership
|
||||
result1 = subprocess.run(
|
||||
f"sudo chown -R {user}:{group} \"{modlist_dir}\"",
|
||||
shell=True,
|
||||
capture_output=True,
|
||||
text=True
|
||||
)
|
||||
|
||||
# Change permissions
|
||||
result2 = subprocess.run(
|
||||
f"sudo chmod -R 755 \"{modlist_dir}\"",
|
||||
shell=True,
|
||||
capture_output=True,
|
||||
text=True
|
||||
)
|
||||
|
||||
if result1.returncode != 0 or result2.returncode != 0:
|
||||
logger.error("Failed to change ownership/permissions")
|
||||
logger.error(f"chown output: {result1.stderr}")
|
||||
logger.error(f"chmod output: {result2.stderr}")
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error changing ownership and permissions: {e}")
|
||||
return False
|
||||
|
||||
    @staticmethod
    def create_dxvk_file(modlist_dir, modlist_sdcard, steam_library, basegame_sdcard, game_var_full):
        """
        Create DXVK file in the modlist directory
        """
        try:
            # Construct the path to the game directory
            game_dir = os.path.join(steam_library, game_var_full)

            # Create the DXVK file
            dxvk_file = os.path.join(modlist_dir, "DXVK")
            with open(dxvk_file, 'w') as f:
                f.write(game_dir)

            logger.debug(f"Created DXVK file at {dxvk_file} pointing to {game_dir}")
            return True
        except Exception as e:
            logger.error(f"Error creating DXVK file: {e}")
            return False

    @staticmethod
    def small_additional_tasks(modlist_dir, compat_data_path):
        """
        Perform small additional tasks like deleting unsupported plugins
        Returns True on success, False on failure
        """
        try:
            # Delete MO2 plugins that don't work via Proton
            file_to_delete = os.path.join(modlist_dir, "plugins/FixGameRegKey.py")
            if os.path.exists(file_to_delete):
                os.remove(file_to_delete)
                logger.debug(f"File deleted: {file_to_delete}")

            # Download font to support Bethini
            if compat_data_path and os.path.isdir(compat_data_path):
                font_path = os.path.join(compat_data_path, "pfx/drive_c/windows/Fonts/seguisym.ttf")
                font_dir = os.path.dirname(font_path)

                # Ensure the directory exists
                os.makedirs(font_dir, exist_ok=True)

                # Download the font
                font_url = "https://github.com/mrbvrz/segoe-ui-linux/raw/refs/heads/master/font/seguisym.ttf"
                subprocess.run(
                    f"wget {font_url} -q -nc -O \"{font_path}\"",
                    shell=True,
                    check=True
                )
                logger.debug(f"Downloaded font to: {font_path}")

            return True

        except Exception as e:
            logger.error(f"Error performing additional tasks: {e}")
            return False

    @staticmethod
    def modlist_specific_steps(modlist, appid):
        """
        Perform modlist-specific steps
        Returns True on success, False on failure
        """
        try:
            # Define modlist-specific configurations
            modlist_configs = {
                "wildlander": ["dotnet48", "dotnet472", "vcrun2019"],
                "septimus|sigernacollection|licentia|aldrnari|phoenix": ["dotnet48", "dotnet472"],
                "masterstroke": ["dotnet48", "dotnet472"],
                "diablo": ["dotnet48", "dotnet472"],
                "living_skyrim": ["dotnet48", "dotnet472", "dotnet462"],
                "nolvus": ["dotnet8"]
            }

            modlist_lower = modlist.lower().replace(" ", "")

            # Check for wildlander special case
            if "wildlander" in modlist_lower:
                logger.info(f"Running steps specific to {modlist}. This can take some time, be patient!")
                # Implementation for wildlander-specific steps
                return True

            # Check for other modlists
            for pattern, components in modlist_configs.items():
                if re.search(pattern.replace("|", "|.*"), modlist_lower):
                    logger.info(f"Running steps specific to {modlist}. This can take some time, be patient!")

                    # Install components
                    for component in components:
                        if component == "dotnet8":
                            # Special handling for .NET 8
                            logger.info("Downloading .NET 8 Runtime")
                            # Implementation for .NET 8 installation
                            pass
                        else:
                            # Standard component installation
                            logger.info(f"Installing {component}...")
                            # Implementation for standard component installation
                            pass

                    # Set Windows 10 prefix
                    # Implementation for setting Windows 10 prefix

                    return True

            # No specific steps for this modlist
            logger.debug(f"No specific steps needed for {modlist}")
            return True

        except Exception as e:
            logger.error(f"Error performing modlist-specific steps: {e}")
            return False

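# A simplified sketch of the lookup used above: config keys are
# '|'-separated regex alternations matched against a lower-cased,
# space-stripped modlist name. The table below is a hypothetical subset,
# and plain alternation is used here rather than the original's
# pattern.replace("|", "|.*") rewriting.

```python
import re

# Hypothetical subset of the modlist_configs table above.
modlist_configs = {
    "wildlander": ["dotnet48", "dotnet472", "vcrun2019"],
    "septimus|sigernacollection|licentia|aldrnari|phoenix": ["dotnet48", "dotnet472"],
}

def components_for(modlist):
    # Normalise the display name the same way modlist_specific_steps does.
    name = modlist.lower().replace(" ", "")
    for pattern, components in modlist_configs.items():
        if re.search(pattern, name):
            return components
    return []

print(components_for("Licentia Black"))  # ['dotnet48', 'dotnet472']
```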
    @staticmethod
    def fnv_launch_options(game_var, compat_data_path, modlist):
        """
        Set up Fallout New Vegas launch options
        Returns True on success, False on failure
        """
        if game_var != "Fallout New Vegas":
            return True

        try:
            appid_to_check = "22380"  # Fallout New Vegas AppID

            for path in [
                os.path.expanduser("~/.local/share/Steam/steamapps/compatdata"),
                os.path.expanduser("~/.steam/steam/steamapps/compatdata"),
                os.path.expanduser("~/.steam/root/steamapps/compatdata")
            ]:
                compat_path = os.path.join(path, appid_to_check)
                if os.path.exists(compat_path):
                    logger.warning(f"\nFor {modlist}, please add the following line to the Launch Options in Steam for your '{modlist}' entry:")
                    logger.info(f"\nSTEAM_COMPAT_DATA_PATH=\"{compat_path}\" %command%")
                    logger.warning("\nThis is essential for the modlist to load correctly.")
                    return True

            logger.error("Could not determine the compatdata path for Fallout New Vegas")
            return False

        except Exception as e:
            logger.error(f"Error setting FNV launch options: {e}")
            return False

    @staticmethod
    def get_proton_version(compat_data_path):
        """
        Detect the Proton version used by a Steam game/shortcut

        Args:
            compat_data_path (str): Path to the compatibility data directory

        Returns:
            str: Detected Proton version or 'Unknown' if not found
        """
        logger.info("Detecting Proton version...")

        # Validate that the compatdata path exists
        if not os.path.isdir(compat_data_path):
            logger.warning(f"Compatdata directory not found at '{compat_data_path}'")
            return "Unknown"

        # First try to get the Proton version from the registry
        system_reg_path = os.path.join(compat_data_path, "pfx", "system.reg")
        if os.path.isfile(system_reg_path):
            try:
                with open(system_reg_path, "r", encoding="utf-8", errors="ignore") as f:
                    content = f.read()

                # Use regex to find the SteamClientProtonVersion entry
                match = re.search(r'"SteamClientProtonVersion"="([^"]+)"', content)
                if match:
                    version = match.group(1).strip()
                    # Keep GE versions as is, otherwise prefix with "Proton"
                    if "GE" in version:
                        proton_ver = version
                    else:
                        proton_ver = f"Proton {version}"

                    logger.debug(f"Detected Proton version from registry: {proton_ver}")
                    return proton_ver
            except Exception as e:
                logger.debug(f"Error reading system.reg: {e}")

        # Fall back to config_info if the registry method fails
        config_info_path = os.path.join(compat_data_path, "config_info")
        if os.path.isfile(config_info_path):
            try:
                with open(config_info_path, "r") as f:
                    config_ver = f.readline().strip()

                if config_ver:
                    # Keep GE versions as is, otherwise prefix with "Proton"
                    if "GE" in config_ver:
                        proton_ver = config_ver
                    else:
                        proton_ver = f"Proton {config_ver}"

                    logger.debug(f"Detected Proton version from config_info: {proton_ver}")
                    return proton_ver
            except Exception as e:
                logger.debug(f"Error reading config_info: {e}")

        logger.warning("Could not detect Proton version")
        return "Unknown"

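# The registry scrape above can be exercised in isolation against a fabricated
# system.reg fragment (the key name matches what the method looks for; the
# version value below is invented for illustration).

```python
import re

# Fabricated fragment of a Proton prefix's system.reg file.
content = '''
[Software\\\\Valve\\\\Steam] 1700000000
"SteamClientProtonVersion"="9.0-2"
'''

# Same extraction and normalisation logic as get_proton_version.
match = re.search(r'"SteamClientProtonVersion"="([^"]+)"', content)
version = match.group(1).strip()
proton_ver = version if "GE" in version else f"Proton {version}"
print(proton_ver)  # Proton 9.0-2
```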
    @staticmethod
    def update_executables(modlist_ini, modlist_dir, modlist_sdcard, steam_library, basegame_sdcard):
        """
        Update executable paths in ModOrganizer.ini
        """
        logger.info("Updating executable paths in ModOrganizer.ini...")

        try:
            # Find SKSE or F4SE loader entries
            with open(modlist_ini, 'r') as f:
                lines = f.readlines()

            # Process each line
            for i, line in enumerate(lines):
                if "skse64_loader.exe" in line or "f4se_loader.exe" in line:
                    # Extract the binary path
                    binary_path = line.strip().split('=', 1)[1] if '=' in line else ""

                    # Determine drive letter
                    drive_letter = "D:" if modlist_sdcard else "Z:"

                    # Extract binary number
                    binary_num = line.strip().split('=', 1)[0] if '=' in line else ""

                    # Find the equivalent workingDirectory
                    justnum = binary_num.split('\\')[0] if '\\' in binary_num else binary_num
                    bin_path_start = binary_num.replace('\\', '\\\\')
                    path_start = f"{justnum}\\workingDirectory".replace('\\', '\\\\')

                    # Determine path type and construct new paths
                    if "mods" in binary_path:
                        # mods path type found
                        if modlist_sdcard:
                            path_middle = modlist_dir.split('mmcblk0p1', 1)[1] if 'mmcblk0p1' in modlist_dir else modlist_dir
                            # Strip /run/media/deck/UUID if present
                            if '/run/media/' in path_middle:
                                path_middle = '/' + path_middle.split('/run/media/', 1)[1].split('/', 2)[2]
                        else:
                            path_middle = modlist_dir

                        path_end = '/' + '/'.join(binary_path.split('/mods/', 1)[1].split('/')[:-1]) if '/mods/' in binary_path else ""
                        bin_path_end = '/' + '/'.join(binary_path.split('/mods/', 1)[1].split('/')) if '/mods/' in binary_path else ""

                    elif any(x in binary_path for x in ["Stock Game", "Game Root", "STOCK GAME", "Stock Game Folder", "Stock Folder", "Skyrim Stock", "root/Skyrim Special Edition"]):
                        # Stock/Game Root found
                        if modlist_sdcard:
                            path_middle = modlist_dir.split('mmcblk0p1', 1)[1] if 'mmcblk0p1' in modlist_dir else modlist_dir
                            # Strip /run/media/deck/UUID if present
                            if '/run/media/' in path_middle:
                                path_middle = '/' + path_middle.split('/run/media/', 1)[1].split('/', 2)[2]
                        else:
                            path_middle = modlist_dir

                        # Determine directory type
                        if "Stock Game" in binary_path:
                            dir_type = "stockgame"
                            path_end = '/' + '/'.join(binary_path.split('/Stock Game/', 1)[1].split('/')[:-1]) if '/Stock Game/' in binary_path else ""
                            bin_path_end = '/' + '/'.join(binary_path.split('/Stock Game/', 1)[1].split('/')) if '/Stock Game/' in binary_path else ""
                        elif "Game Root" in binary_path:
                            dir_type = "gameroot"
                            path_end = '/' + '/'.join(binary_path.split('/Game Root/', 1)[1].split('/')[:-1]) if '/Game Root/' in binary_path else ""
                            bin_path_end = '/' + '/'.join(binary_path.split('/Game Root/', 1)[1].split('/')) if '/Game Root/' in binary_path else ""
                        elif "STOCK GAME" in binary_path:
                            dir_type = "STOCKGAME"
                            path_end = '/' + '/'.join(binary_path.split('/STOCK GAME/', 1)[1].split('/')[:-1]) if '/STOCK GAME/' in binary_path else ""
                            bin_path_end = '/' + '/'.join(binary_path.split('/STOCK GAME/', 1)[1].split('/')) if '/STOCK GAME/' in binary_path else ""
                        elif "Stock Folder" in binary_path:
                            dir_type = "stockfolder"
                            path_end = '/' + '/'.join(binary_path.split('/Stock Folder/', 1)[1].split('/')[:-1]) if '/Stock Folder/' in binary_path else ""
                            bin_path_end = '/' + '/'.join(binary_path.split('/Stock Folder/', 1)[1].split('/')) if '/Stock Folder/' in binary_path else ""
                        elif "Skyrim Stock" in binary_path:
                            dir_type = "skyrimstock"
                            path_end = '/' + '/'.join(binary_path.split('/Skyrim Stock/', 1)[1].split('/')[:-1]) if '/Skyrim Stock/' in binary_path else ""
                            bin_path_end = '/' + '/'.join(binary_path.split('/Skyrim Stock/', 1)[1].split('/')) if '/Skyrim Stock/' in binary_path else ""
                        elif "Stock Game Folder" in binary_path:
                            dir_type = "stockgamefolder"
                            path_end = '/' + '/'.join(binary_path.split('/Stock Game Folder/', 1)[1].split('/')) if '/Stock Game Folder/' in binary_path else ""
                        elif "root/Skyrim Special Edition" in binary_path:
                            dir_type = "rootskyrimse"
                            path_end = '/' + binary_path.split('root/Skyrim Special Edition', 1)[1] if 'root/Skyrim Special Edition' in binary_path else ""
                            bin_path_end = '/' + binary_path.split('root/Skyrim Special Edition', 1)[1] if 'root/Skyrim Special Edition' in binary_path else ""

                    elif "steamapps" in binary_path:
                        # Steamapps found
                        if basegame_sdcard:
                            path_middle = steam_library.split('mmcblk0p1', 1)[1] if 'mmcblk0p1' in steam_library else steam_library
                            drive_letter = "D:"
                        else:
                            path_middle = steam_library.split('steamapps', 1)[0] if 'steamapps' in steam_library else steam_library

                        path_end = '/' + '/'.join(binary_path.split('/steamapps/', 1)[1].split('/')[:-1]) if '/steamapps/' in binary_path else ""
                        bin_path_end = '/' + '/'.join(binary_path.split('/steamapps/', 1)[1].split('/')) if '/steamapps/' in binary_path else ""

                    else:
                        logger.warning(f"No matching pattern found in the path: {binary_path}")
                        continue

                    # Combine paths
                    full_bin_path = f"{bin_path_start}={drive_letter}{path_middle}{bin_path_end}"
                    full_path = f"{path_start}={drive_letter}{path_middle}{path_end}"

                    # Replace forward slashes with double backslashes
                    new_path = full_path.replace('/', '\\\\')

                    # Update the lines
                    lines[i] = f"{full_bin_path}\n"

                    # Find and update the workingDirectory line
                    for j, working_line in enumerate(lines):
                        if working_line.startswith(path_start):
                            lines[j] = f"{new_path}\n"
                            break

            # Write the updated content back to the file
            with open(modlist_ini, 'w') as f:
                f.writelines(lines)

            logger.info("Executable paths updated successfully")
            return True
        except Exception as e:
            logger.error(f"Error updating executable paths: {e}")
            return False

    @staticmethod
    def find_proton_binary(proton_version: str):
        """
        Find the full path to the Proton binary given a version string (e.g., 'Proton 8.0', 'GE-Proton8-15').
        Searches standard Steam library locations.
        Returns the path to the 'files/bin/wine' executable, or None if not found.
        """
        # Clean up the version string for directory matching
        version_patterns = [proton_version, proton_version.replace(' ', '_'), proton_version.replace(' ', '')]
        # Standard Steam library locations
        steam_common_paths = [
            Path.home() / ".steam/steam/steamapps/common",
            Path.home() / ".local/share/Steam/steamapps/common",
            Path.home() / ".steam/root/steamapps/common"
        ]
        # Special handling for Proton 9: try all possible directory names
        if proton_version.strip().startswith("Proton 9"):
            proton9_candidates = ["Proton 9.0", "Proton 9.0 (Beta)"]
            for base_path in steam_common_paths:
                for name in proton9_candidates:
                    candidate = base_path / name / "files/bin/wine"
                    if candidate.is_file():
                        return str(candidate)
                # Fallback: any Proton 9* directory
                for subdir in base_path.glob("Proton 9*"):
                    wine_bin = subdir / "files/bin/wine"
                    if wine_bin.is_file():
                        return str(wine_bin)
        # General case: try version patterns
        for base_path in steam_common_paths:
            if not base_path.is_dir():
                continue
            for pattern in version_patterns:
                # Try a direct match for the Proton directory
                proton_dir = base_path / pattern
                wine_bin = proton_dir / "files/bin/wine"
                if wine_bin.is_file():
                    return str(wine_bin)
                # Try a glob for GE/other variants
                for subdir in base_path.glob(f"*{pattern}*"):
                    wine_bin = subdir / "files/bin/wine"
                    if wine_bin.is_file():
                        return str(wine_bin)
        # Fallback: try 'Proton - Experimental' if present
        for base_path in steam_common_paths:
            wine_bin = base_path / "Proton - Experimental" / "files/bin/wine"
            if wine_bin.is_file():
                logger.warning(f"Requested Proton version '{proton_version}' not found. Falling back to 'Proton - Experimental'.")
                return str(wine_bin)
        return None

    @staticmethod
    def get_proton_paths(appid: str) -> Tuple[Optional[str], Optional[str], Optional[str]]:
        """
        Get the Proton paths for a given AppID.

        Args:
            appid (str): The Steam AppID to get paths for

        Returns:
            tuple: (compatdata_path, proton_path, wine_bin) or (None, None, None) if not found
        """
        logger.info(f"Getting Proton paths for AppID {appid}")

        # Find compatdata path
        possible_compat_bases = [
            Path.home() / ".steam/steam/steamapps/compatdata",
            Path.home() / ".local/share/Steam/steamapps/compatdata"
        ]

        compatdata_path = None
        for base_path in possible_compat_bases:
            potential_compat_path = base_path / appid
            if potential_compat_path.is_dir():
                compatdata_path = str(potential_compat_path)
                logger.debug(f"Found compatdata directory: {compatdata_path}")
                break

        if not compatdata_path:
            logger.error(f"Could not find compatdata directory for AppID {appid}")
            return None, None, None

        # Get Proton version
        proton_version = WineUtils.get_proton_version(compatdata_path)
        if proton_version == "Unknown":
            logger.error(f"Could not determine Proton version for AppID {appid}")
            return None, None, None

        # Find Proton binary
        wine_bin = WineUtils.find_proton_binary(proton_version)
        if not wine_bin:
            logger.error(f"Could not find Proton binary for version {proton_version}")
            return None, None, None

        # Get Proton path (parent of wine binary)
        proton_path = str(Path(wine_bin).parent.parent)
        logger.debug(f"Found Proton path: {proton_path}")

        return compatdata_path, proton_path, wine_bin
5
jackify/backend/models/__init__.py
Normal file
@@ -0,0 +1,5 @@
"""
Backend Data Models

Data structures for passing context between frontend and backend.
"""
79
jackify/backend/models/configuration.py
Normal file
@@ -0,0 +1,79 @@
"""
Configuration Data Models

Data structures for configuration context between frontend and backend.
"""

from pathlib import Path
from typing import Optional, Dict, Any
from dataclasses import dataclass


@dataclass
class ConfigurationContext:
    """Context object for modlist configuration operations."""
    modlist_name: str
    install_dir: Path
    mo2_exe_path: Optional[Path] = None
    resolution: Optional[str] = None
    download_dir: Optional[Path] = None
    nexus_api_key: Optional[str] = None
    modlist_value: Optional[str] = None
    modlist_source: Optional[str] = None
    skip_confirmation: bool = False

    def __post_init__(self):
        """Convert string paths to Path objects."""
        if isinstance(self.install_dir, str):
            self.install_dir = Path(self.install_dir)
        if isinstance(self.download_dir, str):
            self.download_dir = Path(self.download_dir)
        if isinstance(self.mo2_exe_path, str):
            self.mo2_exe_path = Path(self.mo2_exe_path)

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for legacy compatibility."""
        return {
            'name': self.modlist_name,
            'path': str(self.install_dir),
            'mo2_exe_path': str(self.mo2_exe_path) if self.mo2_exe_path else None,
            'resolution': self.resolution,
            'download_dir': str(self.download_dir) if self.download_dir else None,
            'nexus_api_key': self.nexus_api_key,
            'modlist_value': self.modlist_value,
            'modlist_source': self.modlist_source,
            'skip_confirmation': self.skip_confirmation,
        }

    @classmethod
    def from_dict(cls, data: Dict[str, Any]) -> 'ConfigurationContext':
        """Create from dictionary for legacy compatibility."""
        return cls(
            modlist_name=data.get('name', data.get('modlist_name', '')),
            install_dir=Path(data.get('path', data.get('install_dir', ''))),
            mo2_exe_path=Path(data['mo2_exe_path']) if data.get('mo2_exe_path') else None,
            resolution=data.get('resolution'),
            download_dir=Path(data['download_dir']) if data.get('download_dir') else None,
            nexus_api_key=data.get('nexus_api_key'),
            modlist_value=data.get('modlist_value'),
            modlist_source=data.get('modlist_source'),
            skip_confirmation=data.get('skip_confirmation', False),
        )


@dataclass
class SystemInfo:
    """System information context."""
    is_steamdeck: bool
    steam_root: Optional[Path] = None
    steam_user_id: Optional[str] = None
    proton_version: Optional[str] = None

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary."""
        return {
            'is_steamdeck': self.is_steamdeck,
            'steam_root': str(self.steam_root) if self.steam_root else None,
            'steam_user_id': self.steam_user_id,
            'proton_version': self.proton_version,
        }
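# The __post_init__ hooks on these dataclasses let callers pass plain strings
# while downstream code always sees Path objects. A standalone illustration
# with a hypothetical minimal dataclass (Ctx) and invented sample paths:

```python
from dataclasses import dataclass
from pathlib import Path
from typing import Optional

@dataclass
class Ctx:
    install_dir: Path
    download_dir: Optional[Path] = None

    def __post_init__(self):
        # Coerce strings to Path, mirroring ConfigurationContext/ModlistContext.
        if isinstance(self.install_dir, str):
            self.install_dir = Path(self.install_dir)
        if isinstance(self.download_dir, str):
            self.download_dir = Path(self.download_dir)

ctx = Ctx(install_dir="/games/skyrim", download_dir="/downloads")
print(isinstance(ctx.install_dir, Path))  # True
```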
105
jackify/backend/models/modlist.py
Normal file
@@ -0,0 +1,105 @@
"""
Modlist Data Models

Data structures for passing modlist context between frontend and backend.
"""

from pathlib import Path
from typing import Optional, Dict, Any
from dataclasses import dataclass


@dataclass
class ModlistContext:
    """Context object for modlist operations."""
    name: str
    install_dir: Path
    download_dir: Path
    game_type: str
    nexus_api_key: str
    modlist_value: Optional[str] = None
    modlist_source: Optional[str] = None  # 'identifier' or 'file'
    resolution: Optional[str] = None
    mo2_exe_path: Optional[Path] = None
    skip_confirmation: bool = False
    engine_installed: bool = False  # True if installed via jackify-engine

    def __post_init__(self):
        """Convert string paths to Path objects."""
        if isinstance(self.install_dir, str):
            self.install_dir = Path(self.install_dir)
        if isinstance(self.download_dir, str):
            self.download_dir = Path(self.download_dir)
        if isinstance(self.mo2_exe_path, str):
            self.mo2_exe_path = Path(self.mo2_exe_path)

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary for legacy compatibility."""
        return {
            'modlist_name': self.name,
            'install_dir': str(self.install_dir),
            'download_dir': str(self.download_dir),
            'game_type': self.game_type,
            'nexus_api_key': self.nexus_api_key,
            'modlist_value': self.modlist_value,
            'modlist_source': self.modlist_source,
            'resolution': self.resolution,
            'mo2_exe_path': str(self.mo2_exe_path) if self.mo2_exe_path else None,
            'skip_confirmation': self.skip_confirmation,
            'engine_installed': self.engine_installed,
        }

    @classmethod
    def from_dict(cls, data: Dict[str, Any]) -> 'ModlistContext':
        """Create from dictionary for legacy compatibility."""
        return cls(
            name=data.get('modlist_name', ''),
            install_dir=Path(data.get('install_dir', '')),
            download_dir=Path(data.get('download_dir', '')),
            game_type=data.get('game_type', ''),
            nexus_api_key=data.get('nexus_api_key', ''),
            modlist_value=data.get('modlist_value'),
            modlist_source=data.get('modlist_source'),
            resolution=data.get('resolution'),
            mo2_exe_path=Path(data['mo2_exe_path']) if data.get('mo2_exe_path') else None,
            skip_confirmation=data.get('skip_confirmation', False),
            engine_installed=data.get('engine_installed', False),
        )


@dataclass
class ModlistInfo:
    """Information about a modlist from the engine."""
    id: str
    name: str
    game: str
    description: Optional[str] = None
    version: Optional[str] = None
    size: Optional[str] = None

    def to_dict(self) -> Dict[str, Any]:
        """Convert to dictionary."""
        result = {
            'id': self.id,
            'name': self.name,
            'game': self.game,
            'description': self.description,
            'version': self.version,
            'size': self.size,
        }

        # Include any dynamically added attributes
        if hasattr(self, 'machine_url'):
            result['machine_url'] = self.machine_url
        if hasattr(self, 'download_size'):
            result['download_size'] = self.download_size
        if hasattr(self, 'install_size'):
            result['install_size'] = self.install_size
        if hasattr(self, 'total_size'):
            result['total_size'] = self.total_size
        if hasattr(self, 'status_down'):
            result['status_down'] = self.status_down
        if hasattr(self, 'status_nsfw'):
            result['status_nsfw'] = self.status_nsfw

        return result
5
jackify/backend/services/__init__.py
Normal file
@@ -0,0 +1,5 @@
"""
Backend Services

High-level service classes that orchestrate handlers.
"""
271
jackify/backend/services/api_key_service.py
Normal file
@@ -0,0 +1,271 @@
|
||||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
"""
|
||||
API Key Service Module
|
||||
Centralized service for managing Nexus API keys across CLI and GUI frontends
|
||||
"""
|
||||
|
||||
import logging
|
||||
from typing import Optional, Tuple
|
||||
from ..handlers.config_handler import ConfigHandler
|
||||
|
||||
# Initialize logger
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class APIKeyService:
|
||||
"""
|
||||
Centralized service for managing Nexus API keys
|
||||
Handles saving, loading, and validation of API keys
|
||||
"""
|
||||
|
||||
def __init__(self):
|
||||
"""Initialize the API key service"""
|
||||
self.config_handler = ConfigHandler()
|
||||
logger.debug("APIKeyService initialized")
|
||||
|
||||
def save_api_key(self, api_key: str) -> bool:
|
||||
"""
|
||||
Save an API key to configuration
|
||||
|
||||
Args:
|
||||
api_key (str): The API key to save
|
||||
|
||||
Returns:
|
||||
bool: True if saved successfully, False otherwise
|
||||
"""
|
||||
try:
|
||||
# Validate API key format (basic check)
|
||||
if not self._validate_api_key_format(api_key):
|
||||
logger.warning("Invalid API key format provided")
|
||||
return False
|
||||
|
||||
# Check if we can write to config directory
|
||||
import os
|
||||
config_dir = os.path.dirname(self.config_handler.config_file)
|
||||
if not os.path.exists(config_dir):
|
||||
try:
|
||||
os.makedirs(config_dir, exist_ok=True)
|
||||
logger.debug(f"Created config directory: {config_dir}")
|
||||
except PermissionError:
|
||||
logger.error(f"Permission denied creating config directory: {config_dir}")
|
||||
return False
|
||||
except Exception as dir_error:
|
||||
logger.error(f"Error creating config directory: {dir_error}")
|
||||
return False
|
||||
|
||||
# Check write permissions
|
||||
if not os.access(config_dir, os.W_OK):
|
||||
logger.error(f"No write permission for config directory: {config_dir}")
|
||||
return False
|
||||
|
||||
success = self.config_handler.save_api_key(api_key)
|
||||
if success:
|
||||
logger.info("API key saved successfully")
|
||||
# Verify the save worked by reading it back
|
||||
saved_key = self.config_handler.get_api_key()
|
||||
if saved_key != api_key:
|
||||
logger.error("API key save verification failed - key mismatch")
|
||||
return False
|
||||
else:
|
||||
logger.error("Failed to save API key via config handler")
|
||||
|
||||
return success
|
||||
except Exception as e:
|
||||
logger.error(f"Error in save_api_key: {e}")
|
||||
return False
|
||||
|
||||
def get_saved_api_key(self) -> Optional[str]:
|
||||
"""
|
||||
Retrieve the saved API key from configuration
|
||||
|
||||
Returns:
|
||||
str: The decoded API key or None if not saved
|
||||
"""
|
||||
try:
|
||||
api_key = self.config_handler.get_api_key()
|
||||
if api_key:
|
||||
logger.debug("Retrieved saved API key")
|
||||
else:
|
||||
logger.debug("No saved API key found")
|
||||
return api_key
|
||||
except Exception as e:
|
||||
logger.error(f"Error retrieving API key: {e}")
|
||||
return None
|
||||
|
||||
def has_saved_api_key(self) -> bool:
|
||||
"""
|
||||
Check if an API key is saved in configuration
|
||||
|
||||
Returns:
|
||||
bool: True if API key exists, False otherwise
|
||||
"""
|
||||
try:
|
||||
return self.config_handler.has_saved_api_key()
|
||||
except Exception as e:
|
||||
logger.error(f"Error checking for saved API key: {e}")
|
||||
return False
|
||||
|
||||
def clear_saved_api_key(self) -> bool:
|
||||
"""
|
||||
Clear the saved API key from configuration
|
||||
|
||||
Returns:
|
||||
bool: True if cleared successfully, False otherwise
|
||||
"""
|
||||
try:
|
||||
success = self.config_handler.clear_api_key()
|
||||
if success:
|
||||
logger.info("API key cleared successfully")
|
||||
else:
|
||||
logger.error("Failed to clear API key")
|
||||
return success
|
||||
except Exception as e:
|
||||
logger.error(f"Error clearing API key: {e}")
|
||||
return False
|
||||
|
||||
    def get_api_key_for_session(self, provided_key: Optional[str] = None,
                                use_saved: bool = True) -> Tuple[Optional[str], str]:
        """
        Get the API key to use for a session, with priority logic

        Args:
            provided_key (str, optional): API key provided by user for this session
            use_saved (bool): Whether to use saved API key if no key provided

        Returns:
            tuple: (api_key, source) where source is 'provided', 'saved', or 'none'
        """
        try:
            # Priority 1: Use provided key if given
            if provided_key and self._validate_api_key_format(provided_key):
                logger.debug("Using provided API key for session")
                return provided_key, 'provided'

            # Priority 2: Use saved key if enabled and available
            if use_saved and self.has_saved_api_key():
                saved_key = self.get_saved_api_key()
                if saved_key:
                    logger.debug("Using saved API key for session")
                    return saved_key, 'saved'

            # No valid API key available
            logger.debug("No valid API key available for session")
            return None, 'none'

        except Exception as e:
            logger.error(f"Error getting API key for session: {e}")
            return None, 'none'

    def _validate_api_key_format(self, api_key: str) -> bool:
        """
        Validate basic API key format

        Args:
            api_key (str): API key to validate

        Returns:
            bool: True if format appears valid, False otherwise
        """
        if not api_key or not isinstance(api_key, str):
            return False

        # Basic validation: should be alphanumeric string of reasonable length.
        # Nexus API keys are typically 32+ characters, alphanumeric with some special chars.
        api_key = api_key.strip()
        if len(api_key) < 10:  # Too short to be valid
            return False

        if len(api_key) > 200:  # Unreasonably long
            return False

        # Should contain some alphanumeric characters
        if not any(c.isalnum() for c in api_key):
            return False

        return True

    def get_api_key_display(self, api_key: str, mask_after_chars: int = 4) -> str:
        """
        Get a masked version of the API key for display purposes

        Args:
            api_key (str): The API key to mask
            mask_after_chars (int): Number of characters to show before masking

        Returns:
            str: Masked API key for display
        """
        if not api_key:
            return ""

        if len(api_key) <= mask_after_chars:
            return "*" * len(api_key)

        visible_part = api_key[:mask_after_chars]
        masked_part = "*" * (len(api_key) - mask_after_chars)
        return visible_part + masked_part

    def validate_api_key_works(self, api_key: str) -> Tuple[bool, str]:
        """
        Validate that an API key actually works with the Nexus API

        Tests the key against the Nexus Mods validation endpoint

        Args:
            api_key (str): API key to validate

        Returns:
            tuple: (is_valid, message)
        """
        # First check format
        if not self._validate_api_key_format(api_key):
            return False, "API key format is invalid"

        try:
            import requests

            # Nexus API validation endpoint
            url = "https://api.nexusmods.com/v1/users/validate.json"
            headers = {
                'apikey': api_key,
                'User-Agent': 'Jackify/1.0'  # Required by Nexus API
            }

            # Set a reasonable timeout
            response = requests.get(url, headers=headers, timeout=10)

            if response.status_code == 200:
                # API key is valid
                try:
                    data = response.json()
                    username = data.get('name', 'Unknown')
                    # Don't log the actual API key - use masking
                    masked_key = self.get_api_key_display(api_key)
                    logger.info(f"API key validation successful for user: {username} (key: {masked_key})")
                    return True, f"API key valid for user: {username}"
                except Exception as json_error:
                    logger.warning(f"API key valid but couldn't parse user info: {json_error}")
                    return True, "API key is valid"
            elif response.status_code == 401:
                # Invalid API key
                logger.warning("API key validation failed: Invalid key")
                return False, "Invalid API key"
            elif response.status_code == 429:
                # Rate limited
                logger.warning("API key validation rate limited")
                return False, "Rate limited - try again later"
            else:
                # Other error
                logger.warning(f"API key validation failed with status {response.status_code}")
                return False, f"Validation failed (HTTP {response.status_code})"

        except requests.exceptions.Timeout:
            logger.warning("API key validation timed out")
            return False, "Validation timed out - check connection"
        except requests.exceptions.ConnectionError:
            logger.warning("API key validation connection error")
            return False, "Connection error - check internet"
        except Exception as e:
            logger.error(f"API key validation error: {e}")
            return False, f"Validation error: {str(e)}"
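The masking and format-check helpers above are self-contained enough to sketch outside the class. The following is a minimal standalone rendition for illustration; the function names `mask_api_key` and `looks_like_api_key` are not part of Jackify's API, just hypothetical stand-ins for `get_api_key_display` and `_validate_api_key_format`:

```python
def mask_api_key(api_key: str, visible: int = 4) -> str:
    """Show the first `visible` characters, mask the rest with asterisks."""
    if not api_key:
        return ""
    if len(api_key) <= visible:
        return "*" * len(api_key)
    return api_key[:visible] + "*" * (len(api_key) - visible)


def looks_like_api_key(api_key) -> bool:
    """Cheap format check: non-empty string, 10-200 chars, some alphanumerics."""
    if not api_key or not isinstance(api_key, str):
        return False
    api_key = api_key.strip()
    if not 10 <= len(api_key) <= 200:
        return False
    return any(c.isalnum() for c in api_key)
```

Note the format check is deliberately loose: it rejects obvious garbage without encoding assumptions about Nexus key structure, leaving real validation to the HTTP endpoint.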
2739
jackify/backend/services/automated_prefix_service.py
Normal file
File diff suppressed because it is too large
690
jackify/backend/services/modlist_service.py
Normal file
@@ -0,0 +1,690 @@
"""
Modlist Service

High-level service for modlist operations, orchestrating various handlers.
"""

import logging
from typing import List, Optional, Dict, Any
from pathlib import Path

from ..models.modlist import ModlistContext, ModlistInfo
from ..models.configuration import SystemInfo

logger = logging.getLogger(__name__)


class ModlistService:
    """Service for managing modlist operations."""

    def __init__(self, system_info: SystemInfo):
        """Initialize the modlist service.

        Args:
            system_info: System information context
        """
        self.system_info = system_info

        # Handlers will be initialized when needed
        self._modlist_handler = None
        self._wabbajack_handler = None
        self._filesystem_handler = None

    def _get_modlist_handler(self):
        """Lazy initialization of modlist handler."""
        if self._modlist_handler is None:
            from ..handlers.modlist_handler import ModlistHandler
            # Initialize with proper dependencies
            self._modlist_handler = ModlistHandler()
        return self._modlist_handler

    def _get_wabbajack_handler(self):
        """Lazy initialization of wabbajack handler."""
        if self._wabbajack_handler is None:
            from ..handlers.wabbajack_handler import InstallWabbajackHandler
            # Initialize with proper dependencies
            self._wabbajack_handler = InstallWabbajackHandler()
        return self._wabbajack_handler

    def _get_filesystem_handler(self):
        """Lazy initialization of filesystem handler."""
        if self._filesystem_handler is None:
            from ..handlers.filesystem_handler import FileSystemHandler
            self._filesystem_handler = FileSystemHandler()
        return self._filesystem_handler

    def list_modlists(self, game_type: Optional[str] = None) -> List[ModlistInfo]:
        """List available modlists.

        Args:
            game_type: Optional filter by game type

        Returns:
            List of available modlists
        """
        logger.info(f"Listing modlists for game_type: {game_type}")

        try:
            # Use the working ModlistInstallCLI to get modlists from engine
            from ..core.modlist_operations import ModlistInstallCLI

            # Use new SystemInfo pattern
            modlist_cli = ModlistInstallCLI(self.system_info)

            # Get all modlists and do client-side filtering for better control
            raw_modlists = modlist_cli.get_all_modlists_from_engine(game_type=None)

            # Apply client-side filtering based on game_type
            if game_type:
                game_type_lower = game_type.lower()

                if game_type_lower == 'skyrim':
                    # Include "Skyrim", "Skyrim Special Edition" and "Skyrim VR"
                    raw_modlists = [m for m in raw_modlists if 'skyrim' in m.get('game', '').lower()]
                elif game_type_lower == 'fallout4':
                    raw_modlists = [m for m in raw_modlists if 'fallout 4' in m.get('game', '').lower()]
                elif game_type_lower == 'falloutnv':
                    raw_modlists = [m for m in raw_modlists if 'fallout new vegas' in m.get('game', '').lower()]
                elif game_type_lower == 'oblivion':
                    raw_modlists = [m for m in raw_modlists if 'oblivion' in m.get('game', '').lower() and 'remastered' not in m.get('game', '').lower()]
                elif game_type_lower == 'starfield':
                    raw_modlists = [m for m in raw_modlists if 'starfield' in m.get('game', '').lower()]
                elif game_type_lower == 'oblivion_remastered':
                    raw_modlists = [m for m in raw_modlists if 'oblivion remastered' in m.get('game', '').lower()]
                elif game_type_lower == 'enderal':
                    raw_modlists = [m for m in raw_modlists if 'enderal' in m.get('game', '').lower()]
                elif game_type_lower == 'other':
                    # Exclude all main category games to show only "Other" games
                    main_category_keywords = ['skyrim', 'fallout 4', 'fallout new vegas', 'oblivion', 'starfield', 'enderal']

                    def is_main_category(game_name):
                        game_lower = game_name.lower()
                        return any(keyword in game_lower for keyword in main_category_keywords)

                    raw_modlists = [m for m in raw_modlists if not is_main_category(m.get('game', ''))]

            # Convert to ModlistInfo objects with enhanced metadata
            modlists = []
            for m_info in raw_modlists:
                modlist_info = ModlistInfo(
                    id=m_info.get('id', ''),
                    name=m_info.get('name', m_info.get('id', '')),  # Use name from enhanced data
                    game=m_info.get('game', ''),
                    description='',  # Engine doesn't provide description yet
                    version='',  # Engine doesn't provide version yet
                    size=f"{m_info.get('download_size', '')}|{m_info.get('install_size', '')}|{m_info.get('total_size', '')}"  # Store all three sizes
                )

                # Add enhanced metadata as additional properties
                if hasattr(modlist_info, '__dict__'):
                    modlist_info.__dict__.update({
                        'download_size': m_info.get('download_size', ''),
                        'install_size': m_info.get('install_size', ''),
                        'total_size': m_info.get('total_size', ''),
                        'machine_url': m_info.get('machine_url', ''),  # Store machine URL for installation
                        'status_down': m_info.get('status_down', False),
                        'status_nsfw': m_info.get('status_nsfw', False)
                    })

                modlists.append(modlist_info)

            logger.info(f"Found {len(modlists)} modlists")
            return modlists

        except Exception as e:
            logger.error(f"Failed to list modlists: {e}")
            raise

    def install_modlist(self, context: ModlistContext,
                        progress_callback=None,
                        output_callback=None) -> bool:
        """Install a modlist (ONLY installation, no configuration).

        This method only runs the engine installation phase.
        Configuration must be called separately after Steam setup.

        Args:
            context: Modlist installation context
            progress_callback: Optional callback for progress updates
            output_callback: Optional callback for output/logging

        Returns:
            True if installation successful, False otherwise
        """
        logger.info(f"Installing modlist (INSTALLATION ONLY): {context.name}")

        try:
            # Validate context
            if not self._validate_install_context(context):
                logger.error("Invalid installation context")
                return False

            # Prepare directories
            fs_handler = self._get_filesystem_handler()
            fs_handler.ensure_directory(context.install_dir)
            fs_handler.ensure_directory(context.download_dir)

            # Use the working ModlistInstallCLI for discovery phase only
            from ..core.modlist_operations import ModlistInstallCLI

            # Use new SystemInfo pattern
            modlist_cli = ModlistInstallCLI(self.system_info)

            # Build context for ModlistInstallCLI
            install_context = {
                'modlist_name': context.name,
                'install_dir': context.install_dir,
                'download_dir': context.download_dir,
                'nexus_api_key': context.nexus_api_key,
                'game_type': context.game_type,
                'modlist_value': context.modlist_value,
                'resolution': getattr(context, 'resolution', None),
                'skip_confirmation': True  # Service layer should be non-interactive
            }

            # Set GUI mode for non-interactive operation
            import os
            original_gui_mode = os.environ.get('JACKIFY_GUI_MODE')
            os.environ['JACKIFY_GUI_MODE'] = '1'

            try:
                # Run discovery phase with pre-filled context
                confirmed_context = modlist_cli.run_discovery_phase(context_override=install_context)
                if not confirmed_context:
                    logger.error("Discovery phase failed or was cancelled")
                    return False

                # Now run ONLY the installation part (NOT configuration)
                success = self._run_installation_only(
                    confirmed_context,
                    progress_callback=progress_callback,
                    output_callback=output_callback
                )

                if success:
                    logger.info("Modlist installation completed successfully (configuration will be done separately)")
                    return True
                else:
                    logger.error("Modlist installation failed")
                    return False

            finally:
                # Restore original GUI mode
                if original_gui_mode is not None:
                    os.environ['JACKIFY_GUI_MODE'] = original_gui_mode
                else:
                    os.environ.pop('JACKIFY_GUI_MODE', None)

        except Exception as e:
            error_message = str(e)
            logger.error(f"Failed to install modlist {context.name}: {error_message}")

            # Check for file descriptor limit issues and attempt to handle them
            from .resource_manager import handle_file_descriptor_error
            try:
                if any(indicator in error_message.lower() for indicator in ['too many open files', 'emfile', 'resource temporarily unavailable']):
                    result = handle_file_descriptor_error(error_message, "modlist installation")
                    if result['auto_fix_success']:
                        logger.info(f"File descriptor limit increased automatically. {result['recommendation']}")
                    elif result['error_detected']:
                        logger.warning(f"File descriptor limit issue detected but automatic fix failed. {result['recommendation']}")
                        if result['manual_instructions']:
                            distro = result['manual_instructions']['distribution']
                            logger.info(f"Manual ulimit increase instructions available for {distro} distribution")
            except Exception as resource_error:
                logger.debug(f"Error checking for resource limit issues: {resource_error}")

            return False

    def _run_installation_only(self, context, progress_callback=None, output_callback=None) -> bool:
        """Run only the installation phase using the engine (copied from working code)."""
        import subprocess
        import os
        from pathlib import Path
        from ..core.modlist_operations import get_jackify_engine_path

        try:
            # Copied from the working Archive_Do_Not_Write/modules/modlist_install_cli.py

            # Process paths (copied from working code)
            install_dir_context = context['install_dir']
            if isinstance(install_dir_context, tuple):
                actual_install_path = Path(install_dir_context[0])
                if install_dir_context[1]:
                    actual_install_path.mkdir(parents=True, exist_ok=True)
            else:
                actual_install_path = Path(install_dir_context)
            install_dir_str = str(actual_install_path)

            download_dir_context = context['download_dir']
            if isinstance(download_dir_context, tuple):
                actual_download_path = Path(download_dir_context[0])
                if download_dir_context[1]:
                    actual_download_path.mkdir(parents=True, exist_ok=True)
            else:
                actual_download_path = Path(download_dir_context)
            download_dir_str = str(actual_download_path)

            api_key = context['nexus_api_key']

            # Path to the engine binary (copied from working code)
            engine_path = get_jackify_engine_path()
            engine_dir = os.path.dirname(engine_path)
            if not os.path.isfile(engine_path) or not os.access(engine_path, os.X_OK):
                if output_callback:
                    output_callback(f"Jackify Install Engine not found or not executable at: {engine_path}")
                return False

            # Build command (copied from working code)
            cmd = [engine_path, 'install']
            modlist_value = context.get('modlist_value')
            if modlist_value and modlist_value.endswith('.wabbajack') and os.path.isfile(modlist_value):
                cmd += ['-w', modlist_value]
            elif modlist_value:
                cmd += ['-m', modlist_value]
            elif context.get('machineid'):
                cmd += ['-m', context['machineid']]
            cmd += ['-o', install_dir_str, '-d', download_dir_str]

            # Check for debug mode and add --debug flag
            from ..handlers.config_handler import ConfigHandler
            config_handler = ConfigHandler()
            debug_mode = config_handler.get('debug_mode', False)
            if debug_mode:
                cmd.append('--debug')
                logger.debug("DEBUG: Added --debug flag to jackify-engine command")

            # NOTE: API key is passed via environment variable only, not as a command-line argument

            # Store original environment values (copied from working code)
            original_env_values = {
                'NEXUS_API_KEY': os.environ.get('NEXUS_API_KEY'),
                'DOTNET_SYSTEM_GLOBALIZATION_INVARIANT': os.environ.get('DOTNET_SYSTEM_GLOBALIZATION_INVARIANT')
            }

            try:
                # Environment setup (copied from working code)
                if api_key:
                    os.environ['NEXUS_API_KEY'] = api_key
                elif 'NEXUS_API_KEY' in os.environ:
                    del os.environ['NEXUS_API_KEY']

                os.environ['DOTNET_SYSTEM_GLOBALIZATION_INVARIANT'] = "1"

                pretty_cmd = ' '.join([f'"{arg}"' if ' ' in arg else arg for arg in cmd])
                if output_callback:
                    output_callback(f"Launching Jackify Install Engine with command: {pretty_cmd}")

                # Temporarily increase file descriptor limit for engine process
                from jackify.backend.handlers.subprocess_utils import increase_file_descriptor_limit
                success, old_limit, new_limit, message = increase_file_descriptor_limit()
                if output_callback:
                    if success:
                        output_callback(f"File descriptor limit: {message}")
                    else:
                        output_callback(f"File descriptor limit warning: {message}")

                # Subprocess call (copied from working code)
                proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, text=False, env=None, cwd=engine_dir)

                # Stream output byte-by-byte so both \n- and \r-terminated lines reach the callback
                buffer = b''
                while True:
                    chunk = proc.stdout.read(1)
                    if not chunk:
                        break
                    buffer += chunk

                    if chunk in (b'\n', b'\r'):
                        line = buffer.decode('utf-8', errors='replace')
                        if output_callback:
                            output_callback(line.rstrip())
                        buffer = b''

                if buffer:
                    line = buffer.decode('utf-8', errors='replace')
                    if output_callback:
                        output_callback(line.rstrip())

                proc.wait()
                if proc.returncode != 0:
                    if output_callback:
                        output_callback(f"Jackify Install Engine exited with code {proc.returncode}.")
                    return False
                else:
                    if output_callback:
                        output_callback("Installation completed successfully")
                    return True

            finally:
                # Restore environment (copied from working code)
                for key, original_value in original_env_values.items():
                    if original_value is not None:
                        os.environ[key] = original_value
                    elif key in os.environ:
                        del os.environ[key]

        except Exception as e:
            error_msg = f"Error running Jackify Install Engine: {e}"
            logger.error(error_msg)
            if output_callback:
                output_callback(error_msg)
            return False

    def configure_modlist_post_steam(self, context: ModlistContext,
                                     progress_callback=None,
                                     manual_steps_callback=None,
                                     completion_callback=None) -> bool:
        """Configure a modlist after Steam setup is complete.

        This method should only be called AFTER:
        1. Modlist installation is complete
        2. Steam shortcut has been created
        3. Steam has been restarted
        4. Manual Proton steps have been completed

        Args:
            context: Modlist context with updated app_id
            progress_callback: Optional callback for progress updates
            manual_steps_callback: Called when manual steps are needed
            completion_callback: Called when configuration is complete

        Returns:
            True if configuration successful, False otherwise
        """
        logger.info(f"Configuring modlist after Steam setup: {context.name}")

        # Check if debug mode is enabled and create debug callback
        from ..handlers.config_handler import ConfigHandler
        config_handler = ConfigHandler()
        debug_mode = config_handler.get('debug_mode', False)

        def debug_callback(message):
            """Send debug message to GUI if debug mode is enabled"""
            if debug_mode and progress_callback:
                progress_callback(f"[DEBUG] {message}")

        debug_callback(f"Starting configuration for {context.name}")
        debug_callback(f"Debug mode enabled: {debug_mode}")
        debug_callback(f"Install directory: {context.install_dir}")
        debug_callback(f"Resolution: {getattr(context, 'resolution', 'Not set')}")
        debug_callback(f"App ID: {getattr(context, 'app_id', 'Not set')}")

        # Set up a custom logging handler to capture backend DEBUG messages.
        # The logger-name list is shared between installation and cleanup below.
        gui_log_handler = None
        backend_logger_names = [
            'jackify.backend.handlers.menu_handler',
            'jackify.backend.handlers.modlist_handler',
            'jackify.backend.handlers.install_wabbajack_handler',
            'jackify.backend.handlers.wabbajack_handler',
            'jackify.backend.handlers.shortcut_handler',
            'jackify.backend.handlers.protontricks_handler',
            'jackify.backend.handlers.validation_handler',
            'jackify.backend.handlers.resolution_handler'
        ]
        if debug_mode and progress_callback:
            import logging

            class GuiLogHandler(logging.Handler):
                def __init__(self, callback):
                    super().__init__()
                    self.callback = callback
                    self.setLevel(logging.DEBUG)

                def emit(self, record):
                    try:
                        msg = self.format(record)
                        if record.levelno == logging.DEBUG:
                            self.callback(f"[DEBUG] {msg}")
                        elif record.levelno >= logging.WARNING:
                            self.callback(f"[{record.levelname}] {msg}")
                    except Exception:
                        pass

            gui_log_handler = GuiLogHandler(progress_callback)
            gui_log_handler.setFormatter(logging.Formatter('%(message)s'))

            # Add the GUI handler to key backend loggers
            for logger_name in backend_logger_names:
                backend_logger = logging.getLogger(logger_name)
                backend_logger.addHandler(gui_log_handler)
                backend_logger.setLevel(logging.DEBUG)

            debug_callback("GUI logging handler installed for backend services")

        try:
            # Reuse the working menu handler for the configuration-only phase
            from ..handlers.menu_handler import ModlistMenuHandler

            # Initialize handlers (same as working code)
            modlist_menu = ModlistMenuHandler(config_handler)

            # Build configuration context (copied from working code)
            config_context = {
                'name': context.name,
                'path': str(context.install_dir),
                'mo2_exe_path': str(context.install_dir / 'ModOrganizer.exe'),
                'resolution': getattr(context, 'resolution', None),
                'skip_confirmation': True,  # Service layer should be non-interactive
                'manual_steps_completed': True,  # Manual steps were done in GUI
                'appid': getattr(context, 'app_id', None),  # Use updated app_id from Steam
                'engine_installed': getattr(context, 'engine_installed', False)  # Path manipulation flag
            }

            debug_callback(f"Configuration context built: {config_context}")
            debug_callback("Setting up GUI mode and stdout redirection")

            # Set GUI mode for proper callback handling
            import os
            original_gui_mode = os.environ.get('JACKIFY_GUI_MODE')
            original_stdout = None

            try:
                # Force GUI mode to prevent input prompts
                os.environ['JACKIFY_GUI_MODE'] = '1'

                # Redirect print output to capture progress messages
                import sys

                # Create a custom stdout that forwards to the GUI
                class GuiRedirectStdout:
                    def __init__(self, callback):
                        self.callback = callback
                        self.buffer = ""

                    def write(self, text):
                        if self.callback and text.strip():
                            # Convert ANSI codes to HTML for colored GUI output
                            try:
                                from ...frontends.gui.utils import ansi_to_html
                                # Clean up carriage returns but preserve ANSI colors
                                clean_text = text.replace('\r', '').strip()
                                if clean_text and clean_text != "Current Task: ":
                                    # Convert ANSI to HTML for colored display
                                    html_text = ansi_to_html(clean_text)
                                    self.callback(html_text)
                            except ImportError:
                                # Fallback: strip ANSI codes if ansi_to_html is not available
                                import re
                                clean_text = re.sub(r'\x1b\[[0-9;]*[mK]', '', text)
                                clean_text = clean_text.replace('\r', '').strip()
                                if clean_text and clean_text != "Current Task: ":
                                    self.callback(clean_text)
                        return len(text)

                    def flush(self):
                        pass

                # Redirect stdout to capture print statements
                if progress_callback:
                    original_stdout = sys.stdout
                    sys.stdout = GuiRedirectStdout(progress_callback)

                # Call the working configuration-only method
                debug_callback("Calling run_modlist_configuration_phase")
                success = modlist_menu.run_modlist_configuration_phase(config_context)
                debug_callback(f"Configuration phase result: {success}")

                # Restore stdout before calling the completion callback
                if original_stdout:
                    sys.stdout = original_stdout
                    original_stdout = None

                if completion_callback:
                    if success:
                        debug_callback("Configuration completed successfully, calling completion callback")
                        completion_callback(True, "Configuration completed successfully!", context.name)
                    else:
                        debug_callback("Configuration failed, calling completion callback with failure")
                        completion_callback(False, "Configuration failed", context.name)

                return success

            finally:
                # Always restore stdout and environment
                if original_stdout:
                    sys.stdout = original_stdout

                if original_gui_mode is not None:
                    os.environ['JACKIFY_GUI_MODE'] = original_gui_mode
                else:
                    os.environ.pop('JACKIFY_GUI_MODE', None)

                # Remove the GUI log handler to avoid memory leaks
                if gui_log_handler:
                    for logger_name in backend_logger_names:
                        logging.getLogger(logger_name).removeHandler(gui_log_handler)

        except Exception as e:
            logger.error(f"Failed to configure modlist {context.name}: {e}")
            if completion_callback:
                completion_callback(False, f"Configuration failed: {e}", context.name)

            # Clean up the GUI log handler on exception
            if gui_log_handler:
                for logger_name in backend_logger_names:
                    logging.getLogger(logger_name).removeHandler(gui_log_handler)

            return False

    def configure_modlist(self, context: ModlistContext,
                          progress_callback=None,
                          manual_steps_callback=None,
                          completion_callback=None,
                          output_callback=None) -> bool:
        """Configure a modlist after installation.

        Args:
            context: Modlist context
            progress_callback: Optional callback for progress updates
            manual_steps_callback: Optional callback for manual steps
            completion_callback: Optional callback for completion
            output_callback: Optional callback for output/logging

        Returns:
            True if configuration successful, False otherwise
        """
        logger.info(f"Configuring modlist: {context.name}")

        try:
            # Use the working ModlistMenuHandler for configuration
            from ..handlers.menu_handler import ModlistMenuHandler
            from ..handlers.config_handler import ConfigHandler

            config_handler = ConfigHandler()
            modlist_menu = ModlistMenuHandler(config_handler)

            # Build configuration context
            config_context = {
                'name': context.name,
                'path': str(context.install_dir),
                'mo2_exe_path': str(context.install_dir / 'ModOrganizer.exe'),
                'resolution': getattr(context, 'resolution', None),
                'skip_confirmation': True,  # Service layer should be non-interactive
                'manual_steps_completed': False
            }

            # Run the complete configuration phase
            success = modlist_menu.run_modlist_configuration_phase(config_context)

            if success:
                logger.info("Modlist configuration completed successfully")
                if completion_callback:
                    completion_callback(True, "Configuration completed successfully", context.name)
            else:
                logger.warning("Modlist configuration had issues")
                if completion_callback:
                    completion_callback(False, "Configuration failed", context.name)

            return success

        except Exception as e:
            logger.error(f"Failed to configure modlist {context.name}: {e}")
            return False

def _validate_install_context(self, context: ModlistContext) -> bool:
|
||||
"""Validate that the installation context is complete and valid.
|
||||
|
||||
Args:
|
||||
context: The context to validate
|
||||
|
||||
Returns:
|
||||
True if valid, False otherwise
|
||||
"""
|
||||
if not context.name:
|
||||
logger.error("Modlist name is required")
|
||||
return False
|
||||
|
||||
if not context.install_dir:
|
||||
logger.error("Install directory is required")
|
||||
return False
|
||||
|
||||
if not context.download_dir:
|
||||
logger.error("Download directory is required")
|
||||
return False
|
||||
|
||||
if not context.nexus_api_key:
|
||||
logger.error("Nexus API key is required")
|
||||
return False
|
||||
|
||||
if not context.game_type:
|
||||
logger.error("Game type is required")
|
||||
return False
|
||||
|
||||
return True
|
||||
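The checks in `_validate_install_context` reduce to "every required field is truthy"; a minimal stand-in (the dataclass below is illustrative, not the real `ModlistContext`) shows the behavior:

```python
from dataclasses import dataclass
from pathlib import Path
from typing import Optional

@dataclass
class StubContext:
    # Illustrative stand-in for ModlistContext with the five required fields
    name: str = ""
    install_dir: Optional[Path] = None
    download_dir: Optional[Path] = None
    nexus_api_key: str = ""
    game_type: str = ""

def is_complete(ctx) -> bool:
    # Same rule as _validate_install_context: all required fields must be set
    return all([ctx.name, ctx.install_dir, ctx.download_dir,
                ctx.nexus_api_key, ctx.game_type])
```

An empty context fails the check; one with all five fields populated passes.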
406 jackify/backend/services/native_steam_service.py Normal file
@@ -0,0 +1,406 @@
#!/usr/bin/env python3
"""
Native Steam Shortcut and Proton Management Service

This service replaces STL entirely with native Python VDF manipulation.
Handles both shortcut creation and Proton version setting reliably.
"""

import os
import sys
import time
import logging
import hashlib
import vdf
from pathlib import Path
from typing import Optional, Tuple, Dict, Any, List

logger = logging.getLogger(__name__)


class NativeSteamService:
    """
    Native Steam shortcut and Proton management service.

    This completely replaces STL with reliable VDF manipulation that:
    1. Creates shortcuts with proper VDF structure
    2. Sets Proton versions in the correct config files
    3. Never corrupts existing shortcuts
    """

    def __init__(self):
        self.steam_path = Path.home() / ".steam" / "steam"
        self.userdata_path = self.steam_path / "userdata"
        self.user_id = None
        self.user_config_path = None

    def find_steam_user(self) -> bool:
        """Find the active Steam user directory"""
        try:
            if not self.userdata_path.exists():
                logger.error("Steam userdata directory not found")
                return False

            # Find the first user directory (usually there's only one)
            user_dirs = [d for d in self.userdata_path.iterdir() if d.is_dir() and d.name.isdigit()]
            if not user_dirs:
                logger.error("No Steam user directories found")
                return False

            # Use the first user directory
            user_dir = user_dirs[0]
            self.user_id = user_dir.name
            self.user_config_path = user_dir / "config"

            logger.info(f"Found Steam user: {self.user_id}")
            logger.info(f"User config path: {self.user_config_path}")
            return True

        except Exception as e:
            logger.error(f"Error finding Steam user: {e}")
            return False

    def get_shortcuts_vdf_path(self) -> Optional[Path]:
        """Get the path to shortcuts.vdf"""
        if not self.user_config_path:
            if not self.find_steam_user():
                return None

        # Path may not exist yet; callers handle creation
        return self.user_config_path / "shortcuts.vdf"

    def get_localconfig_vdf_path(self) -> Optional[Path]:
        """Get the path to localconfig.vdf"""
        if not self.user_config_path:
            if not self.find_steam_user():
                return None

        return self.user_config_path / "localconfig.vdf"

    def read_shortcuts_vdf(self) -> Dict[str, Any]:
        """Read the shortcuts.vdf file safely"""
        shortcuts_path = self.get_shortcuts_vdf_path()
        if not shortcuts_path:
            return {'shortcuts': {}}

        try:
            if shortcuts_path.exists():
                with open(shortcuts_path, 'rb') as f:
                    data = vdf.binary_load(f)
                return data
            else:
                logger.info("shortcuts.vdf does not exist, will create new one")
                return {'shortcuts': {}}

        except Exception as e:
            logger.error(f"Error reading shortcuts.vdf: {e}")
            return {'shortcuts': {}}

    def write_shortcuts_vdf(self, data: Dict[str, Any]) -> bool:
        """Write the shortcuts.vdf file safely"""
        shortcuts_path = self.get_shortcuts_vdf_path()
        if not shortcuts_path:
            return False

        try:
            # Create backup first
            if shortcuts_path.exists():
                backup_path = shortcuts_path.with_suffix(f".vdf.backup_{int(time.time())}")
                import shutil
                shutil.copy2(shortcuts_path, backup_path)
                logger.info(f"Created backup: {backup_path}")

            # Ensure parent directory exists
            shortcuts_path.parent.mkdir(parents=True, exist_ok=True)

            # Write the VDF file
            with open(shortcuts_path, 'wb') as f:
                vdf.binary_dump(data, f)

            logger.info("Successfully wrote shortcuts.vdf")
            return True

        except Exception as e:
            logger.error(f"Error writing shortcuts.vdf: {e}")
            return False

    def generate_app_id(self, app_name: str, exe_path: str) -> Tuple[int, int]:
        """
        Generate AppID using STL's exact algorithm (MD5-based).

        This matches STL's generateShortcutVDFAppId and generateSteamShortID functions:
        1. Combine AppName + ExePath
        2. Generate MD5 hash, take first 8 characters
        3. Convert to decimal, make negative, ensure < 1 billion
        4. Convert signed to unsigned for CompatToolMapping

        Returns:
            (signed_app_id, unsigned_app_id) - Both the signed and unsigned versions
        """
        # STL's algorithm: MD5 of app_name + exe_path
        input_string = f"{app_name}{exe_path}"

        # Generate MD5 hash and take first 8 characters
        md5_hash = hashlib.md5(input_string.encode('utf-8')).hexdigest()
        seed = md5_hash[:8]

        # Convert hex to decimal and make it negative with modulo 1 billion
        seed_decimal = int(seed, 16)
        signed_app_id = -(seed_decimal % 1000000000)

        # Convert to unsigned using steam-conductor/trentondyck method (signed_app_id + 2^32)
        unsigned_app_id = signed_app_id + 2**32

        logger.info(f"Generated AppID using STL algorithm for '{app_name}' + '{exe_path}': {signed_app_id} (unsigned: {unsigned_app_id})")
        return signed_app_id, unsigned_app_id

    def create_shortcut(self, app_name: str, exe_path: str, start_dir: str = None,
                        launch_options: str = "%command%", tags: List[str] = None) -> Tuple[bool, Optional[int]]:
        """
        Create a Steam shortcut using direct VDF manipulation.

        Args:
            app_name: The shortcut name
            exe_path: Path to the executable
            start_dir: Start directory (defaults to exe directory)
            launch_options: Launch options (defaults to "%command%")
            tags: List of tags to apply

        Returns:
            (success, unsigned_app_id) - Success status and the AppID
        """
        if not start_dir:
            start_dir = str(Path(exe_path).parent)

        if not tags:
            tags = ["Jackify"]

        logger.info(f"Creating shortcut '{app_name}' for '{exe_path}'")

        try:
            # Read current shortcuts
            data = self.read_shortcuts_vdf()
            shortcuts = data.get('shortcuts', {})

            # Generate AppID
            signed_app_id, unsigned_app_id = self.generate_app_id(app_name, exe_path)

            # Find next available index
            indices = [int(k) for k in shortcuts.keys() if k.isdigit()]
            next_index = max(indices, default=-1) + 1

            # Get icon path from SteamIcons directory if available
            icon_path = ''
            steamicons_dir = Path(exe_path).parent / "SteamIcons"
            if steamicons_dir.is_dir():
                grid_tall_icon = steamicons_dir / "grid-tall.png"
                if grid_tall_icon.exists():
                    icon_path = str(grid_tall_icon)
                    logger.info(f"Using icon from SteamIcons: {icon_path}")
                else:
                    # Look for any PNG file
                    png_files = list(steamicons_dir.glob("*.png"))
                    if png_files:
                        icon_path = str(png_files[0])
                        logger.info(f"Using fallback icon: {icon_path}")

            # Create the shortcut entry with proper structure
            shortcut_entry = {
                'appid': signed_app_id,  # Use signed AppID in shortcuts.vdf
                'AppName': app_name,
                'Exe': f'"{exe_path}"',
                'StartDir': f'"{start_dir}"',
                'icon': icon_path,
                'ShortcutPath': '',
                'LaunchOptions': launch_options,
                'IsHidden': 0,
                'AllowDesktopConfig': 1,
                'AllowOverlay': 1,
                'OpenVR': 0,
                'Devkit': 0,
                'DevkitGameID': '',
                'DevkitOverrideAppID': 0,
                'LastPlayTime': 0,
                'IsInstalled': 1,  # Mark as installed so it appears in "Installed locally"
                'FlatpakAppID': '',
                'tags': {}
            }

            # Add tags
            for i, tag in enumerate(tags):
                shortcut_entry['tags'][str(i)] = tag

            # Add to shortcuts
            shortcuts[str(next_index)] = shortcut_entry
            data['shortcuts'] = shortcuts

            # Write back to file
            if self.write_shortcuts_vdf(data):
                logger.info(f"✅ Shortcut created successfully at index {next_index}")
                return True, unsigned_app_id
            else:
                logger.error("❌ Failed to write shortcut to VDF")
                return False, None

        except Exception as e:
            logger.error(f"❌ Error creating shortcut: {e}")
            return False, None

    def set_proton_version(self, app_id: int, proton_version: str = "proton_experimental") -> bool:
        """
        Set the Proton version for a specific app using ONLY config.vdf like steam-conductor does.

        Args:
            app_id: The unsigned AppID
            proton_version: The Proton version to set

        Returns:
            True if successful
        """
        logger.info(f"Setting Proton version '{proton_version}' for AppID {app_id} using STL-compatible format")

        try:
            # Step 1: Write to the main config.vdf for CompatToolMapping
            config_path = self.steam_path / "config" / "config.vdf"

            if not config_path.exists():
                logger.error(f"Steam config.vdf not found at: {config_path}")
                return False

            # Create backup first
            backup_path = config_path.with_suffix(f".vdf.backup_{int(time.time())}")
            import shutil
            shutil.copy2(config_path, backup_path)
            logger.info(f"Created backup: {backup_path}")

            # Read the file as text to avoid VDF library formatting issues
            with open(config_path, 'r', encoding='utf-8', errors='ignore') as f:
                config_text = f.read()

            # Find the CompatToolMapping section
            compat_start = config_text.find('"CompatToolMapping"')
            if compat_start == -1:
                logger.error("CompatToolMapping section not found in config.vdf")
                return False

            # Find the closing brace for CompatToolMapping
            # Look for the opening brace after CompatToolMapping
            brace_start = config_text.find('{', compat_start)
            if brace_start == -1:
                logger.error("CompatToolMapping opening brace not found")
                return False

            # Count braces to find the matching closing brace
            brace_count = 1
            pos = brace_start + 1
            compat_end = -1

            while pos < len(config_text) and brace_count > 0:
                if config_text[pos] == '{':
                    brace_count += 1
                elif config_text[pos] == '}':
                    brace_count -= 1
                    if brace_count == 0:
                        compat_end = pos
                        break
                pos += 1

            if compat_end == -1:
                logger.error("CompatToolMapping closing brace not found")
                return False

            # Check if this AppID already exists
            app_id_pattern = f'"{app_id}"'
            app_id_exists = app_id_pattern in config_text[compat_start:compat_end]

            if app_id_exists:
                logger.info(f"AppID {app_id} already exists in CompatToolMapping, will be overwritten")
                # Remove the existing entry by finding and removing the entire block
                # This is complex, so for now just add at the end

            # Create the new entry in STL's exact format (tabs between key and value)
            new_entry = f'\t\t\t\t\t"{app_id}"\n\t\t\t\t\t{{\n\t\t\t\t\t\t"name"\t\t"{proton_version}"\n\t\t\t\t\t\t"config"\t\t""\n\t\t\t\t\t\t"priority"\t\t"250"\n\t\t\t\t\t}}\n'

            # Insert the new entry just before the closing brace of CompatToolMapping
            new_config_text = config_text[:compat_end] + new_entry + config_text[compat_end:]

            # Write back the modified text
            with open(config_path, 'w', encoding='utf-8') as f:
                f.write(new_config_text)

            logger.info(f"✅ Successfully set Proton version '{proton_version}' for AppID {app_id} using config.vdf only (steam-conductor method)")
            return True

        except Exception as e:
            logger.error(f"❌ Error setting Proton version: {e}")
            return False

    def create_shortcut_with_proton(self, app_name: str, exe_path: str, start_dir: str = None,
                                    launch_options: str = "%command%", tags: List[str] = None,
                                    proton_version: str = "proton_experimental") -> Tuple[bool, Optional[int]]:
        """
        Complete workflow: Create shortcut and set Proton version.

        This is the main method that replaces STL entirely.

        Returns:
            (success, app_id) - Success status and the AppID
        """
        logger.info(f"Creating shortcut with Proton: '{app_name}' -> '{proton_version}'")

        # Step 1: Create the shortcut
        success, app_id = self.create_shortcut(app_name, exe_path, start_dir, launch_options, tags)
        if not success:
            logger.error("Failed to create shortcut")
            return False, None

        # Step 2: Set the Proton version
        if not self.set_proton_version(app_id, proton_version):
            logger.error("Failed to set Proton version (shortcut still created)")
            return False, app_id  # Shortcut exists but Proton setting failed

        logger.info(f"✅ Complete workflow successful: '{app_name}' with '{proton_version}'")
        return True, app_id

    def list_shortcuts(self) -> Dict[str, str]:
        """List all existing shortcuts (for debugging)"""
        shortcuts = self.read_shortcuts_vdf().get('shortcuts', {})
        shortcut_list = {}

        for index, shortcut in shortcuts.items():
            app_name = shortcut.get('AppName', 'Unknown')
            shortcut_list[index] = app_name

        return shortcut_list

    def remove_shortcut(self, app_name: str) -> bool:
        """Remove a shortcut by name"""
        try:
            data = self.read_shortcuts_vdf()
            shortcuts = data.get('shortcuts', {})

            # Find shortcut by name
            to_remove = None
            for index, shortcut in shortcuts.items():
                if shortcut.get('AppName') == app_name:
                    to_remove = index
                    break

            if to_remove is None:
                logger.warning(f"Shortcut '{app_name}' not found")
                return False

            # Remove the shortcut
            del shortcuts[to_remove]
            data['shortcuts'] = shortcuts

            # Write back
            if self.write_shortcuts_vdf(data):
                logger.info(f"✅ Removed shortcut '{app_name}'")
                return True
            else:
                logger.error("❌ Failed to write updated shortcuts")
                return False

        except Exception as e:
            logger.error(f"❌ Error removing shortcut: {e}")
            return False
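For reference, the MD5-based AppID scheme implemented in `generate_app_id` above can be reproduced as a standalone function (the sample name and path below are illustrative):

```python
import hashlib

def stl_app_id(app_name: str, exe_path: str) -> tuple:
    """Standalone mirror of the scheme above: MD5(name + path), first 8 hex
    chars, negated modulo 1 billion, then +2**32 for the unsigned form."""
    seed = hashlib.md5(f"{app_name}{exe_path}".encode("utf-8")).hexdigest()[:8]
    signed = -(int(seed, 16) % 1_000_000_000)
    return signed, signed + 2**32

signed, unsigned = stl_app_id("My Modlist", "/home/user/Modlist/ModOrganizer.exe")
```

The signed value is what goes into shortcuts.vdf, while the unsigned value keys the CompatToolMapping entry in config.vdf; the result is deterministic for a given name and path.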
220 jackify/backend/services/protontricks_detection_service.py Normal file
@@ -0,0 +1,220 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Protontricks Detection Service Module
Centralized service for detecting and managing protontricks installation across CLI and GUI frontends
"""

import logging
import shutil
import subprocess
from typing import Optional, Tuple
from ..handlers.protontricks_handler import ProtontricksHandler
from ..handlers.config_handler import ConfigHandler

# Initialize logger
logger = logging.getLogger(__name__)


class ProtontricksDetectionService:
    """
    Centralized service for detecting and managing protontricks installation
    Handles detection, validation, and installation guidance for both CLI and GUI
    """

    def __init__(self, steamdeck: bool = False):
        """
        Initialize the protontricks detection service

        Args:
            steamdeck (bool): Whether running on Steam Deck
        """
        self.steamdeck = steamdeck
        self.config_handler = ConfigHandler()
        self._protontricks_handler = None
        self._last_detection_result = None
        self._cached_detection_valid = False
        logger.debug(f"ProtontricksDetectionService initialized (steamdeck={steamdeck})")

    def _get_protontricks_handler(self) -> ProtontricksHandler:
        """Get or create ProtontricksHandler instance"""
        if self._protontricks_handler is None:
            self._protontricks_handler = ProtontricksHandler(steamdeck=self.steamdeck)
        return self._protontricks_handler

    def detect_protontricks(self, use_cache: bool = True) -> Tuple[bool, str, str]:
        """
        Detect if protontricks is installed and get installation details

        Args:
            use_cache (bool): Whether to use cached detection result

        Returns:
            Tuple[bool, str, str]: (is_installed, installation_type, details_message)
            - is_installed: True if protontricks is available
            - installation_type: 'native', 'flatpak', or 'none'
            - details_message: Human-readable status message
        """
        if use_cache and self._cached_detection_valid and self._last_detection_result:
            logger.debug("Using cached protontricks detection result")
            return self._last_detection_result

        logger.info("Detecting protontricks installation...")

        handler = self._get_protontricks_handler()

        # Reset handler state for fresh detection
        handler.which_protontricks = None
        handler.protontricks_path = None
        handler.protontricks_version = None

        # Perform detection without user prompts
        is_installed = self._detect_without_prompts(handler)

        # Determine installation type and create message
        if is_installed:
            installation_type = handler.which_protontricks or 'unknown'
            if installation_type == 'native':
                details_message = f"Native protontricks found at {handler.protontricks_path}"
            elif installation_type == 'flatpak':
                details_message = "Flatpak protontricks is installed"
            else:
                details_message = "Protontricks is installed (unknown type)"
        else:
            installation_type = 'none'
            details_message = "Protontricks not found - required for Jackify functionality"

        # Cache the result
        self._last_detection_result = (is_installed, installation_type, details_message)
        self._cached_detection_valid = True

        logger.info(f"Protontricks detection complete: {details_message}")
        return self._last_detection_result

    def _detect_without_prompts(self, handler: ProtontricksHandler) -> bool:
        """
        Detect protontricks without user prompts or installation attempts

        Args:
            handler (ProtontricksHandler): Handler instance to use

        Returns:
            bool: True if protontricks is found
        """
        import shutil

        # Check if protontricks exists as a command
        protontricks_path_which = shutil.which("protontricks")

        if protontricks_path_which:
            # Check if it's a flatpak wrapper
            try:
                with open(protontricks_path_which, 'r') as f:
                    content = f.read()
                if "flatpak run" in content:
                    logger.debug(f"Detected Protontricks is a Flatpak wrapper at {protontricks_path_which}")
                    handler.which_protontricks = 'flatpak'
                    # Continue to check flatpak list just to be sure
                else:
                    logger.info(f"Native Protontricks found at {protontricks_path_which}")
                    handler.which_protontricks = 'native'
                    handler.protontricks_path = protontricks_path_which
                    return True
            except Exception as e:
                logger.error(f"Error reading protontricks executable: {e}")

        # Check if flatpak protontricks is installed
        try:
            env = handler._get_clean_subprocess_env()
            result = subprocess.run(
                ["flatpak", "list"],
                capture_output=True,
                text=True,
                check=True,
                env=env
            )
            if "com.github.Matoking.protontricks" in result.stdout:
                logger.info("Flatpak Protontricks is installed")
                handler.which_protontricks = 'flatpak'
                return True
        except FileNotFoundError:
            logger.warning("'flatpak' command not found. Cannot check for Flatpak Protontricks.")
        except subprocess.CalledProcessError as e:
            logger.warning(f"Error checking flatpak list: {e}")
        except Exception as e:
            logger.error(f"Unexpected error checking flatpak: {e}")

        return False

    def install_flatpak_protontricks(self) -> Tuple[bool, str]:
        """
        Install protontricks via Flatpak

        Returns:
            Tuple[bool, str]: (success, message)
        """
        logger.info("Attempting to install Flatpak Protontricks...")

        try:
            handler = self._get_protontricks_handler()

            # Check if flatpak is available
            if not shutil.which("flatpak"):
                error_msg = "Flatpak not found. Please install Flatpak first."
                logger.error(error_msg)
                return False, error_msg

            # Install command
            install_cmd = ["flatpak", "install", "-u", "-y", "--noninteractive", "flathub", "com.github.Matoking.protontricks"]

            # Use clean environment
            env = handler._get_clean_subprocess_env()

            # Run installation
            process = subprocess.run(install_cmd, check=True, text=True, env=env, capture_output=True)

            # Clear cache to force re-detection
            self._cached_detection_valid = False

            success_msg = "Flatpak Protontricks installed successfully."
            logger.info(success_msg)
            return True, success_msg

        except FileNotFoundError:
            error_msg = "Flatpak command not found. Please install Flatpak first."
            logger.error(error_msg)
            return False, error_msg
        except subprocess.CalledProcessError as e:
            error_msg = f"Flatpak installation failed: {e}"
            logger.error(error_msg)
            return False, error_msg
        except Exception as e:
            error_msg = f"Unexpected error during Flatpak installation: {e}"
            logger.error(error_msg)
            return False, error_msg

    def get_installation_guidance(self) -> str:
        """
        Get guidance message for installing protontricks natively

        Returns:
            str: Installation guidance message
        """
        return """To install protontricks natively, use your distribution's package manager:

• Arch Linux: sudo pacman -S protontricks
• Ubuntu/Debian: sudo apt install protontricks
• Fedora: sudo dnf install protontricks
• OpenSUSE: sudo zypper install protontricks
• Gentoo: sudo emerge protontricks

Alternatively, you can install via Flatpak:
flatpak install flathub com.github.Matoking.protontricks

After installation, click "Re-detect" to continue."""

    def clear_cache(self):
        """Clear cached detection results to force re-detection"""
        self._cached_detection_valid = False
        self._last_detection_result = None
        logger.debug("Protontricks detection cache cleared")
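The wrapper-vs-native distinction above comes down to inspecting the resolved `protontricks` entry point for a `flatpak run` invocation; a minimal sketch of that classification as a pure function (no filesystem access):

```python
def classify_protontricks(script_text: str) -> str:
    """Classify a resolved `protontricks` entry point the way the detection
    above does: wrapper scripts that call `flatpak run` count as flatpak."""
    return "flatpak" if "flatpak run" in script_text else "native"
```

In the service, `script_text` is the content of the file returned by `shutil.which("protontricks")`, and the flatpak case is then double-checked against `flatpak list`.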
171 jackify/backend/services/resolution_service.py Normal file
@@ -0,0 +1,171 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Resolution Service Module
Centralized service for managing resolution settings across CLI and GUI frontends
"""

import logging
from typing import Optional
from ..handlers.config_handler import ConfigHandler

# Initialize logger
logger = logging.getLogger(__name__)


class ResolutionService:
    """
    Centralized service for managing resolution settings
    Handles saving, loading, and validation of resolution settings
    """

    def __init__(self):
        """Initialize the resolution service"""
        self.config_handler = ConfigHandler()
        logger.debug("ResolutionService initialized")

    def save_resolution(self, resolution: str) -> bool:
        """
        Save a resolution setting to configuration

        Args:
            resolution (str): The resolution to save (e.g., '1920x1080')

        Returns:
            bool: True if saved successfully, False otherwise
        """
        try:
            # Validate resolution format (basic check)
            if not self._validate_resolution_format(resolution):
                logger.warning("Invalid resolution format provided")
                return False

            success = self.config_handler.save_resolution(resolution)
            if success:
                logger.info(f"Resolution saved successfully: {resolution}")
            else:
                logger.error("Failed to save resolution")

            return success
        except Exception as e:
            logger.error(f"Error in save_resolution: {e}")
            return False

    def get_saved_resolution(self) -> Optional[str]:
        """
        Retrieve the saved resolution from configuration

        Returns:
            str: The saved resolution or None if not saved
        """
        try:
            resolution = self.config_handler.get_saved_resolution()
            if resolution:
                logger.debug(f"Retrieved saved resolution: {resolution}")
            else:
                logger.debug("No saved resolution found")
            return resolution
        except Exception as e:
            logger.error(f"Error retrieving resolution: {e}")
            return None

    def has_saved_resolution(self) -> bool:
        """
        Check if a resolution is saved in configuration

        Returns:
            bool: True if resolution exists, False otherwise
        """
        try:
            return self.config_handler.has_saved_resolution()
        except Exception as e:
            logger.error(f"Error checking for saved resolution: {e}")
            return False

    def clear_saved_resolution(self) -> bool:
        """
        Clear the saved resolution from configuration

        Returns:
            bool: True if cleared successfully, False otherwise
        """
        try:
            success = self.config_handler.clear_saved_resolution()
            if success:
                logger.info("Resolution cleared successfully")
            else:
                logger.error("Failed to clear resolution")
            return success
        except Exception as e:
            logger.error(f"Error clearing resolution: {e}")
            return False

    def _validate_resolution_format(self, resolution: str) -> bool:
        """
        Validate resolution format (e.g., '1920x1080' or '1280x800 (Steam Deck)')

        Args:
            resolution (str): Resolution string to validate

        Returns:
            bool: True if valid format, False otherwise
        """
        import re

        if not resolution or resolution == 'Leave unchanged':
            return True  # Allow 'Leave unchanged' as valid

        # Handle Steam Deck format: '1280x800 (Steam Deck)'
        if ' (Steam Deck)' in resolution:
            resolution = resolution.replace(' (Steam Deck)', '')

        # Check for WxH format (e.g., 1920x1080)
        if re.match(r'^[0-9]+x[0-9]+$', resolution):
            # Extract width and height
            try:
                width, height = resolution.split('x')
                width_int = int(width)
                height_int = int(height)

                # Basic sanity checks
                if width_int > 0 and height_int > 0 and width_int <= 10000 and height_int <= 10000:
                    return True
                else:
                    logger.warning(f"Resolution dimensions out of reasonable range: {resolution}")
                    return False
            except ValueError:
                logger.warning(f"Invalid resolution format: {resolution}")
                return False
        else:
            logger.warning(f"Resolution does not match WxH format: {resolution}")
            return False

    def get_resolution_index(self, resolution: str, combo_items: list) -> int:
        """
        Get the index of a resolution in a combo box list

        Args:
            resolution (str): Resolution to find
            combo_items (list): List of combo box items

        Returns:
            int: Index of the resolution, or 0 (Leave unchanged) if not found
        """
        if not resolution:
            return 0  # Default to 'Leave unchanged'

        # Handle Steam Deck special case
        if resolution == '1280x800' and '1280x800 (Steam Deck)' in combo_items:
            return combo_items.index('1280x800 (Steam Deck)')

        # Try exact match first
        if resolution in combo_items:
            return combo_items.index(resolution)

        # Try partial match (e.g., '1920x1080' in '1920x1080 (Steam Deck)')
        for i, item in enumerate(combo_items):
            if resolution in item:
                return i

        # Default to 'Leave unchanged'
        return 0
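The rules in `_validate_resolution_format` above condense into a small predicate (same WxH regex, the same 1-10000 bounds, and the Steam Deck suffix handling):

```python
import re

def is_valid_resolution(res: str) -> bool:
    """Condensed form of the validation above."""
    if not res or res == "Leave unchanged":
        return True  # 'Leave unchanged' is always accepted
    # Strip the Steam Deck annotation before matching
    res = res.replace(" (Steam Deck)", "")
    m = re.match(r"^([0-9]+)x([0-9]+)$", res)
    if not m:
        return False
    width, height = int(m.group(1)), int(m.group(2))
    return 0 < width <= 10000 and 0 < height <= 10000
```

So "1920x1080" and "1280x800 (Steam Deck)" pass, while malformed strings or dimensions outside the sanity bounds are rejected.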
419 jackify/backend/services/resource_manager.py Normal file
@@ -0,0 +1,419 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Resource Manager Module
Handles system resource limits for Jackify operations
"""

import resource
import logging
import os
from typing import Tuple, Optional

# Initialize logger
logger = logging.getLogger(__name__)


class ResourceManager:
    """
    Manages system resource limits for Jackify operations
    Focuses on file descriptor limits to resolve ulimit issues
    """

    # Target file descriptor limit based on successful user testing
    TARGET_FILE_DESCRIPTORS = 64556

    def __init__(self):
        """Initialize the resource manager"""
        self.original_limits = None
        self.current_limits = None
        self.target_achieved = False
        logger.debug("ResourceManager initialized")

    def get_current_file_descriptor_limits(self) -> Tuple[int, int]:
        """
        Get current file descriptor limits (soft, hard)

        Returns:
            tuple: (soft_limit, hard_limit)
        """
        try:
            soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
            return soft, hard
        except Exception as e:
            logger.error(f"Error getting file descriptor limits: {e}")
            return 0, 0

    def increase_file_descriptor_limit(self, target_limit: Optional[int] = None) -> bool:
        """
        Increase file descriptor limit to target value

        Args:
            target_limit (int, optional): Target limit. Defaults to TARGET_FILE_DESCRIPTORS

        Returns:
            bool: True if limit was increased or already adequate, False if failed
        """
        if target_limit is None:
            target_limit = self.TARGET_FILE_DESCRIPTORS

        try:
            # Get current limits
            current_soft, current_hard = self.get_current_file_descriptor_limits()
            self.original_limits = (current_soft, current_hard)

            logger.info(f"Current file descriptor limits: soft={current_soft}, hard={current_hard}")

            # Check if we already have adequate limits
            if current_soft >= target_limit:
                logger.info(f"File descriptor limit already adequate: {current_soft} >= {target_limit}")
                self.target_achieved = True
                self.current_limits = (current_soft, current_hard)
                return True

            # Calculate new soft limit (can't exceed hard limit)
            new_soft = min(target_limit, current_hard)

            if new_soft <= current_soft:
                logger.warning(f"Cannot increase file descriptor limit: hard limit ({current_hard}) too low for target ({target_limit})")
                self.current_limits = (current_soft, current_hard)
                return False

            # Attempt to set new limits
            try:
                resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, current_hard))

                # Verify the change worked
                verify_soft, verify_hard = self.get_current_file_descriptor_limits()
                self.current_limits = (verify_soft, verify_hard)

                if verify_soft >= new_soft:
                    logger.info(f"Successfully increased file descriptor limit: {current_soft} -> {verify_soft}")
                    self.target_achieved = (verify_soft >= target_limit)
                    if not self.target_achieved:
                        logger.warning(f"Increased limit ({verify_soft}) is below target ({target_limit}) but above original ({current_soft})")
                    return True
                else:
                    logger.error(f"File descriptor limit increase failed verification: expected {new_soft}, got {verify_soft}")
                    return False

            except (ValueError, OSError) as e:
                logger.error(f"Failed to set file descriptor limit: {e}")
                self.current_limits = (current_soft, current_hard)
                return False

        except Exception as e:
            logger.error(f"Error in increase_file_descriptor_limit: {e}")
            return False

    def get_limit_status(self) -> dict:
        """
        Get detailed status of file descriptor limits

        Returns:
            dict: Status information about limits
        """
        current_soft, current_hard = self.get_current_file_descriptor_limits()

        return {
            'current_soft': current_soft,
            'current_hard': current_hard,
            'original_limits': self.original_limits,
            'target_limit': self.TARGET_FILE_DESCRIPTORS,
|
||||
'target_achieved': self.target_achieved,
|
||||
'increase_needed': current_soft < self.TARGET_FILE_DESCRIPTORS,
|
||||
'can_increase': current_hard >= self.TARGET_FILE_DESCRIPTORS,
|
||||
'max_possible': current_hard
|
||||
}
|
||||
|
||||
def get_manual_increase_instructions(self) -> dict:
|
||||
"""
|
||||
Get distribution-specific instructions for manually increasing limits
|
||||
|
||||
Returns:
|
||||
dict: Instructions organized by distribution/method
|
||||
"""
|
||||
status = self.get_limit_status()
|
||||
target = self.TARGET_FILE_DESCRIPTORS
|
||||
|
||||
# Detect distribution
|
||||
distro = self._detect_distribution()
|
||||
|
||||
instructions = {
|
||||
'target_limit': target,
|
||||
'current_limit': status['current_soft'],
|
||||
'distribution': distro,
|
||||
'methods': {}
|
||||
}
|
||||
|
||||
# Temporary increase (all distributions)
|
||||
instructions['methods']['temporary'] = {
|
||||
'title': 'Temporary Increase (Current Session Only)',
|
||||
'commands': [
|
||||
f'ulimit -n {target}',
|
||||
'jackify # Re-run Jackify after setting ulimit'
|
||||
],
|
||||
'note': 'This only affects the current terminal session'
|
||||
}
|
||||
|
||||
# Permanent increase (varies by distribution)
|
||||
if distro in ['cachyos', 'arch', 'manjaro']:
|
||||
instructions['methods']['permanent'] = {
|
||||
'title': 'Permanent Increase (Arch-based Systems)',
|
||||
'commands': [
|
||||
'sudo nano /etc/security/limits.conf',
|
||||
f'# Add these lines to the file:',
|
||||
f'* soft nofile {target}',
|
||||
f'* hard nofile {target}',
|
||||
'# Save file and reboot, or logout/login'
|
||||
],
|
||||
'note': 'Requires root privileges and reboot/re-login'
|
||||
}
|
||||
elif distro in ['opensuse', 'suse']:
|
||||
instructions['methods']['permanent'] = {
|
||||
'title': 'Permanent Increase (openSUSE)',
|
||||
'commands': [
|
||||
'sudo nano /etc/security/limits.conf',
|
||||
f'# Add these lines to the file:',
|
||||
f'* soft nofile {target}',
|
||||
f'* hard nofile {target}',
|
||||
'# Save file and reboot, or logout/login',
|
||||
'# Alternative: Set in systemd service file'
|
||||
],
|
||||
'note': 'May require additional systemd configuration on openSUSE'
|
||||
}
|
||||
else:
|
||||
instructions['methods']['permanent'] = {
|
||||
'title': 'Permanent Increase (Generic Linux)',
|
||||
'commands': [
|
||||
'sudo nano /etc/security/limits.conf',
|
||||
f'# Add these lines to the file:',
|
||||
f'* soft nofile {target}',
|
||||
f'* hard nofile {target}',
|
||||
'# Save file and reboot, or logout/login'
|
||||
],
|
||||
'note': 'Standard method for most Linux distributions'
|
||||
}
|
||||
|
||||
return instructions
|
||||
|
||||
def _detect_distribution(self) -> str:
|
||||
"""
|
||||
Detect the Linux distribution
|
||||
|
||||
Returns:
|
||||
str: Distribution identifier
|
||||
"""
|
||||
try:
|
||||
# Check /etc/os-release
|
||||
if os.path.exists('/etc/os-release'):
|
||||
with open('/etc/os-release', 'r') as f:
|
||||
content = f.read().lower()
|
||||
|
||||
if 'cachyos' in content:
|
||||
return 'cachyos'
|
||||
elif 'arch' in content:
|
||||
return 'arch'
|
||||
elif 'manjaro' in content:
|
||||
return 'manjaro'
|
||||
elif 'opensuse' in content or 'suse' in content:
|
||||
return 'opensuse'
|
||||
elif 'ubuntu' in content:
|
||||
return 'ubuntu'
|
||||
elif 'debian' in content:
|
||||
return 'debian'
|
||||
elif 'fedora' in content:
|
||||
return 'fedora'
|
||||
|
||||
# Fallback detection methods
|
||||
if os.path.exists('/etc/arch-release'):
|
||||
return 'arch'
|
||||
elif os.path.exists('/etc/SuSE-release'):
|
||||
return 'opensuse'
|
||||
|
||||
except Exception as e:
|
||||
logger.warning(f"Could not detect distribution: {e}")
|
||||
|
||||
return 'unknown'
|
||||
|
||||
def is_too_many_files_error(self, error_message: str) -> bool:
|
||||
"""
|
||||
Check if an error message indicates a 'too many open files' issue
|
||||
|
||||
Args:
|
||||
error_message (str): Error message to check
|
||||
|
||||
Returns:
|
||||
bool: True if error is related to file descriptor limits
|
||||
"""
|
||||
if not error_message:
|
||||
return False
|
||||
|
||||
error_lower = error_message.lower()
|
||||
indicators = [
|
||||
'too many open files',
|
||||
'too many files open',
|
||||
'cannot open',
|
||||
'emfile', # errno 24
|
||||
'file descriptor',
|
||||
'ulimit',
|
||||
'resource temporarily unavailable'
|
||||
]
|
||||
|
||||
return any(indicator in error_lower for indicator in indicators)
|
||||
|
||||
def apply_recommended_limits(self) -> bool:
|
||||
"""
|
||||
Apply recommended resource limits for Jackify operations
|
||||
|
||||
Returns:
|
||||
bool: True if limits were successfully applied
|
||||
"""
|
||||
logger.info("Applying recommended resource limits for Jackify operations")
|
||||
|
||||
# Focus on file descriptor limits as the primary issue
|
||||
success = self.increase_file_descriptor_limit()
|
||||
|
||||
if success:
|
||||
status = self.get_limit_status()
|
||||
logger.info(f"Resource limits applied successfully. Current file descriptors: {status['current_soft']}")
|
||||
else:
|
||||
logger.warning("Failed to apply optimal resource limits")
|
||||
|
||||
return success
|
||||
|
||||
def handle_too_many_files_error(self, error_message: str, context: str = "") -> dict:
|
||||
"""
|
||||
Handle a 'too many open files' error by attempting to increase limits and providing guidance
|
||||
|
||||
Args:
|
||||
error_message (str): The error message that triggered this handler
|
||||
context (str): Additional context about where the error occurred
|
||||
|
||||
Returns:
|
||||
dict: Result of handling the error, including success status and guidance
|
||||
"""
|
||||
logger.warning(f"Detected 'too many open files' error in {context}: {error_message}")
|
||||
|
||||
result = {
|
||||
'error_detected': True,
|
||||
'error_message': error_message,
|
||||
'context': context,
|
||||
'auto_fix_attempted': False,
|
||||
'auto_fix_success': False,
|
||||
'manual_instructions': None,
|
||||
'recommendation': ''
|
||||
}
|
||||
|
||||
# Check if this is actually a file descriptor limit error
|
||||
if not self.is_too_many_files_error(error_message):
|
||||
result['error_detected'] = False
|
||||
return result
|
||||
|
||||
# Get current status
|
||||
status = self.get_limit_status()
|
||||
|
||||
# Attempt automatic fix if we haven't already optimized
|
||||
if not self.target_achieved and status['can_increase']:
|
||||
logger.info("Attempting to automatically increase file descriptor limits...")
|
||||
result['auto_fix_attempted'] = True
|
||||
|
||||
success = self.increase_file_descriptor_limit()
|
||||
result['auto_fix_success'] = success
|
||||
|
||||
if success:
|
||||
new_status = self.get_limit_status()
|
||||
result['recommendation'] = f"File descriptor limit increased to {new_status['current_soft']}. Please retry the operation."
|
||||
logger.info(f"Successfully increased file descriptor limit to {new_status['current_soft']}")
|
||||
else:
|
||||
result['recommendation'] = "Automatic limit increase failed. Manual intervention required."
|
||||
logger.warning("Automatic file descriptor limit increase failed")
|
||||
else:
|
||||
result['recommendation'] = "File descriptor limits already at maximum or cannot be increased automatically."
|
||||
|
||||
# Always provide manual instructions as fallback
|
||||
result['manual_instructions'] = self.get_manual_increase_instructions()
|
||||
|
||||
return result
|
||||
|
||||
def show_guidance_dialog(self, parent=None):
|
||||
"""
|
||||
Show the ulimit guidance dialog (GUI only)
|
||||
|
||||
Args:
|
||||
parent: Parent widget for the dialog
|
||||
|
||||
Returns:
|
||||
Dialog result or None if not in GUI mode
|
||||
"""
|
||||
try:
|
||||
# Only available in GUI mode
|
||||
from jackify.frontends.gui.dialogs.ulimit_guidance_dialog import show_ulimit_guidance
|
||||
return show_ulimit_guidance(parent, self)
|
||||
except ImportError:
|
||||
logger.debug("GUI ulimit guidance dialog not available (likely CLI mode)")
|
||||
return None
|
||||
|
||||
|
||||
# Convenience functions for easy use
|
||||
def ensure_adequate_file_descriptor_limits() -> bool:
|
||||
"""
|
||||
Convenience function to ensure adequate file descriptor limits
|
||||
|
||||
Returns:
|
||||
bool: True if limits are adequate or were successfully increased
|
||||
"""
|
||||
manager = ResourceManager()
|
||||
return manager.apply_recommended_limits()
|
||||
|
||||
|
||||
def handle_file_descriptor_error(error_message: str, context: str = "") -> dict:
|
||||
"""
|
||||
Convenience function to handle file descriptor limit errors
|
||||
|
||||
Args:
|
||||
error_message (str): The error message that triggered this handler
|
||||
context (str): Additional context about where the error occurred
|
||||
|
||||
Returns:
|
||||
dict: Result of handling the error, including success status and guidance
|
||||
"""
|
||||
manager = ResourceManager()
|
||||
return manager.handle_too_many_files_error(error_message, context)
|
||||
|
||||
|
||||
# Module-level testing
|
||||
if __name__ == '__main__':
|
||||
# Configure logging for testing
|
||||
logging.basicConfig(level=logging.DEBUG, format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
|
||||
|
||||
print("Testing ResourceManager...")
|
||||
|
||||
manager = ResourceManager()
|
||||
|
||||
# Show current status
|
||||
status = manager.get_limit_status()
|
||||
print(f"\nCurrent Status:")
|
||||
print(f" Current soft limit: {status['current_soft']}")
|
||||
print(f" Current hard limit: {status['current_hard']}")
|
||||
print(f" Target limit: {status['target_limit']}")
|
||||
print(f" Increase needed: {status['increase_needed']}")
|
||||
print(f" Can increase: {status['can_increase']}")
|
||||
|
||||
# Test limit increase
|
||||
print(f"\nAttempting to increase limits...")
|
||||
success = manager.apply_recommended_limits()
|
||||
print(f"Success: {success}")
|
||||
|
||||
# Show final status
|
||||
final_status = manager.get_limit_status()
|
||||
print(f"\nFinal Status:")
|
||||
print(f" Current soft limit: {final_status['current_soft']}")
|
||||
print(f" Target achieved: {final_status['target_achieved']}")
|
||||
|
||||
# Test manual instructions
|
||||
instructions = manager.get_manual_increase_instructions()
|
||||
print(f"\nDetected distribution: {instructions['distribution']}")
|
||||
print(f"Manual increase available if needed")
|
||||
|
||||
print("\nTesting completed successfully!")
|
||||
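The ResourceManager above ultimately rests on the stdlib `resource` module. As a standalone illustration of the same soft-limit bump (a minimal sketch with a hypothetical helper name, not part of Jackify), the key constraint is that an unprivileged process can raise its soft `RLIMIT_NOFILE` only up to the current hard limit:

```python
import resource

def bump_nofile_soft_limit(target: int) -> int:
    """Raise the soft RLIMIT_NOFILE toward `target`, capped at the hard limit.

    Never lowers the limit; returns the soft limit in effect afterwards.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    if hard == resource.RLIM_INFINITY:
        new_soft = target  # no hard cap to respect
    else:
        new_soft = min(target, hard)  # cannot exceed the hard limit unprivileged
    if new_soft > soft:
        resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
    return resource.getrlimit(resource.RLIMIT_NOFILE)[0]
```

Raising the hard limit itself requires root (or a `limits.conf` / systemd change), which is why the class falls back to the manual-instructions path when `can_increase` is false.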
274	jackify/backend/services/steam_restart_service.py	Normal file
@@ -0,0 +1,274 @@
import os
import time
import subprocess
import signal
import psutil
import logging
import sys
from typing import Callable, Optional

logger = logging.getLogger(__name__)


def _get_clean_subprocess_env():
    """
    Create a clean environment for subprocess calls by removing PyInstaller-specific
    environment variables that can interfere with Steam execution.

    Returns:
        dict: Cleaned environment dictionary
    """
    env = os.environ.copy()
    pyinstaller_vars_removed = []

    # Remove PyInstaller-specific environment variables
    if env.pop('_MEIPASS', None):
        pyinstaller_vars_removed.append('_MEIPASS')
    if env.pop('_MEIPASS2', None):
        pyinstaller_vars_removed.append('_MEIPASS2')

    # Clean library path variables that PyInstaller modifies (Linux/Unix)
    if 'LD_LIBRARY_PATH_ORIG' in env:
        # Restore original LD_LIBRARY_PATH if it was backed up by PyInstaller
        env['LD_LIBRARY_PATH'] = env['LD_LIBRARY_PATH_ORIG']
        pyinstaller_vars_removed.append('LD_LIBRARY_PATH (restored from _ORIG)')
    else:
        # Remove PyInstaller-modified LD_LIBRARY_PATH
        if env.pop('LD_LIBRARY_PATH', None):
            pyinstaller_vars_removed.append('LD_LIBRARY_PATH (removed)')

    # Clean PATH of PyInstaller-specific entries
    if 'PATH' in env and hasattr(sys, '_MEIPASS'):
        path_entries = env['PATH'].split(os.pathsep)
        original_count = len(path_entries)
        # Remove any PATH entries that point to PyInstaller temp directory
        cleaned_path = [p for p in path_entries if not p.startswith(sys._MEIPASS)]
        env['PATH'] = os.pathsep.join(cleaned_path)
        if len(cleaned_path) < original_count:
            pyinstaller_vars_removed.append(f'PATH (removed {original_count - len(cleaned_path)} PyInstaller entries)')

    # Clean macOS library path (if present)
    if 'DYLD_LIBRARY_PATH' in env and hasattr(sys, '_MEIPASS'):
        dyld_entries = env['DYLD_LIBRARY_PATH'].split(os.pathsep)
        cleaned_dyld = [p for p in dyld_entries if not p.startswith(sys._MEIPASS)]
        if cleaned_dyld:
            env['DYLD_LIBRARY_PATH'] = os.pathsep.join(cleaned_dyld)
            pyinstaller_vars_removed.append('DYLD_LIBRARY_PATH (cleaned)')
        else:
            env.pop('DYLD_LIBRARY_PATH', None)
            pyinstaller_vars_removed.append('DYLD_LIBRARY_PATH (removed)')

    # Log what was cleaned for debugging
    if pyinstaller_vars_removed:
        logger.debug(f"Steam restart: Cleaned PyInstaller environment variables: {', '.join(pyinstaller_vars_removed)}")
    else:
        logger.debug("Steam restart: No PyInstaller environment variables detected (likely DEV mode)")

    return env


class SteamRestartError(Exception):
    pass


def is_steam_deck() -> bool:
    """Detect if running on Steam Deck/SteamOS."""
    try:
        if os.path.exists('/etc/os-release'):
            with open('/etc/os-release', 'r') as f:
                content = f.read().lower()
            if 'steamos' in content or 'steam deck' in content:
                return True
        if os.path.exists('/sys/devices/virtual/dmi/id/product_name'):
            with open('/sys/devices/virtual/dmi/id/product_name', 'r') as f:
                if 'steam deck' in f.read().lower():
                    return True
        if os.environ.get('STEAM_RUNTIME') and os.path.exists('/home/deck'):
            return True
    except Exception as e:
        logger.debug(f"Error detecting Steam Deck: {e}")
    return False


def get_steam_processes() -> list:
    """Return a list of psutil.Process objects for running Steam processes."""
    steam_procs = []
    for proc in psutil.process_iter(['pid', 'name', 'exe', 'cmdline']):
        try:
            name = proc.info['name']
            exe = proc.info['exe']
            cmdline = proc.info['cmdline']
            if name and 'steam' in name.lower():
                steam_procs.append(proc)
            elif exe and 'steam' in exe.lower():
                steam_procs.append(proc)
            elif cmdline and any('steam' in str(arg).lower() for arg in cmdline):
                steam_procs.append(proc)
        except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
            continue
    return steam_procs


def wait_for_steam_exit(timeout: int = 60, check_interval: float = 0.5) -> bool:
    """Wait for all Steam processes to exit using pgrep (matching existing logic)."""
    start = time.time()
    env = _get_clean_subprocess_env()
    while time.time() - start < timeout:
        try:
            result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
            if result.returncode != 0:
                return True
        except Exception as e:
            logger.debug(f"Error checking Steam processes: {e}")
        time.sleep(check_interval)
    return False


def start_steam() -> bool:
    """Attempt to start Steam using the exact methods from existing working logic."""
    env = _get_clean_subprocess_env()
    try:
        # Try systemd user service (Steam Deck)
        if is_steam_deck():
            subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
            return True

        # Use startup methods with only -silent flag (no -minimized or -no-browser)
        start_methods = [
            {"name": "Popen", "cmd": ["steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "env": env}},
            {"name": "setsid", "cmd": ["setsid", "steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "env": env}},
            {"name": "nohup", "cmd": ["nohup", "steam", "-silent"], "kwargs": {"stdout": subprocess.DEVNULL, "stderr": subprocess.DEVNULL, "stdin": subprocess.DEVNULL, "start_new_session": True, "preexec_fn": os.setpgrp, "env": env}}
        ]

        for method in start_methods:
            method_name = method["name"]
            logger.info(f"Attempting to start Steam using method: {method_name}")
            try:
                process = subprocess.Popen(method["cmd"], **method["kwargs"])
                if process is not None:
                    logger.info(f"Initiated Steam start with {method_name}.")
                    time.sleep(5)  # Wait 5 seconds as in existing logic
                    check_result = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
                    if check_result.returncode == 0:
                        logger.info(f"Steam process detected after using {method_name}. Proceeding to wait phase.")
                        return True
                    else:
                        logger.warning(f"Steam process not detected after initiating with {method_name}. Trying next method.")
                else:
                    logger.warning(f"Failed to start process with {method_name}. Trying next method.")
            except FileNotFoundError:
                logger.error(f"Command not found for method {method_name} (e.g., setsid, nohup). Trying next method.")
            except Exception as e:
                logger.error(f"Error starting Steam with {method_name}: {e}. Trying next method.")

        return False
    except Exception as e:
        logger.error(f"Error starting Steam: {e}")
        return False


def robust_steam_restart(progress_callback: Optional[Callable[[str], None]] = None, timeout: int = 60) -> bool:
    """
    Robustly restart Steam across all distros. Returns True on success, False on failure.
    Optionally accepts a progress_callback(message: str) for UI feedback.
    Uses aggressive pkill approach for maximum reliability.
    """
    env = _get_clean_subprocess_env()

    def report(msg):
        logger.info(msg)
        if progress_callback:
            progress_callback(msg)

    report("Shutting down Steam...")

    # Steam Deck: Use systemctl for shutdown (special handling)
    if is_steam_deck():
        try:
            report("Steam Deck detected - using systemctl shutdown...")
            subprocess.run(['systemctl', '--user', 'stop', 'app-steam@autostart.service'],
                           timeout=15, check=False, capture_output=True, env=env)
            time.sleep(2)
        except Exception as e:
            logger.debug(f"systemctl stop failed on Steam Deck: {e}")

    # All systems: Use pkill approach (proven 15/16 test success rate)
    try:
        # Skip unreliable steam -shutdown, go straight to pkill
        pkill_result = subprocess.run(['pkill', 'steam'], timeout=15, check=False, capture_output=True, env=env)
        logger.debug(f"pkill steam result: {pkill_result.returncode}")
        time.sleep(2)

        # Check if Steam is still running
        check_result = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
        if check_result.returncode == 0:
            # Force kill if still running
            report("Steam still running - force terminating...")
            force_result = subprocess.run(['pkill', '-9', 'steam'], timeout=15, check=False, capture_output=True, env=env)
            logger.debug(f"pkill -9 steam result: {force_result.returncode}")
            time.sleep(2)

            # Final check
            final_check = subprocess.run(['pgrep', '-f', 'steamwebhelper'], capture_output=True, timeout=10, env=env)
            if final_check.returncode != 0:
                logger.info("Steam processes successfully force terminated.")
            else:
                report("Failed to terminate Steam processes.")
                return False
        else:
            logger.info("Steam processes successfully terminated.")
    except Exception as e:
        logger.error(f"Error during Steam shutdown: {e}")
        report("Failed to shut down Steam.")
        return False

    report("Steam closed successfully.")

    # Start Steam using platform-specific logic
    report("Starting Steam...")

    # Steam Deck: Use systemctl restart (keep existing working approach)
    if is_steam_deck():
        try:
            subprocess.Popen(["systemctl", "--user", "restart", "app-steam@autostart.service"], env=env)
            logger.info("Steam Deck: Initiated systemctl restart")
        except Exception as e:
            logger.error(f"Steam Deck systemctl restart failed: {e}")
            report("Failed to restart Steam on Steam Deck.")
            return False
    else:
        # All other distros: Use proven steam -silent method
        if not start_steam():
            report("Failed to start Steam.")
            return False

    # Wait for Steam to fully initialize using existing logic
    report("Waiting for Steam to fully start")
    logger.info("Waiting up to 2 minutes for Steam to fully initialize...")
    max_startup_wait = 120
    elapsed_wait = 0
    initial_wait_done = False

    while elapsed_wait < max_startup_wait:
        try:
            result = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
            if result.returncode == 0:
                if not initial_wait_done:
                    logger.info("Steam process detected. Waiting additional time for full initialization...")
                    initial_wait_done = True
                time.sleep(5)
                elapsed_wait += 5
                if initial_wait_done and elapsed_wait >= 15:
                    final_check = subprocess.run(['pgrep', '-f', 'steam'], capture_output=True, timeout=10, env=env)
                    if final_check.returncode == 0:
                        report("Steam started successfully.")
                        logger.info("Steam confirmed running after wait.")
                        return True
                    else:
                        logger.warning("Steam process disappeared during final initialization wait.")
                        break
            else:
                logger.debug(f"Steam process not yet detected. Waiting... ({elapsed_wait + 5}s)")
                time.sleep(5)
                elapsed_wait += 5
        except Exception as e:
            logger.warning(f"Error during Steam startup wait: {e}")
            time.sleep(5)
            elapsed_wait += 5

    report("Steam did not start within timeout.")
    logger.error("Steam failed to start/initialize within the allowed time.")
    return False
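The environment-cleaning step in `_get_clean_subprocess_env` is the piece most worth reusing: any PyInstaller-bundled app that launches system binaries hits the same problem, because PyInstaller injects `_MEIPASS` and rewrites `LD_LIBRARY_PATH` to point at its temp directory. A stripped-down, pure-dict sketch of the same idea (hypothetical helper name, not Jackify's API) is easy to unit-test:

```python
def clean_pyinstaller_env(env: dict) -> dict:
    """Return a copy of `env` without PyInstaller bundle variables.

    Child processes (such as Steam) should see the system's original
    library paths, not the ones pointing at the PyInstaller temp dir.
    """
    cleaned = dict(env)
    cleaned.pop('_MEIPASS', None)
    cleaned.pop('_MEIPASS2', None)
    if 'LD_LIBRARY_PATH_ORIG' in cleaned:
        # PyInstaller backed up the original value; restore it
        cleaned['LD_LIBRARY_PATH'] = cleaned.pop('LD_LIBRARY_PATH_ORIG')
    else:
        # No backup means the current value is PyInstaller's own; drop it
        cleaned.pop('LD_LIBRARY_PATH', None)
    return cleaned
```

Working on a plain dict rather than `os.environ` keeps the helper side-effect free, which is why the service builds the cleaned mapping once and passes it explicitly via `env=` to every `subprocess` call.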
BIN	jackify/engine/AES-CTR-Netstandard.dll	Executable file	Binary file not shown.
BIN	jackify/engine/BCnEncoder.NET.ImageSharp.dll	Executable file	Binary file not shown.
BIN	jackify/engine/BCnEncoder.dll	Executable file	Binary file not shown.
BIN	jackify/engine/Crc32.NET.dll	Normal file	Binary file not shown.
BIN	jackify/engine/DeviceId.dll	Executable file	Binary file not shown.
BIN	jackify/engine/Extractors/linux-x64/7zz	Executable file	Binary file not shown.
BIN	jackify/engine/Extractors/linux-x64/innoextract	Executable file	Binary file not shown.
BIN	jackify/engine/Extractors/mac/7zz	Normal file	Binary file not shown.
BIN	jackify/engine/Extractors/windows-x64/7z.dll	Normal file	Binary file not shown.
BIN	jackify/engine/Extractors/windows-x64/7z.exe	Normal file	Binary file not shown.
BIN	jackify/engine/Extractors/windows-x64/innoextract.exe	Normal file	Binary file not shown.
BIN	jackify/engine/F23.StringSimilarity.dll	Normal file	Binary file not shown.
BIN	jackify/engine/FluentFTP.dll	Normal file	Binary file not shown.
BIN	jackify/engine/FluentResults.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.Common.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.RegistryUtils.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.StoreHandlers.EADesktop.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.StoreHandlers.EGS.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.StoreHandlers.GOG.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.StoreHandlers.Origin.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.StoreHandlers.Steam.dll	Executable file	Binary file not shown.
BIN	jackify/engine/GameFinder.Wine.dll	Executable file	Binary file not shown.
BIN	jackify/engine/HtmlAgilityPack.dll	Executable file	Binary file not shown.
BIN	jackify/engine/ICSharpCode.SharpZipLib.dll	Normal file	Binary file not shown.
BIN	jackify/engine/INIFileParser.dll	Normal file	Binary file not shown.
BIN	jackify/engine/K4os.Compression.LZ4.Streams.dll	Normal file	Binary file not shown.
BIN	jackify/engine/K4os.Compression.LZ4.dll	Normal file	Binary file not shown.
BIN	jackify/engine/K4os.Hash.xxHash.dll	Normal file	Binary file not shown.
BIN	jackify/engine/Markdig.dll	Normal file	Binary file not shown.
BIN	jackify/engine/MegaApiClient.dll	Normal file	Binary file not shown.
BIN	jackify/engine/Microsoft.AspNetCore.Http.Abstractions.dll	Executable file	Binary file not shown.
BIN	jackify/engine/Microsoft.AspNetCore.Http.Extensions.dll	Executable file	Binary file not shown.
BIN	jackify/engine/Microsoft.AspNetCore.Http.Features.dll	Executable file	Binary file not shown.
BIN	jackify/engine/Microsoft.CSharp.dll	Normal file	Binary file not shown.
Binary file not shown.
BIN	jackify/engine/Microsoft.Extensions.Configuration.Binder.dll	Normal file	Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
BIN	jackify/engine/Microsoft.Extensions.Configuration.Json.dll	Normal file	Binary file not shown.
Binary file not shown.
BIN	jackify/engine/Microsoft.Extensions.Configuration.dll	Normal file	Binary file not shown.
Binary file not shown.
BIN	jackify/engine/Microsoft.Extensions.DependencyInjection.dll	Normal file	Binary file not shown.
BIN	jackify/engine/Microsoft.Extensions.Diagnostics.Abstractions.dll	Normal file	Binary file not shown.
BIN	jackify/engine/Microsoft.Extensions.Diagnostics.dll	Normal file	Binary file not shown.
Some files were not shown because too many files have changed in this diff.