Shared the installed package version numbers and the source repo
Shared that the 45Drives Motherboard is the only item not fully working
Gave the use case to reproduce, as the issue happens during the “Motherboard & CPU” portion.
Shared a snapshot of the error.
Noted what is working within the 45Drives Motherboard feature.
I checked the Python script within the helper_script. I noticed that the prebuilt models for the “HL15 - Fully Built & Burned In” do not include the two motherboard models or the CPU model.
I did not debug the problem further, as I assume more updates are needed than just the Python script.
I will need to create all of the graphics for this motherboard as well as map out the various regions within it to add this functionality.
I have been pretty busy trying to get the new Stornado up and running and just haven’t had the time to implement this yet. I’ll update here once I do. Luckily, the two motherboard offerings are pretty similar, so I’ll get support for both the X11SPH-nCTPF and X11SPH-nCTF added in the same release.
@Hutch-45Drives How do Proxmox and Houston manage RAM? With ZFS using a lot of RAM for caching, would it reserve a set amount for the ZFS cache, or use all that’s available?
Also, to migrate from TrueNAS, is it as simple as exporting the ZFS pool and then importing it into Houston?
RAM is managed by the kernel in Proxmox (and TrueNAS, for that matter). The default value of /sys/module/zfs/parameters/zfs_arc_max on Linux should be 0, which lets ZFS use up to 50% of available RAM for the ARC, leaving the other 50% for other resources like VMs. Proxmox has a wiki page on adjusting this if you want more or less available to the ZFS ARC.
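If you want to check or adjust it yourself, something like this should work on a Proxmox host as root (the 8 GiB figure is just an example value, size it to your own workload):

```bash
# Current ARC ceiling in bytes (0 = ZFS default, roughly half of RAM on Linux)
cat /sys/module/zfs/parameters/zfs_arc_max

# Cap the ARC at 8 GiB for the running system (takes effect immediately)
echo $((8 * 1024 * 1024 * 1024)) > /sys/module/zfs/parameters/zfs_arc_max

# Make the limit persistent across reboots, then rebuild the initramfs
echo "options zfs zfs_arc_max=$((8 * 1024 * 1024 * 1024))" > /etc/modprobe.d/zfs.conf
update-initramfs -u
```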
As far as migrating your pool from TrueNAS goes, the big gotcha is the ZFS version enabled for the zpool versus what’s installed on the OS. If there’s a mismatch, you’re limited to moving toward the newer version, because the older one won’t understand flags and features built into the zpool and filesystem. On Linux, you can use zpool upgrade to see information about the pool and zfs --version to check what’s on your system (see the example commands after the version list below). Both TrueNAS and Proxmox list ZFS versions in their release notes:
TrueNAS Core 13 U6.1 uses ZFS 2.1.14
TrueNAS Scale 23.10.1 uses ZFS 2.2.2
Proxmox 8.1 uses ZFS 2.2.0
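For reference, these are roughly the checks I’d run on each box before moving the pool (the pool name tank is just a placeholder):

```bash
# ZFS version installed on the OS
zfs --version

# Pools that are not yet using all features supported by this ZFS version
zpool upgrade

# Feature flags on a specific pool (enabled / active / disabled)
zpool get all tank | grep feature@
```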
If you’re good on versioning, it should be as easy as you describe: export and then import!
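A minimal sketch of that flow, assuming a pool named tank and nothing else holding it open:

```bash
# On the old system: cleanly export the pool
zpool export tank

# On the new system: list pools that are visible for import
zpool import

# Import it by name (datasets mount automatically unless told otherwise)
zpool import tank
```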
I’m not entirely sure how “picky” ZFS is about versioning, and whether it considers things all the way down to the patch level or just the minor release. That said, my opinion is you’ll probably be fine as long as you update Proxmox before trying anything. I logged into my Proxmox cluster, which is on 8.1.4, and ZFS is at version 2.2.2 now. This is the same as Cobia. Although, I can’t guarantee anything.
No luck with the import. It’s entirely possible I moved some hardware around that messed it up though. Not a big deal, I had planned for this contingency.
Oh no! Could you see the pool to import? If you can drop down to the command line, I bet you can get more information than the generic error Cockpit/Houston will provide.
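For example, running the import scan by hand usually gives a lot more detail than the UI (the device path below is just an example):

```bash
# Show every pool ZFS can see, along with its state and any warnings
zpool import

# Scan a specific directory of device nodes if the pool isn't found
zpool import -d /dev/disk/by-id

# Dump the ZFS label on a member disk to confirm it belongs to the pool
zdb -l /dev/disk/by-id/ata-EXAMPLE-DISK-part1
```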
I could see the pool but there were errors. Not a big deal. I took the opportunity to do a clean setup and am in the process of transferring everything back over. Adding the HL15 was to help me do some reconfiguring and combining of my storage anyway.
EDIT: I had originally created these pools on TrueNAS Core, so I think that could have been the issue.
If you are importing a pool from a different system/OS, you may need to force the import from the CLI the first time the pool is imported into the new OS.
You can do this by running “zpool import -f” with the pool name changed to match your pool, and the pool should then import.
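For example, with a pool named tank (swap in your own pool name):

```bash
# Force the import because the pool was last used on another host/OS
zpool import -f tank

# Verify it came in cleanly
zpool status tank
```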
Very happy to see this! Hoping to see some love for Ubuntu 22.04 or 24.04 LTS (releasing in April) or Rocky 9. If I had the coding skills I’d be happy to help, but alas, outside of some Ansible I’m a hardware and network guy at the end of the day!
I followed the above instructions and root is not listed in “disallowed-users”, but Cockpit is still not allowing root or any other users to log in. I keep getting a “Wrong user name or password” error. Any ideas on what is going on here?
Yes to both. That’s why I am perplexed. No issues with SSH or root login in the Proxmox UI. I think I am going to reinstall everything fresh and see what happens from there.
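Before I wipe it, I’ll run a few CLI checks to see if anything obvious shows up (this assumes a stock Cockpit install on a Debian-based Proxmox host):

```bash
# Confirm root really isn't blocked from Cockpit logins
cat /etc/cockpit/disallowed-users

# Restart Cockpit after any config change
systemctl restart cockpit.socket

# Watch the authentication attempts live while trying to log in
journalctl -u cockpit -f
```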