Multipath problem

vladimer.gabunia
12 Posts
September 14, 2025, 11:55 pm
Hi,
Why does multipath not work?
wipefs finished with the following error: Error executing : wipefs --all /dev/sds.
When I try to run it manually:
wipefs --all /dev/sds
I get the following error:
wipefs: error: /dev/sds: probing initialization failed: Device or resource busy
My configuration is a Supermicro CSE-829U with a Supermicro CSE-847E, connected via two SAS cables from different SAS adapters.
Is this configuration valid, or is multipathing the wrong option for PetaSAN?
Thank you!
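For context on the error above: "Device or resource busy" from wipefs usually means another kernel component holds the disk, and with dual SAS paths that is typically the dm-multipath layer. A diagnostic sketch (the device name /dev/sds is taken from the post; adjust to your system):

```shell
# Show the device tree: a multipath map appears as a child of the disk,
# which explains why wipefs cannot get exclusive access to it.
lsblk /dev/sds

# List any device-mapper holders of the disk directly.
ls /sys/block/sds/holders/

# List active multipath maps and their path members (requires multipath-tools).
multipath -ll
```

If `lsblk` shows an `mpath` child under the disk, multipath has claimed it, and wipefs on the raw /dev/sdX path will fail until the map is released or the map device itself is used instead.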

admin
3,054 Posts
September 16, 2025, 6:49 pm
It is not clear why you think this is a multipath issue; it is also not clear whether regular I/O works and only wipefs is failing.
If only wipefs fails but other operations work, it is probably not related to the storage but to your client OS. For example: is the volume mounted as a filesystem? Is it part of a client OS LVM logical volume that is activated? If so, wipefs will fail, since you need to unmount the filesystem or deactivate the LVM volume first.
If it is a general error: is the cluster health OK? Is the iSCSI disk status up? Do you have good hardware? Enough memory?
If for some reason it really is related to multipath, as you say, check your client-side multipath configuration.
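The mount/LVM checks suggested above can be sketched as shell commands (the device name comes from the thread; the partition and volume group names are placeholders, not from the thread):

```shell
# Check whether the disk or its partitions are mounted, and whether LVM uses it.
lsblk -o NAME,TYPE,FSTYPE,MOUNTPOINT /dev/sds   # mounted filesystems show here
pvs 2>/dev/null | grep sds                      # is the disk an LVM physical volume?

# If a filesystem is mounted, unmount it first (placeholder partition name):
umount /dev/sds1

# If an old volume group is still active on the disk, deactivate it
# (vg_old is a hypothetical name; use whatever `vgs` reports):
vgchange -an vg_old

# After that, wiping the disk should no longer report "device busy":
wipefs --all /dev/sds
```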

vladimer.gabunia
12 Posts
September 16, 2025, 7:23 pm
Hi,
I left only one SAS connection to the JBOD and stopped the multipath service; everything works now.
It looks like a problem with the multipathing module.
But I think I have a wrong configuration for the multipath module. I don't have much experience with multipathing on SAS. Do I need a special configuration?

admin
3,054 Posts
September 16, 2025, 8:40 pm
Still not clear:
Are you referring to a client connecting to PetaSAN via multipath iSCSI, or to PetaSAN connecting to SAS drives for OSD storage?
Also, as asked previously: is the wipefs command failing while other commands run OK?

vladimer.gabunia
12 Posts
September 16, 2025, 8:47 pm
The problem was connecting PetaSAN to SAS drives for OSD.
Only wipefs failed; the other commands I tested finished successfully.

admin
3,054 Posts
September 16, 2025, 8:56 pm
Is the drive mounted on any filesystem?
Do the drives have LVM volumes (possibly from another, older system)? If yes, they need to be deactivated, or wipefs will fail with "device busy".

vladimer.gabunia
12 Posts
September 16, 2025, 9:06 pm
After disconnecting one SAS cable and disabling multipath, wipefs finished successfully and the drive was onboarded.
(After long research I found this same case described on the TrueNAS forum.)
The bad news is that my JBOD is now connected by only one SAS cable.
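For anyone hitting the same issue and wanting to keep both SAS paths: the usual options are either to point the OSD tooling at the dm-multipath device (/dev/mapper/mpathX) instead of the raw /dev/sdX path, or, if multipath is not wanted at all, to blacklist the disks in /etc/multipath.conf. A minimal blacklist sketch (the WWID and devnode patterns below are placeholders, not from this thread):

```shell
# /etc/multipath.conf -- sketch only; WWIDs and patterns are placeholders.
blacklist {
    # Blacklist a single disk by its WWID
    # (obtain it with: /lib/udev/scsi_id -g -u /dev/sds)
    wwid "36000xxxxxxxxxxxxxxxxxxxxxxxxxxxx"

    # Or blacklist whole classes of devices by node name:
    # devnode "^sd[a-z]+"
}

# Apply the change, then flush unused multipath maps:
# systemctl restart multipathd
# multipath -F
```

With the disks blacklisted, multipath releases its hold on them and wipefs can access the raw devices again, even with both cables connected.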