[Bug] Storage information not accurate on BTRFS RAID (Type 5) #980
Comments
Hi there! The minor difference in the values is probably due to different units used (GiB vs GB). About the split-view: That is definitely wrong. Can you please update your application and then provide the output of the following command?
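To illustrate the units point: the same raw byte count reads noticeably smaller in binary (TiB) units than in decimal (TB) units. A small sketch (mine, not dashdot code; the `both` helper is hypothetical) using the raw sizes from the dump below:

```javascript
// Hypothetical helper showing how one byte count reads differently in
// decimal (TB) vs binary (TiB) units.
const TB = 1e12;        // decimal terabyte
const TiB = 1024 ** 4;  // binary tebibyte

function both(bytes) {
  return `${(bytes / TB).toFixed(2)} TB = ${(bytes / TiB).toFixed(2)} TiB`;
}

// Raw sizes taken from the report below:
console.log(both(18000207937536)); // one WDC drive: 18.00 TB = 16.37 TiB
console.log(both(2000398934016));  // Samsung NVMe:   2.00 TB =  1.82 TiB
```

This is also where the "16 TiB" in the split view comes from: it is the 18 TB drive capacity rendered in binary units.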
yarn run v1.22.19
Output:

const disks = [
{
device: '/dev/sda',
type: 'HD',
name: 'WDC WD181KFGX-68',
vendor: 'Western Digital',
size: 18000207937536,
bytesPerSector: null,
totalCylinders: null,
totalHeads: null,
totalSectors: null,
totalTracks: null,
tracksPerCylinder: null,
sectorsPerTrack: null,
firmwareRevision: '0A83',
serialNum: '',
interfaceType: 'SATA',
smartStatus: 'unknown',
temperature: null
},
{
device: '/dev/sdb',
type: 'HD',
name: 'WDC WD181KFGX-68',
vendor: 'Western Digital',
size: 18000207937536,
bytesPerSector: null,
totalCylinders: null,
totalHeads: null,
totalSectors: null,
totalTracks: null,
tracksPerCylinder: null,
sectorsPerTrack: null,
firmwareRevision: '0A83',
serialNum: '',
interfaceType: 'SATA',
smartStatus: 'unknown',
temperature: null
},
{
device: '/dev/sdc',
type: 'HD',
name: 'WDC WD181KFGX-68',
vendor: 'Western Digital',
size: 18000207937536,
bytesPerSector: null,
totalCylinders: null,
totalHeads: null,
totalSectors: null,
totalTracks: null,
tracksPerCylinder: null,
sectorsPerTrack: null,
firmwareRevision: '0A83',
serialNum: '',
interfaceType: 'SATA',
smartStatus: 'unknown',
temperature: null
},
{
device: '/dev/sdd',
type: 'HD',
name: 'WDC WD181KFGX-68',
vendor: 'Western Digital',
size: 18000207937536,
bytesPerSector: null,
totalCylinders: null,
totalHeads: null,
totalSectors: null,
totalTracks: null,
tracksPerCylinder: null,
sectorsPerTrack: null,
firmwareRevision: '0A83',
serialNum: '',
interfaceType: 'SATA',
smartStatus: 'unknown',
temperature: null
},
{
device: '/dev/nvme0n1',
type: 'NVMe',
name: 'Samsung SSD 980 PRO 2TB ',
vendor: 'Samsung',
size: 2000398934016,
bytesPerSector: null,
totalCylinders: null,
totalHeads: null,
totalSectors: null,
totalTracks: null,
tracksPerCylinder: null,
sectorsPerTrack: null,
firmwareRevision: '',
serialNum: 'S6B0NL0T615860L',
interfaceType: 'PCIe',
smartStatus: 'unknown',
temperature: null
}
]
const sizes = [
{
fs: '/dev/nvme0n1p3',
type: 'btrfs',
size: 1998694907904,
used: 236039712768,
available: 1760865275904,
use: 11.82,
mount: '/',
rw: true
},
{
fs: 'efivarfs',
type: 'efivarfs',
size: 262144,
used: 54272,
available: 202752,
use: 21.12,
mount: '/mnt/host/sys/firmware/efi/efivars',
rw: false
},
{
fs: '/dev/nvme0n1p2',
type: 'ext4',
size: 1020702720,
used: 388988928,
available: 561250304,
use: 40.94,
mount: '/mnt/host/boot',
rw: true
},
{
fs: '/dev/nvme0n1p1',
type: 'vfat',
size: 627900416,
used: 18206720,
available: 609693696,
use: 2.9,
mount: '/mnt/host/boot/efi',
rw: true
},
{
fs: '/dev/sdc1',
type: 'btrfs',
size: 72000827555840,
used: 51600711196672,
available: 2409158074368,
use: 95.54,
mount: '/mnt/host/mnt/cryo',
rw: true
}
]
const blocks = [
{
name: 'nvme0n1',
type: 'disk',
fsType: '',
mount: '',
size: 2000398934016,
physical: 'SSD',
uuid: '',
label: '',
model: 'Samsung SSD 980 PRO 2TB',
serial: 'S6B0NL0T615860L ',
removable: false,
protocol: 'nvme',
group: '',
device: '/dev/nvme0n1'
},
{
name: 'sda',
type: 'disk',
fsType: 'btrfs',
mount: '',
size: 18000207937536,
physical: 'HDD',
uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd',
label: '',
model: 'WDC WD181KFGX-68',
serial: '',
removable: false,
protocol: 'sata',
group: '',
device: '/dev/sda'
},
{
name: 'sdb',
type: 'disk',
fsType: 'btrfs',
mount: '',
size: 18000207937536,
physical: 'HDD',
uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd',
label: '',
model: 'WDC WD181KFGX-68',
serial: '',
removable: false,
protocol: 'sata',
group: '',
device: '/dev/sdb'
},
{
name: 'sdc',
type: 'disk',
fsType: '',
mount: '',
size: 18000207937536,
physical: 'HDD',
uuid: '',
label: '',
model: 'WDC WD181KFGX-68',
serial: '',
removable: false,
protocol: 'sata',
group: '',
device: '/dev/sdc'
},
{
name: 'sdd',
type: 'disk',
fsType: '',
mount: '',
size: 18000207937536,
physical: 'HDD',
uuid: '',
label: '',
model: 'WDC WD181KFGX-68',
serial: '',
removable: false,
protocol: 'sata',
group: '',
device: '/dev/sdd'
},
{
name: 'zram0',
type: 'disk',
fsType: 'swap',
mount: '[SWAP]',
size: 8589934592,
physical: 'SSD',
uuid: '7bcab061-21a3-4241-b6e3-bfe482fa018d',
label: 'zram0',
model: '',
serial: '',
removable: false,
protocol: '',
group: '',
device: '/dev/zram0'
},
{
name: 'nvme0n1p1',
type: 'part',
fsType: 'vfat',
mount: '/mnt/host/boot/efi',
size: 629145600,
physical: '',
uuid: '3671-6189',
label: '',
model: '',
serial: '',
removable: false,
protocol: 'nvme',
group: '',
device: '/dev/nvme0n1'
},
{
name: 'nvme0n1p2',
type: 'part',
fsType: 'ext4',
mount: '/mnt/host/boot',
size: 1073741824,
physical: '',
uuid: '7b45994d-fe45-406d-b7f8-1ec605f6dcd7',
label: '',
model: '',
serial: '',
removable: false,
protocol: 'nvme',
group: '',
device: '/dev/nvme0n1'
},
{
name: 'nvme0n1p3',
type: 'part',
fsType: 'btrfs',
mount: '/etc/hosts',
size: 1998694907904,
physical: '',
uuid: 'b795618d-1f46-4658-b196-69c7d4348e40',
label: 'fedora_localhost-live',
model: '',
serial: '',
removable: false,
protocol: 'nvme',
group: '',
device: '/dev/nvme0n1'
},
{
name: 'sdc1',
type: 'part',
fsType: 'btrfs',
mount: '/mnt/host/mnt/cryo',
size: 18000205840384,
physical: '',
uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd',
label: '',
model: '',
serial: '',
removable: false,
protocol: '',
group: '',
device: '/dev/sdc'
},
{
name: 'sdd1',
type: 'part',
fsType: 'btrfs',
mount: '',
size: 18000205840384,
physical: '',
uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd',
label: '',
model: '',
serial: '',
removable: false,
protocol: '',
group: '',
device: '/dev/sdd'
}
]
I don't know how to fix this exactly, as it also shows up incorrectly in btrfs itself:

$ btrfs fi df /mnt/my_mount | grep RAID5
Data, RAID5: total=48.96TiB, used=46.79TiB
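The ~49 TiB that btrfs reports is consistent with plain RAID 5 arithmetic: with n equal drives, one drive's worth of space goes to parity, leaving (n - 1) * driveSize usable. A quick sketch (mine, not from dashdot) using the drive sizes from the dump above:

```javascript
// RAID 5 keeps one drive's worth of parity, so usable capacity for n
// equal drives is (n - 1) * driveSize. Sizes from the report above.
const driveBytes = 18000207937536; // one WDC WD181KFGX-68
const n = 4;                       // four drives in the array
const usable = (n - 1) * driveBytes;

const TiB = 1024 ** 4;
console.log((usable / TiB).toFixed(2) + " TiB"); // ≈ 49.11 TiB
// ...which matches the ~49 TiB from `btrfs fi usage`, while the larger
// figure shown in the UI is the raw sum of all four drives, parity included.
```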
Okay, I see. Unfortunately, there is not much I can fix right now, because the mounted path is directly assigned to the corresponding disk entry.

That link is missing on your system, and I don't know why. I would suggest opening an issue on the systeminformation repository. In the meantime, I would suggest using the following variables to get a somewhat desired result:
It will look like there are only 2 drives in your system (which is kind of correct, but it will omit the RAID info), but they should show the correct sizes.
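One possible direction for a real fix: the member devices of a multi-device btrfs volume all report the same filesystem UUID in the blocks dump above (sda, sdb, sdc1, and sdd1 share one UUID), so grouping by UUID would identify them as a single logical volume. A sketch of that idea (mine, not dashdot code; the blocks data is shortened from the dump above):

```javascript
// Group block devices by filesystem UUID. Members of a multi-device
// btrfs volume share one UUID, so any group with more than one device
// is a single logical volume and should only be counted once.
const blocks = [
  { device: '/dev/sda',       fsType: 'btrfs', uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd' },
  { device: '/dev/sdb',       fsType: 'btrfs', uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd' },
  { device: '/dev/sdc1',      fsType: 'btrfs', uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd' },
  { device: '/dev/sdd1',      fsType: 'btrfs', uuid: 'a5850a6f-adba-4e34-8fe4-6f191813d5cd' },
  { device: '/dev/nvme0n1p3', fsType: 'btrfs', uuid: 'b795618d-1f46-4658-b196-69c7d4348e40' },
];

const volumes = new Map();
for (const b of blocks) {
  if (!b.uuid) continue;
  if (!volumes.has(b.uuid)) volumes.set(b.uuid, []);
  volumes.get(b.uuid).push(b.device);
}

for (const [uuid, devices] of volumes) {
  console.log(`${uuid}: ${devices.length} device(s)`);
}
// The RAID volume's UUID maps to 4 devices; the NVMe btrfs root to 1.
```

The catch, as noted above, is that on this system the UUID link is missing for some of the member devices (sdc and sdd report an empty uuid at the disk level), which is why this reporter thinks the upstream library is the right place to file the issue.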
@MauriceNino I normally would open an issue, but I don't know anything about systeminformation or what it is, so I don't feel too comfortable making a bug report. If you do so though, feel free to tag me and I can provide help.
@zwimer I have created an issue for you. If you could please comment all the information that might help over there, that would be great. What would help:
Description of the bug
I have a BTRFS RAID 5 array.

On split view: most of the HDDs hit bug #935, but the one that does not lists the array as 47.6 TiB / 16 TiB full.

On non-split view: the pie chart shows 47.6 TiB / 67.3 TiB.

The first is showing used of the entire RAID array / capacity of the individual HDD. The second is showing used of the entire RAID array / sum(capacity of each HDD in the RAID array). Neither shows used / capacity of the RAID array.

I'm not sure how to handle this exactly, as df -Thl itself reports 47.6 TiB / 67.3 TiB; I know btrfs fi usage /mnt/foo shows the correct amount of 49 TiB. But even if this is a won't-fix, I think at least the split-view display value is incorrect.

How to reproduce
Relevant log output
Info output of dashdot cli
What browsers are you seeing the problem on?
Chrome
Where is your instance running?
Linux Server
Additional context
In docker