How do I determine the total size of a directory (folder) from the command line?
Is there a simple command to display the total aggregate size (disk usage) of all files in a directory (folder)?
I have tried these, and they don't do what I want:
- ls -l, which only displays the size of the individual files in a directory
- df -h, which only displays the free and used space on my disks
filesystem command-line
Related: How to list recursive file sizes of files and directories in a directory?
– Peter Mortensen
Nov 26 '17 at 23:12
Cross-site duplicate: How do I get the size of a directory on the command line?
– Peter Mortensen
Nov 26 '17 at 23:17
asked Aug 5 '10 at 18:20 by David Barry, edited Feb 13 '14 at 9:44 by kiri
12 Answers
The command du "summarizes disk usage of each FILE, recursively for directories", e.g.:
du -hs /path/to/directory
- -h is to get the numbers "human readable", e.g. get 140M instead of 143260 (size in KBytes)
- -s is for summary (otherwise you'll get not only the size of the folder but also the size of everything in the folder separately)
As you're using -h, you can sort the human-readable values using
du -h | sort -h
The -h flag on sort will consider "human readable" size values.
If you want to avoid recursively listing all files and directories, you can supply the --max-depth parameter to limit how many items are displayed. Most commonly, --max-depth=1:
du -h --max-depth=1 /path/to/directory
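Putting those options together, a typical invocation and some made-up output (paths and sizes are for illustration only) look like this:
$ du -h --max-depth=1 /path/to/directory | sort -h
4.0K    /path/to/directory/empty-dir
36M     /path/to/directory/photos
1.2G    /path/to/directory/videos
1.3G    /path/to/directory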
respect for going the extra mile. :)
– myusuf3
Aug 5 '10 at 18:37
I use du -sh or DOOSH as a way to remember it (NOTE: the command is the same, just the organization of command-line flags for memory purposes)
– Marco Ceppi♦
Aug 5 '10 at 18:56
There is a useful option to du called --apparent-size. It can be used to find the actual size of a file or directory (as opposed to its footprint on the disk); e.g., a text file with just 4 characters will occupy about 6 bytes, but will still show up as taking up ~4K in regular du -sh output. However, if you pass the --apparent-size option, the output will be 6. man du says: --apparent-size: print apparent sizes, rather than disk usage; although the apparent size is usually smaller, it may be larger due to holes in ('sparse') files, internal fragmentation, indirect blocks
– Hopping Bunny
Jun 3 '16 at 4:45
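To see the difference described in the comment above, here is a minimal sketch (the file name is arbitrary; the 4096-byte figure assumes a typical ext4 block size):
$ printf 'hello' > tiny.txt
$ du -B1 tiny.txt    # allocated size on disk: usually one whole filesystem block
4096    tiny.txt
$ du -b tiny.txt     # -b is shorthand for --apparent-size --block-size=1
5       tiny.txt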
This works for OS X too! Thanks, I was really looking for a way to clear up files, both on my local machine and my server, but automated methods seemed not to work. So, I ran du -hs * and went into the largest directory and found out which files were so large... This is such a good method, and the best part is you don't have to install anything! Definitely deserved my upvote
– Dev
Sep 14 '16 at 17:57
@BandaMuhammadAlHelal I think there are two reasons: rounding (du has somewhat peculiar rounding, showing no decimals if the value has more than one digit in the chosen unit), and the classical 1024 vs. 1000 prefix issue. du has an option -B (or --block-size) to change the units in which it displays values, or you could use -b instead of -h to get the "raw" value in bytes.
– Marcel Stimberg
May 23 '17 at 9:12
Recently I found a great, ncurses-based interactive tool that quickly gives you an overview of directory sizes. I had searched for that kind of tool for years.
- quickly drill down through the file hierarchy
- you can delete e.g. huge temporary files from inside the tool
- extremely fast
Think of it as baobab for the command line:
apt-get install ncdu
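A minimal usage sketch (the path is a placeholder; -x keeps the scan from crossing into other mounted filesystems):
sudo apt-get install ncdu
ncdu /path/to/directory
ncdu -x /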
This is absolutely fantastic! Like DaisyDisk, for OSX
– subZero
Jun 6 '14 at 19:58
ncdu is awesome! After installing it, just do this: ncdu /. You will very quickly find the biggest files on the system. Also press h while inside ncdu's console interface. It has very useful shortcuts
– vlad-ardelean
Dec 12 '17 at 16:34
This is EXACTLY what I needed: so simple, so efficient. Pure love.
– user3790897
Oct 30 '18 at 17:18
This finds the size recursively and puts it next to each folder name, along with the total size at the bottom, all in human-readable format:
du -hsc *
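Sample (made-up) output when run in a home directory:
$ du -hsc *
36M     Documents
1.2G    Downloads
4.0K    notes.txt
1.3G    total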
Enjoy!
du foldername
More information on that command can be found in its manual page (man du).
tree is another useful command for this job.
Just install it via sudo apt-get install tree and type the following:
tree --du -h /path/to/directory
...
...
33.7M used in 0 directories, 25 files
From man tree:
-h     Print the size of each file but in a more human readable way, e.g. appending a size letter for kilobytes (K), megabytes (M), gigabytes (G), terabytes (T), petabytes (P) and exabytes (E).
--du   For each directory report its size as the accumulation of sizes of all its files and sub-directories (and their files, and so on). The total amount of used space is also given in the final report (like the 'du -c' command.)
Below is what I am using to print total, folder, and file size:
$ du -sch /home/vivek/* | sort -rh
Details
du options:
-c, --total               produce a grand total
-h, --human-readable      print sizes in human readable format (e.g., 1K 234M 2G)
-s, --summarize           display only a total for each argument
sort options:
-h, --human-numeric-sort  compare human readable numbers (e.g., 2K 1G)
-r, --reverse             reverse the result of comparisons
Output
70M total
69M /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/lib
992K /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/results
292K /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/target
52K /home/vivek/Downloads/gatling-charts-highcharts-bundle-2.2.2/user-files
The answers have made it obvious that du is the tool to find the total size of a directory. However, there are a couple of factors to consider:
- Occasionally, du output can be misleading because it reports the space allocated by the filesystem, which may be different from the sum of the sizes of the individual files. Typically the filesystem will allocate 4096 bytes for a file even if you stored just one character in it!
- Output differences due to power-of-2 and power-of-10 units. The -h switch to du divides the number of bytes by 2^10 (1024), 2^20 (1048576), etc. to give a human-readable output. Many people might be more habituated to seeing powers of 10 (e.g. 1K = 1000, 1M = 1000000) and be surprised by the result.
To find the total sum of sizes of all files in a directory, in bytes, do:
find <dir> -ls | awk '{ sum += $7 } END { print sum }'
Example:
$ du -s -B 1
255729664
$ find . -ls | awk '{ sum += $7 } END { print sum }'
249008169
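If your find is GNU findutils, -printf can print each regular file's size in bytes directly, which avoids parsing the -ls columns (a sketch that is equivalent in spirit to the command above):
find . -type f -printf '%s\n' | awk '{ sum += $1 } END { print sum }'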
The find-ls-awk will return a wrong value for large folders #1. For newer awk you can add --bignum or -M; if that is not an option use find . -ls | tr -s ' ' | cut -d' ' -f 7 | paste -sd+ | bc #2.
– goozez
Jun 29 '16 at 20:25
If powers of 2 being used is a problem, there's the --si option: "like -h, but use powers of 1000 not 1024"
– muru
Dec 15 '17 at 3:25
For only the directory size in a readable format, use the below:
du -hs directoryname
This probably isn't in the correct section, but from the command line, you could try:
ls -sh filename
The -s is size, and the -h is human readable.
Use -l to show it in a long ls listing, like below:
ls -shl
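A made-up listing to show the extra size column added by -s (the leftmost column is the allocated size; the usual -l byte size still appears before the date):
$ ls -shl
total 1.3M
1.2M -rw-rw-r-- 1 user user 1.2M Jan  5 10:00 video.mp4
 36K -rw-rw-r-- 1 user user  33K Jan  5 10:01 report.pdf
4.0K drwxrwxr-x 2 user user 4.0K Jan  5 10:02 notes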
du /foldername is the standard command to find out the size of a folder. It is best practice to find the options by reading the man page:
man du
You should read the man page (available online) before you use the command.
Here is a POSIX script that will work with:
- A file
- Files
- A directory
- Directories
#!/bin/sh
ls -ARgo "$@" | awk '{ q += $3 } END { print q }'
Source
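Assuming the script is saved as, say, dirsize.sh (the name is arbitrary) and made executable, it can be pointed at any mix of files and directories:
chmod +x dirsize.sh
./dirsize.sh ~/Documents
./dirsize.sh notes.txt ~/Pictures ~/Videos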
If your desired directory has many sub-directories, then use the following:
$ cd ~/your/target/directory
$ du -csh
-c, --total produce a grand total
-s, --summarize display only a total for each argument
-h, --human-readable print sizes in human readable format (e.g., 1K 234M 2G)
which would then produce an overall total of the disk usage of all files/folders in the current directory.
The best one I think is the following:
du -h directory_name | tail -n1
This will show you only the size of the directory that you are interested in and will not print sizes of any directories and files inside that directory.
I should add that if the size of the folder is large, then du takes a longer time. You must be patient for this command to work. Just like any other Unix command, you may find out the total time for this process by putting time before this command:
time du -h directory_name | tail -n1
du has an option for this: -s
– muru
Oct 4 '15 at 11:46