{"id":466,"date":"2014-11-28T09:36:48","date_gmt":"2014-11-28T09:36:48","guid":{"rendered":"http:\/\/www.tech-g.com\/?p=466"},"modified":"2024-04-03T09:26:10","modified_gmt":"2024-04-03T09:26:10","slug":"backing-up-disk-with-dd-saving-space","status":"publish","type":"post","link":"https:\/\/www.voodoo.business\/blog\/2014\/11\/28\/backing-up-disk-with-dd-saving-space\/","title":{"rendered":"Backing up disk with DD saving space"},"content":{"rendered":"\n<p>The problem with DD is that it copies the whole disk, In reality, the disk could have 10GBs but that dump file has to be of the disk size, lets say 100GBs<\/p>\n\n\n\n<p>So, how do we get a dump file that is only around 10GBs in size.<\/p>\n\n\n\n<p>The answer is simple. Compressing a zero fill file is very efficient (almost nothing).<\/p>\n\n\n\n<p>So, frst we create a zero fill file. with the following command, i recommend you stop the fill while there is still a bit of space on the disk especially if the disk has a database running that could need to insert.. 
so stop the running fill with Ctrl+C before you actually fill the whole disk.<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">cat \/dev\/zero &gt; zero3.fill;sync;sleep 1;sync;\n<\/pre>\n\n\n\n<p>At this point, you can either delete the zero3.fill file or keep it; it will not make a difference in the size of the compressed dump. Deleting it is recommended, but it won&#8217;t change much either way.<\/p>\n\n\n\n<p>Notes<br>sync flushes any remaining buffers in memory to the hard drive.<br>If the fill process stops for any reason, keep the file already written and create a second one, a third, and as many as it takes; do not delete the existing ones. Just make sure almost all of your disk&#8217;s free space ends up occupied by zero-fill files.<\/p>\n\n\n\n<p>Now, run dd with compression on the fly (so that you won&#8217;t need much space on the target drive).<\/p>\n\n\n\n<p>If you want to monitor the dump, you can use pv:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">dd if=\/dev\/sdb | pv -s SIZEOFDRIVEINBYTES | pigz --fast &gt; \/targetdrive\/diskimage.img.gz\n<\/pre>\n\n\n\n<p>Or, if you like, you can use parallel bzip2 (pbzip2) like so; in this example the source is a 2TB hard drive:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">dd if=\/dev\/sda | pv -s 2000398934016 | pbzip2 --best &gt; \/somefolder\/thefile.img.bz2<\/pre>\n\n\n\n<p>Without the monitoring:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">dd if=\/dev\/sdb | pigz --fast &gt; \/targetdrive\/diskimage.img.gz\n<\/pre>\n\n\n\n<p>Now, to dump this image back to a hard drive.<\/p>\n\n\n\n<p>Note that using pigz for the decompression here is not recommended. Something along the lines of this:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">DO NOT USE this one, use the one with gunzip<\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\">pigz -d \/hds\/www\/vzhost.img.gz | pv -s SIZEOFIMAGEINBYTES | dd of=\/dev\/sdd\n<\/pre>\n\n\n\n<p>will appear to work, but pigz -d without the -c flag decompresses the file in place rather than streaming it through the pipe, so the recommended way to do it on 
the fly is with gunzip; this also holds because parallel gzip offers no real benefit while decompressing.<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">gunzip -c \/hds\/www\/vzhost.img.gz | pv -s SIZEOFIMAGEINBYTES | dd of=\/dev\/sdb\n<\/pre>\n\n\n\n<p>Or<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">pigz -dc \/hds\/www\/vzhost.img.gz | dd of=\/dev\/sdd\n<\/pre>\n\n\n\n<p>My records<br>The following are irrelevant to you; this is strictly for my records.<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">mount -t ext4 \/dev\/sdb1 \/hds\n\ndd if=\/dev\/sdc | pv -s 1610612736000 | pigz --fast &gt; \/hds\/www\/vzhost.img.gz\n<\/pre>\n\n\n\n<p>One that covers dumping only part of a disk<\/p>\n\n\n\n<p>Assume I want to copy the first 120GB of a large drive where my Windows partition lives; I want it compressed, and I want the free space cleared.<\/p>\n\n\n\n<p>First, in Windows, use SDELETE to zero the empty space:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">sdelete -z c:<\/pre>\n\n\n\n<p>Now, attach the disk to a Linux machine and dump it:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">dd if=\/dev\/sdb bs=512 count=235000000 | pigz --fast &gt; \/hds\/usb1\/diskimage.img.gz\ndd if=\/dev\/sdb bs=512 count=235000000 | pbzip2 &gt; \/hds\/usb1\/diskimage.img.bz2\n<\/pre>\n\n\n\n<p>If it is an Advanced Format (4K-sector) disk, you would probably do<br>dd if=\/dev\/sdb of=\/hds\/usb1\/firstpartofdisk.img bs=4096 count=29000000<\/p>\n\n\n\n<p>or something like that.<\/p>\n\n\n\n<p>Now, if we have a disk image with the extension (.bin.gz) and we want to extract it to a different directory, we can pipe it as follows:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">gunzip -c \/pathto\/my_disk.bin.gz &gt; \/targetdir\/my_disk.bin<\/pre>\n","protected":false},"excerpt":{"rendered":"<p>The problem with dd is that it copies the whole disk: the disk may hold only 10GB of actual data, but the dump file still has to be the full disk size, let&#8217;s say 100GB. So, how do we get a dump file that is only around 10GB in size? 
The answer is simple. Compressing a zero fill [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-466","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/posts\/466","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/comments?post=466"}],"version-history":[{"count":12,"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/posts\/466\/revisions"}],"predecessor-version":[{"id":3444,"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/posts\/466\/revisions\/3444"}],"wp:attachment":[{"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/media?parent=466"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/categories?post=466"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.voodoo.business\/blog\/wp-json\/wp\/v2\/tags?post=466"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}