I did file recovery on a 2TB drive and came up with 13875051.999 Petabytes of space…

40 Comments

  1. 13.8 Zettabytes. It defeats the purpose of using the shorthand prefix notation if the prefix is not between 1 and 1000.
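
    To make the prefix rule concrete, here's a minimal Python sketch (the `human_size` name and `PREFIXES` table are just illustrative, not from any particular tool):

    ```python
    # Pick the largest SI prefix that keeps the mantissa in [1, 1000),
    # as the comment suggests.
    PREFIXES = ["B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

    def human_size(num_bytes: float) -> str:
        value = float(num_bytes)
        for prefix in PREFIXES:
            if value < 1000:
                return f"{value:.1f} {prefix}"
            value /= 1000
        return f"{value:.1f} {PREFIXES[-1]}"

    # 13875051.999 PB expressed in bytes (1 PB = 10**15 B):
    print(human_size(13875051.999e15))  # -> 13.9 ZB
    ```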

  2. That’s some excellent compression you’re using.

    A deep scan looks at the entire drive and checks whether each byte sequence looks like the start of any known file type. Many do, so the scan ends up generating a bunch of fake files. Some will be real, though most won't. Some of the real files will have proper starts, but the end of the file isn't detected correctly, so you end up with a 5GB PNG file that should only be 2kB. However, if any sequences inside that 5GB match other file types, it'll report even more files, so many of the reported files overlap. If you have truly important data on that drive, you now get to go through all 7032804 files checking their starts to see if they match what you're looking for. Good luck.
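
    To make the carving idea concrete, here's a toy Python sketch of signature scanning; the `SIGNATURES` table and `carve_candidates` helper are illustrative, not any particular tool's algorithm:

    ```python
    # Toy illustration of why signature scanning over-reports: every
    # magic-number match becomes a candidate file, even when it sits
    # inside another file's data, so candidates overlap and pile up.
    SIGNATURES = {
        b"\x89PNG\r\n\x1a\n": "png",
        b"\xff\xd8\xff": "jpg",
        b"PK\x03\x04": "zip",
    }

    def carve_candidates(raw: bytes):
        """Yield (offset, type) for every signature match, overlaps included."""
        for magic, ftype in SIGNATURES.items():
            start = 0
            while (hit := raw.find(magic, start)) != -1:
                yield hit, ftype
                start = hit + 1  # keep scanning inside the match

    # A PNG whose payload happens to contain a JPEG magic number
    # is reported as two "files":
    blob = b"\x89PNG\r\n\x1a\n" + b"..." + b"\xff\xd8\xff" + b"..."
    print(sorted(carve_candidates(blob)))  # [(0, 'png'), (11, 'jpg')]
    ```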

  3. Man, this brings back memories…. DoubleSpace for DOS, anyone? Basically torture, with a format pending.

  4. That’s OK. The portion of the Government entrusted with inspecting, verifying, and certifying the software we run once told me they couldn’t decompress my zip file because it was 6.7 petabytes. They said it was more disk than they had, and if that was required we needed to talk.

    …. the same ones that couldn’t launch a VM because they’d gotten new hardware/software and didn’t have the associations set.

  5. Just need another 2TB drive to restore the free petabytes onto. Shouldn’t take that long.

  6. Boy, I bet you can fit a lot of 1K demos on that bad boy. Wait till the guys at my local BBS hear about this.

  7. Damn! My external drive recoveries usually go the complete opposite way. It runs forever and comes up with way more data than the drive can hold, then I have to run 10 simultaneous dupeGuru windows and my Ryzen 9 PC sounds like it’s gonna fly away with the fans. Bahaha

  8. One of my coworkers did a deep scan with Recuva on a 16GB thumb drive that had lost its partition information and walked away with 91GB of data. As she was going through it, she recognized things she had deleted long ago. lol

  9. Same happened with mine. If you look at the files, what’s happened is that there are fragments of a file that get picked up as the original file. So I’d have the complete file, and then 5+ versions that were just fragments. However, it still reported them at the original file size, so adding them up goes far beyond the size of the drive.

    I suspect it’s from defragmenting leaving little bits of data around that the deep scan picks up.

  10. You know those times Microsoft says “We’re getting things ready…”

    Well… you might want to open up your case…

  11. I started wondering about heavy use of sparse files (the file is nominally large, but mostly not backed by actual storage), or whether it is repeatedly counting hard links (and you really aggressively used them, perhaps in a snapshot backup system run amok), but I think I’ll just leave it as a simple bug.
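
    Both suspects are easy to rule in or out on a POSIX system; here's a rough sketch (the `tally` helper is hypothetical, and `st_blocks` is in 512-byte units):

    ```python
    # Compare the naive logical total against bytes actually allocated:
    # sparse files inflate st_size, and hard links get counted once per
    # path unless deduplicated by (device, inode).
    import os

    def tally(paths):
        logical = physical = 0
        seen_inodes = set()  # dedupe hard links
        for path in paths:
            st = os.stat(path)
            logical += st.st_size  # what a naive scan adds up
            key = (st.st_dev, st.st_ino)
            if key not in seen_inodes:
                seen_inodes.add(key)
                physical += st.st_blocks * 512  # bytes actually allocated
        return logical, physical
    ```

    If logical dwarfs physical, sparse files or repeated hard links are inflating the naive total.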

  12. What file recovery software? What happened to the drive, anyway?

  13. You counted wrong. 🙂 The image shows ≈13.9 EB, not 13875051.999 PB. Anyway, that’s still 2TB/file.
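
    For what it's worth, the 2TB/file figure checks out against the 7032804 files quoted earlier in the thread:

    ```python
    # Quick sanity check, taking 1 EB = 10**18 bytes:
    total_bytes = 13.9e18
    file_count = 7_032_804
    print(f"{total_bytes / file_count / 1e12:.2f} TB per file")  # 1.98 TB per file
    ```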

  14. I knew it was possible to download more RAM, but downloading more storage is a game changer!

  15. How much are you charging for cloud storage in your new business?

  16. Windows could never count properly… e.g. the time-remaining estimate in the file-copy dialog.

  17. > 13875051.999 Petabytes

    That’s a meaningless number. Just say 13.8 zettabytes next time.

  18. I can remember this image of quick and deep scan. I used the software to recover my accidentally deleted RAID array and had the same problem: 10TB of drives gave me a couple hundred TB of found files. I had a look in the quick scan folder and saw that it contained everything I deleted, so I ignored the other folders.

  19. Gzip compression has expanded to a whole new level…

    Otherwise, file headers