
UCBerkeley to remove 10k hours of lectures posted on Youtube


35 Comments

  1. It looks like the Wayback Machine is grabbing these currently.

  2. > recent findings by the Department of Justice which suggests that the YouTube and iTunesU content meet higher accessibility standards as a condition of remaining publicly available.

    Oh ffs.

  3. I’m not currently in a position to help seed, but I’m commenting here to express my interest in doing so – and to light a fire under my ass to get things set up if I can.

  4. I’m going to try to get the best quality I can of this resource. However, I do not want to hold it forever. Who can I contact at the Archive Team to help?

  5. Why are they doing this? The explanation is complete bullshit and corporate double talk. Removing access to improve accessibility?? Wtf?

  6. 10K hours to deliver these classes, plus how many implied hours for the professors to prepare them, all at an elite public university.

    And we’re supposed to believe that the law requires the UC to *suppress* them?

    Skilled Redditors, preserve this knowledge as if it were destined for the Library of Alexandria.

    History will remember your efforts.

  7. Here is what I believe is the full list of videos and playlists, in JSON and CSV formats:

    http://www.yourfilelink.com/get.php?fid=1328657

    There were some duplicate playlist names, so in the files I have an ID field split by “–”, e.g.:
    <playlist title>–<playlistID>–<videoitemID>

    Computer Science 198, 032 – Spring 2015–PL-XXv-cvA_iASSQV5cPTPIPiwgIAjaR6b–UEwtWFh2LWN2QV9pQVNTUVY1Y1BUUElQaXdnSUFqYVI2Yi41NkI0NEY2RDEwNTU3Q0M2

    There are 14322 videos and 428 playlists listed in the data files

    Please note that there are duplicate videos when only looking at the videoID field. There are only 9743 videos if the duplicates are removed.

    This data was pulled using the YouTube API. (A rough sketch of this kind of listing script follows after the thread below.)

  8. Someone here already archived it, but they stopped seeding. I only have 800GB of the 3.5TB I think it was.

  9. Shit, need to automate this somehow with Clouddrive and Google Drive.

    ARCHIVE ALL THE THINGS!!!!!!

  10. I had mentioned this in a reply but wanted to check with its own reply to the OP:
    Why don’t we just upload all of them to another YT account and mark as unlisted? With only a week or so to gather the info this would at least give more time for others to hoard it. I can’t imagine YT would detect it. I may be ignorant to how YT works though.

  11. > As part of the campus’s ongoing effort to improve the accessibility of online content

    So their first step to improve accessibility is to make everything inaccessible?

  12. Anyone who is currently doing this, please post your youtube-dl command-line string; it may be useful for others. (A generic sketch follows after the thread below.)

  13. Interesting, I may look into that. The download of all the videos has been running since this morning. Thankfully I have a pretty decent download speed.

  14. Started downloading using /u/YouTubeBackups’ command line. Have about 1.5TB free on the box, downloading at 40MB/s. We’ll see how long it takes to fill.

  15. FUCK, lost my 3TB seedbox at the worst time possible, wtf.. I’ll still try my best to capture this. Thanks for the heads up fam

  16. I can only contribute 500GB max atm. What would be the best method to distribute archiving efforts like this?

  17. To all the people in this thread who are archiving all this material, thank you.

    I only have a broken 1TB drive, so I can’t archive…

  18. Low on space at the moment; I may get the audio-only ones if they look valuable.

  19. Any guides out there on how to download these with a script in the highest quality? I’d do it and host it on FTP or via BitTorrent.

  20. Skipped around in some of the videos and found several going up to 1440p.
    [example](https://www.youtube.com/watch?v=v8DfRYUG4MM)

    So the total could be a bit more than 4TB :D.

    It probably doesn’t fall under the legacy criteria (3-10 years old) though.

  21. I could try and download them all, then upload the videos to my Google Drive account and share it from there.

  22. Would anybody be willing to help save these videos?

    https://www.youtube.com/user/UCBerkeley/featured

    The channel has 9897 videos, each around an hour long, on subjects ranging from computer science to law. AFAIK that’s around 4TB of 480p video. As an outsider to this sub, that’s more space than I have at all, never mind available to dedicate.

    There’s over a month left before they get deleted, which should be enough time to download every video, but I’m not sure whether YouTube has protections against scripting youtube-dl to download an entire channel. If so, it might take a lot of manual effort to download everything.

    What kind of service is best for creating a backup? Torrenting would allow for distributed backups, but it’s legally risky and may end up seedless quickly.

    Also, I’m not sure whether this belongs here or on /r/archiveteam.
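
As a reference for comment 7 above: a minimal sketch, assuming the YouTube Data API v3 via google-api-python-client, of how a playlist/video listing like that could be pulled and written to CSV. The API key, channel ID, and output filename are placeholders, and the column layout only mirrors the “–”-joined format described there, not the exact files that were shared.

```python
# Hypothetical sketch: enumerate every playlist and video on a channel with
# the YouTube Data API v3 and dump them to CSV. API_KEY and CHANNEL_ID are
# placeholders, not the real Berkeley values.
import csv

from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_API_KEY"        # placeholder
CHANNEL_ID = "YOUR_CHANNEL_ID"  # placeholder

youtube = build("youtube", "v3", developerKey=API_KEY)

def paged(list_method, **kwargs):
    """Yield items from every page of a *.list() endpoint."""
    token = None
    while True:
        params = dict(kwargs, maxResults=50)
        if token:
            params["pageToken"] = token
        response = list_method(**params).execute()
        for item in response.get("items", []):
            yield item
        token = response.get("nextPageToken")
        if not token:
            break

with open("videos.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["playlist_title", "playlist_id", "playlist_item_id",
                     "video_id", "video_title"])
    # Every playlist owned by the channel, then every video item inside it.
    for pl in paged(youtube.playlists().list, part="snippet", channelId=CHANNEL_ID):
        for it in paged(youtube.playlistItems().list, part="snippet", playlistId=pl["id"]):
            writer.writerow([
                pl["snippet"]["title"],
                pl["id"],
                it["id"],                                # playlist-item ID
                it["snippet"]["resourceId"]["videoId"],  # underlying video ID
                it["snippet"]["title"],
            ])
```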
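And for the youtube-dl requests in comments 12 and 19: a minimal sketch using youtube-dl’s Python embedding API (equivalent flags exist on the command line). The output paths, archive file, and format selection are illustrative assumptions, not the command line /u/YouTubeBackups actually posted; merging bestvideo+bestaudio requires ffmpeg.

```python
# Hypothetical sketch of a channel-wide youtube-dl grab at best quality,
# with resumable bookkeeping. Paths and options are examples only.
import youtube_dl  # pip install youtube-dl

ydl_opts = {
    # Best available video+audio, falling back to the best single file.
    "format": "bestvideo+bestaudio/best",
    # One folder per playlist, numbered files, so the channel layout survives.
    "outtmpl": "UCBerkeley/%(playlist)s/%(playlist_index)s - %(title)s.%(ext)s",
    # Remember what has already been fetched so the job can resume.
    "download_archive": "downloaded.txt",
    "ignoreerrors": True,    # keep going past private/removed videos
    "writesubtitles": True,  # grab captions where they exist
    "writeinfojson": True,   # keep per-video metadata alongside the file
}

with youtube_dl.YoutubeDL(ydl_opts) as ydl:
    # Point it at the channel's uploads; youtube-dl expands this to every video.
    ydl.download(["https://www.youtube.com/user/UCBerkeley/videos"])
```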