How do they do it? Last night I was wondering something: on a LAN, on Gbit, you can transfer data at 40-50 MB/sec to 5-10 other users, and then your system is under heavy load. But what about, for example, Eweka? Or Newshosting / Giganews?

Just for easy calculation, let's say they have 10,000 customers connected on average, each on broadband at, again for easy calculation, 500 KB/sec. Bandwidth isn't the problem, but what about the user load and HDD space? Again for easy calculation, say Usenet is 10 TB in size. With one 'heavy' desktop server you get 10-20 HDDs * 250 GB = max 5 TB, no redundancy, and a max of 50-100 users before your HDDs are overloaded. That limit is hit MUCH sooner if everyone is accessing something different on the same HDDs. Just for the content you'd need 2 of those, plus 2 backups, or a RAID 5-10 array. And that's for only 50-100 users; do that 100x for the 10,000 customers and you have A LOT of servers. And that's still very conservative, because there are far more than 10,000 users, and for 80 days of retention you'd need hundreds of TB.

So what kind of systems do they use? Of course there are big server clusters with thousands of CPUs and TBs of RAM, but I've never seen or heard of a system where you can, for example, plug in 200 HDDs in RAID 5.
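The rough numbers above can be sanity-checked in a few lines. This is just a sketch of the post's own assumptions (10,000 users, 500 KB/s each, a 10 TB feed, 250 GB disks), not real provider data:

```python
# Back-of-envelope capacity figures, using the assumptions from the post above.
USERS = 10_000      # assumed concurrent customers
RATE_KB_S = 500     # assumed per-user download rate, KB/s
FEED_TB = 10        # assumed total size of the feed
DISK_GB = 250       # assumed per-disk capacity

# Aggregate egress bandwidth across all users
aggregate_mb_s = USERS * RATE_KB_S / 1000        # 5,000 MB/s
aggregate_gbit_s = aggregate_mb_s * 8 / 1000     # 40 Gbit/s

# Disks needed just to hold one copy of the feed (no redundancy)
disks_per_copy = FEED_TB * 1000 // DISK_GB       # 40 disks

print(f"{aggregate_gbit_s:.0f} Gbit/s aggregate, {disks_per_copy} disks per copy")
```

So even with these conservative figures, the aggregate bandwidth alone is on the order of 40 Gbit/s, and a single copy of the feed already needs dozens of disks before any redundancy or retention is considered.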
Aha, those are the things I had in mind, but I had no idea they existed. "The item you have selected cannot be ordered online. Please call your representative to place an order." FU Dell! :p
For 100 TB you're looking at close to €30,000, I think.
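That estimate works out to roughly €300 per TB; a quick check (the €30,000 and 100 TB figures are this poster's rough guess, the per-TB price is just the division):

```python
# Rough cost-per-TB from the estimate above (guessed figures, not a quote)
total_eur = 30_000
capacity_tb = 100
print(total_eur / capacity_tb)  # 300.0 EUR per TB
```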