Esato Forum > General discussions > Non mobile discussion > Is there a web page over 500Kb?

sadeghi85 Posts: 341

That way won't work. Consider a link like this: http://example.com/?fileid=35465
It can point to a file even though there is no dot in the URL, so I have to check the response headers, and "Content-Disposition" isn't always available.
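Just to illustrate, here is a rough sketch of the header check I mean, in PHP. The URL is the hypothetical example above, and I'm assuming the usual header-name casing in the response:

<?php
// Rough sketch: decide whether a URL points to a file by looking at
// the response headers instead of the URL itself.
$url = 'http://example.com/?fileid=35465';
$headers = get_headers($url, 1);   // 1 = return an associative array

$type = isset($headers['Content-Type']) ? $headers['Content-Type'] : '';
$disp = isset($headers['Content-Disposition']) ? $headers['Content-Disposition'] : '';

// After redirects these entries can be arrays; keep the last value.
if (is_array($type)) { $type = end($type); }
if (is_array($disp)) { $disp = end($disp); }

// Treat it as a file if the server says "attachment" or the type is
// clearly not an HTML page. Content-Disposition is optional, which is
// exactly the problem: this check alone isn't reliable.
$isFile = (stripos($disp, 'attachment') !== false)
       || ($type !== '' && stripos($type, 'text/html') === false);
?>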



On 2008-01-13 18:35:41, ÈL ® ö B ì Ñ wrote:
Because you don't want 5 people downloading large files, maxing out the server connection and making the site run slower for everyone else who is just doing general browsing?


Just as an example.



Using two hosts, one for large file storage and the other for general use, is a better approach IMO. I stated the disadvantages of passing files through PHP before.
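To illustrate the two-host idea (the host names here are made up), the page just links straight to the storage host, so PHP never sits between the user and the file:

<?php
// Hypothetical setup: www.example.com serves the pages,
// files.example.com serves the large files directly.
$fileId = 35465;
$downloadUrl = 'http://files.example.com/get/' . $fileId;

// The main site only prints a direct link; the web server on the
// storage host streams the file itself, so resuming and download
// managers keep working as usual.
echo '<a href="' . htmlspecialchars($downloadUrl) . '">Download</a>';
?>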
--
Posted: 2008-01-13 20:14:36

Johnex Posts: > 500

ÈL ® ö B ì Ñ is correct about my usage. Also, giving members faster download speeds is something Rapidshare uses, and it works quite nicely as an incentive to register.

I would go with what cyco suggested, but I don't see how PHP will download multiple files for you; well, to the web server it's possible. I made a PHP-based proxy a while back: it opens a socket connection to a page, downloads the page, and rewrites all the links to go through the proxy. Any files that are linked also go through the proxy without issues, and they are served with the correct content type. This could be modified to use the socket to download to the actual server, but it won't be as effective as a native C or C++ program.
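Roughly what the file pass-through part looks like, as a simplified sketch (the URL is made up, and this version uses cURL instead of the raw socket):

<?php
// Simplified sketch of passing a remote file through to the client
// with the original content type. A real proxy would also rewrite
// the links in HTML pages and stream in chunks instead of buffering
// the whole response in memory.
$url = 'http://example.com/somefile.zip';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$data = curl_exec($ch);
$type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

header('Content-Type: ' . ($type ? $type : 'application/octet-stream'));
header('Content-Length: ' . strlen($data));
echo $data;
?>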

[ This Message was edited by: Johnex on 2008-01-13 19:21 ]
--
Posted: 2008-01-13 20:20:38

sadeghi85 Posts: 341

Rapidshare encourages users to get a premium account with three strategies:
1. Passing the file through PHP (if they are using PHP):
* If the user's connection drops while downloading, he has to start the file from the beginning (see the sketch below).
** He can't use a download manager (e.g. Firefox -> FlashGot -> FlashGet), because the download manager can't reuse the socket opened by the browser; it has to open a new connection, and the PHP script treats that as a separate request.
2. Blocking the IP for, e.g., one hour.
3. Using a CAPTCHA image to prevent automation.

These restrictions are annoying and push users to register. A regular download with a download manager like FlashGet is even faster, so passing the file through PHP is there to force users to register, not to give everyone faster download speeds.
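Here is roughly what I mean by point 1 (the path and filename are made up). A plain pass-through like this never looks at the Range request header, so there is no resuming and a dropped connection means starting over:

<?php
// Sketch of "passing the file through PHP". Because the Range header
// is ignored, partial content (HTTP 206) is never sent, so download
// managers can't resume and a dropped connection starts from zero.
$path = '/storage/files/35465.zip';   // hypothetical location

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="35465.zip"');
header('Content-Length: ' . filesize($path));
readfile($path);   // streams the whole file from byte 0
?>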

Actually, I wrote the app for downloading from Rapidshare (and other hosts that don't support resuming downloads)!
So far my approach has been successful and I haven't had any problems with the app. I just asked, "Is there a web page over 500Kb?"

Thanks for the replies.
--
Posted: 2008-01-14 11:00:26

Johnex Posts: > 500

http://pizzaseo.com/google-cache-maximum-file-size

Short and concise: yes, there are.
--
Posted: 2008-01-14 15:54:32

sadeghi85 Posts: 341

Thanks.

Since Google's cache can't go over a 1MB limit, I'll set my limit to 1MB.
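A rough sketch of the cut-off, assuming allow_url_fopen is enabled (the URL and variable names are just placeholders):

<?php
// Read a page in 8 KB chunks and stop once 1 MB has been read, so a
// "page" that is really a huge file never gets downloaded in full.
$url   = 'http://example.com/somepage';
$limit = 1048576;   // 1 MB, matching the Google cache limit
$body  = '';

$fp = fopen($url, 'rb');
if ($fp) {
    while (!feof($fp) && strlen($body) < $limit) {
        $body .= fread($fp, 8192);
    }
    fclose($fp);
}

$tooLarge = strlen($body) >= $limit;   // treat it as a file, not a page
?>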

Many, many thanks, you solved my problem.
--
Posted: 2008-01-14 16:50:09
