faster?

Forum rules
Discuss features as they are added to the new version. Give us your feedback. Don't post bug reports, feature requests, support questions or suggestions here. Feature requests are closed.
Administrater
Registered User
Posts: 19
Joined: Thu May 12, 2005 12:30 pm

Re: faster?

Post by Administrater »

"GZIP executes all available bits until they are in half, and then adds them
together in a special formula. Then the browser records these bits with an
asynchronous time scheme. It also creates a CRC57 hash of the encrypted
resource times and expels them into /dev/null."
That's the wrong info, so I will clear it up.

Basically, GZIP was invented to make web pages smaller when they are sent
from the server; the browser then decompresses them.
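
As an illustration (not phpBB's actual code), a PHP script can gzip its own output with the built-in ob_gzhandler output callback, assuming the zlib extension is available:

<?php
// Sketch: compress this script's output with gzip when the browser
// supports it. ob_gzhandler inspects the Accept-Encoding request header
// and sends plain, uncompressed output if the browser doesn't accept gzip.
if (!ob_start('ob_gzhandler')) {
    ob_start(); // zlib not available; buffer output uncompressed
}

echo '<html><body>';
echo str_repeat('<p>Some highly compressible page content.</p>', 100);
echo '</body></html>';

ob_end_flush(); // send the (possibly gzipped) buffer to the client
?>

phpBB itself has a GZip compression switch in the admin panel that does much the same thing; the snippet above just shows the underlying mechanism.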

dmaj007
Registered User
Posts: 52
Joined: Mon Aug 25, 2003 9:31 pm

Re: faster?

Post by dmaj007 »

Let me make a grossly exaggerated scenario. Let's say we are given two supercomputers, both fully set up and expanded Blue Gene/L machines. The server machine will use its algorithm to turn the page into something much smaller. Smaller means less data to transfer. With our horribly powerful machine we see literally no change in processing time, but we now have a change in bandwidth (it's lower) and a change in data length (it's less). Our gzipped file zips across the network to the other Blue Gene/L, which receives the smaller file. Decompressing it adds processing time but decreases the time required to retrieve the file: a net gain in speed.

If your computers can crunch the algorithm faster than they can send and receive data, gzip will make things faster. If the computers are slow relative to the network, it can take longer.
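
To put rough numbers on that trade-off, here is a hypothetical back-of-the-envelope calculation in PHP; the page size, link speed, compression ratio, and CPU overhead are all made-up illustrative figures, not measurements:

<?php
// All figures hypothetical, for illustration only.
$page_bytes   = 100 * 1024; // 100 KB uncompressed page
$ratio        = 0.3;        // gzip leaves ~30% of the original size
$bandwidth    = 16 * 1024;  // bytes/second over a slow link
$cpu_overhead = 0.02;       // seconds to compress and decompress

$plain  = $page_bytes / $bandwidth;
$zipped = ($page_bytes * $ratio) / $bandwidth + $cpu_overhead;

printf("plain: %.2fs  gzipped: %.2fs\n", $plain, $zipped);
// plain: 6.25s  gzipped: 1.90s -- compression wins on the slow link
?>

On a fast LAN the $cpu_overhead term starts to matter; on a slow link the transfer term dominates, which is the point above.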

Spectral Dragon
Registered User
Posts: 208
Joined: Mon Feb 16, 2004 1:45 pm
Location: Milan, MI

Re: faster?

Post by Spectral Dragon »

I think it's just a matter of them updating this board, if they have.

Kib_Tph
Registered User
Posts: 1
Joined: Thu Mar 28, 2002 6:21 pm

Re: faster?

Post by Kib_Tph »

GZIP compression makes pages quite a lot smaller. First of all this saves bandwidth, because less data is sent from the server, but it also means that pages load faster for the client.

The compression does cost some CPU power, but in a lot of cases this isn't really a problem, because the shorter download time means responses finish sooner, which in turn saves a lot of memory.

For instance, in an exaggerated example:

It takes 1 second to generate a page, which then takes another second to send to a user. So the whole process, from requesting a page to having actually sent it, takes 2 seconds.

Now if you enable gzip, it may take 1.1 seconds to produce the page: 1 second for the generation plus 10%, or 0.1 seconds, to gzip the data. Say it compresses the page by 50%; it will then take only 0.5 seconds for the client to download it.
The entire process now takes only 1.6 seconds instead of the 2 seconds it did before.
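
The same arithmetic as a tiny PHP sketch, with the example's numbers pulled out as variables so you can plug in your own:

<?php
// The numbers from the example above; all of them hypothetical.
$generate = 1.0;              // seconds to generate the page
$gzip     = 0.1 * $generate;  // gzip overhead: 10% of generation time
$transfer = 1.0;              // seconds to send the page uncompressed
$ratio    = 0.5;              // gzip halves the page size

$before = $generate + $transfer;                  // 2.0 seconds
$after  = $generate + $gzip + $transfer * $ratio; // 1.6 seconds

printf("without gzip: %.1fs, with gzip: %.1fs\n", $before, $after);
?>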

This actually saves a lot of memory, because the time that data lives on the system is shortened, and fewer webserver processes are required to serve the same number of pages.


p.s.


Another great way to greatly speed up a PHP system is to compile all PHP pages; this can give a speed increase of up to 5 times! http://eaccelerator.net/
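
If you want to check whether such a cache is actually active, here is a plain-PHP sketch using the standard extension_loaded() function (I believe eAccelerator registers itself under the name 'eaccelerator'):

<?php
// Sketch: report whether an opcode cache (here eAccelerator) is loaded.
if (extension_loaded('eaccelerator')) {
    echo "eAccelerator is active: scripts are served from the compiled cache\n";
} else {
    echo "No opcode cache: PHP recompiles every script on each request\n";
}
?>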

dmaj007
Registered User
Posts: 52
Joined: Mon Aug 25, 2003 9:31 pm

Re: faster?

Post by dmaj007 »

Opcode caching is great, sometimes. It makes the environment quite a bit stricter, and you DO write better code because of it (you release things that you no longer need, like query data, etc.). Other times it just breaks software. Not good.
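
As a sketch of the kind of cleanup meant here, using the old mysql_* functions of that era (the query and table name are made up):

<?php
// Sketch: free query results as soon as you are done with them,
// instead of letting them live until the request ends.
$result = mysql_query('SELECT id, title FROM topics');

while ($row = mysql_fetch_assoc($result)) {
    echo $row['title'], "\n";
}

mysql_free_result($result); // release the query data explicitly
unset($result);             // drop the now-useless handle
?>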
