Horse Sense #87
What is
Slowing Up Your PC?
In this issue of Horse Sense:
--Little Things Matter
--What is Slowing Up Your PC?
Little Things Matter
I went to a computer show last week. What impressed me
most were the "little things." A new lock makes it much quicker to secure
your laptop to a security cable. That seems like such a tiny thing, but if
you cut down on the amount of time it takes to secure a laptop and make it
easier, people are likely to do it more often. Laptops are stolen at an
alarming rate and the consequences of a lost laptop can be dire. If
security is easier to use, people will use it and losses will decrease.
There were also cables that allowed for right angle connections or had ends
that would twist, making connections easier. There were flat network cables
which would not show under a rug or cause a bump to trip on. There were
lots of bags, chargers, batteries, cables, screen protectors, privacy
screens, and hard drive enclosures that made it more convenient and safer to
take your mobile gear on the road. There were monitors that could be both
TV and computer monitors. There were monitors that sensed if someone was in
front of them and turned off if no one was there, saving power and
increasing security. There were monitors that adjusted themselves to
compensate for ambient lighting so they could be more easily seen. There
was equipment with very low power requirements that could save lots of money
over time. There were power devices that automatically shut off after a
period of time or would shut down unused devices that needed your computer
to be running to do anything useful. There were devices to make one
computer sharable by many users. There was a device that hooks to your
keychain and warns you if you get too far away from your cell phone so you
do not lose it. There was even a stand meant to take an iPad and make it a
touch screen monitor while charging it at the same time.
What I Like, What You Like
Here are some little things even I cannot do without: a
Bluetooth headset for talking on my cell phone; USB sticks; portable hard
drives, enclosures to make my hard drives portable, and hard drive docking
stations that connect via USB; laptop cooler/rests to keep my PC and lap
cool, one for home and one for work; extendable Ethernet cables; a power
converter for my car that I can plug my chargers and USB devices into so I
can keep my devices charged; a charger for my laptop at the office and in
my bag; an extra laptop battery for long trips; a set of ear buds so I can
listen to my "weird shows" on my computer while my wife sleeps;
uninterruptible power supplies, so I can be sure my office equipment remains
powered. You get the idea. And, that is just the hardware!
We are entering the last two months of the year. You
might want to buy someone a present (my birthday is coming up!). You might
want to buy yourself a present. You might want to take advantage of buying
it in this year rather than next so you get to take it off this year's taxes
or so you can use the generous Section 179 tax deduction. But, whatever
the reason, you should sweat the small stuff, because it is often the little
things that make life so much easier.
What is Slowing Up Your PC?
There are three things that will slow up your PC:
(1) What you or some programmer told it to do.
(2) Latency.
(3) Bandwidth.
(3) Bandwidth
Let us start with the least important factor:
bandwidth. Bandwidth is a measure of how much data can flow past a certain
point. The more that can flow past, the higher the bandwidth. The more
bandwidth you have the faster you can move information, and the more
information you can move in a given period of time, the more you can do.
Think of it like cars on a road, all traveling at the same speed. If you
have a two-lane road, only two cars can pass a point at any given time. If
you have an eight-lane road, eight can pass, so you have four times the
bandwidth.
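The arithmetic behind that analogy is simple enough to sketch in a few lines of Python (the file size and link speeds are made-up numbers for illustration):

```python
def transfer_time_seconds(size_bytes, bandwidth_bits_per_sec):
    """Time to move a file if bandwidth were the only limit."""
    return (size_bytes * 8) / bandwidth_bits_per_sec

photo = 5 * 1024 * 1024  # a 5 MB photo, in bytes
print(transfer_time_seconds(photo, 10_000_000))  # two-lane road: ~4.2 s
print(transfer_time_seconds(photo, 40_000_000))  # eight-lane road: ~1.0 s
```

Four times the lanes, one quarter the time, as long as nothing else gets in the way.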
(2) Latency
However, bandwidth is less important than latency.
Latency is a measure of how long it takes to get the information you want.
You may have heard of a PING time. Think of a PING as a round trip drive
somewhere. That time is a measure of latency. If the destination is
farther away, latency increases. If the path you take through the Internet
is a long one, regardless of the actual physical distance between the two
points, then you will have a high PING time. What networking devices really
do, though, is act like traffic cops. They take each car in turn, look up
the directions to where it should go next, and then send it on. If one of
those cops is slower than the others, it will take you longer to get to your
destination. A cop can only deal with one car at a time, so you have to
wait your turn. If you are in a long line, you may wait a while, and it
will again take you longer to get to your destination. Latency is increased
by the distance between end points, the ability to forward the information
on quickly, and the traffic and the priority of that traffic ahead of it.
In some cases, a cop might be so busy, she just cannot handle any more cars
being backed up beyond a certain point. In the real world, you would turn
around. In the data world, your car is sent to car heaven. When your data
disappears because a router cannot forward it on the Internet, it is called a
dropped packet. Now, if you were depending on a bunch of cars to get
somewhere so you could start your soccer game, you would have to realize
that someone is missing and get their family to send them again. Meanwhile
you have to wait to start the game until that car arrives. Lost packets
(cars) severely limit your ability to get your message through because you
have to realize they are not there *and* you have to get them sent again.
Lost packets increase latency so much that even a small amount of packet
loss will make very fast connections seem slow.
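The cost of a lost packet can be put into a back-of-the-envelope Python model (the round-trip time, loss rate, and retransmit timeout are invented numbers, and real protocols like TCP behave in more complicated ways):

```python
def expected_delivery_time(rtt, loss_rate, timeout):
    """Average time to get one packet through when each loss costs a
    retransmit timeout before the packet is sent again (simplified)."""
    # Expected number of failed attempts before a success: p / (1 - p)
    retries = loss_rate / (1.0 - loss_rate)
    return rtt + retries * (timeout + rtt)

# 50 ms round trip, 1 second retransmit timeout:
print(expected_delivery_time(0.050, 0.00, 1.0))   # 0.05 s with no loss
print(expected_delivery_time(0.050, 0.02, 1.0))   # ~0.071 s with 2% loss
```

In this simple model, a mere 2% loss rate raises the average delivery time by more than 40%, which is why a "fast" connection with packet loss feels slow.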
Why Bandwidth is Still Important
Bandwidth is important to latency because each link to a
destination may be either a two lane road or a superhighway. The wider
roads can move more cars in a given period of time. Having a large road
feed a small one can cause traffic congestion: cars wait longer to get onto
the smaller road, or a car gets sent to car heaven. This is a very common
occurrence, so low bandwidth links can increase latency and reduce
throughput (the number of cars per minute). In fact,
almost all ISPs greatly oversubscribe their bandwidth out to the rest of the
world. Think of a fanlike road system. Each rib of the fan can send data
to the junction point which then sends it through the one link to the rest
of the world. As long as people do not need to go long distance very often
and you do not have a lot of people on the edge of the fan, then you are
unlikely to have congestion at the exit road. But, if they need to go there
often or you have a lot more people (higher oversubscription rate) or both,
then you will end up with lots of congestion and/or high packet loss.
Latency is the bane of computing. The majority of a
computer's time is spent waiting for information to become available. You
may have a very high bandwidth inbound connection from the Internet. Yet
your web browsing might only use a fraction of that bandwidth. Latency is
the culprit. To build a web page, you are probably making a lot of requests
for small amounts of information from a web server that is not nearby. Each
one of those requests has to go the distance. That is why it is relatively
unusual to see someone browsing the web use more than half a megabit per
second's worth of bandwidth. You simply cannot fill a wide road with cars
if you only ask that a few be sent at a time and you have to send a
messenger car to do it each time. You can also think of bandwidth as
another latency factor. With a given amount of bandwidth, it is only
possible to deliver a certain amount of information in a certain period of
time. You can go no faster than that. The key computing rule here is that
the slowest link in the chain always determines how fast you can do
something.
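That rule is easy to state in code. A trivial Python sketch (the link speeds are invented):

```python
def end_to_end_throughput(link_speeds_mbps):
    """A chain of links moves data no faster than its slowest link."""
    return min(link_speeds_mbps)

# A fast LAN feeding a slow DSL uplink feeding a fast backbone:
print(end_to_end_throughput([100, 6, 1000]))   # -> 6 (Mbit/s)
```

No matter how wide the other roads are, the 6 Mbit/s link sets the pace.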
Latency Inside Your Computer
Latency occurs within your computer as well. Processors
are VERY fast. They operate at gigahertz speeds (billions of cycles per
second). They may be working on a number of things in parallel as they
march data through (higher bandwidth). Standard memory cannot keep up with
them. So, engineers use small amounts of very fast memory to
buffer what is needed next for the processor. This cache (literally, a
hiding place) of memory is designed to keep up with the processor's needs. But you
cannot build an entire PC with that high speed memory as it would be
unbelievably expensive. In fact, because the speeds of processors are so
great, engineers have had to rely on multiple stages of cache memory. As we
move outwards from the processor to main memory, bandwidth decreases and
latency increases. If the processor needs something in main memory that it
does not have in its local cache, it needs to copy it to that cache from
main memory. Oh, no, it is worse! It is not in memory either. It is on
your hard disk. Hard disks also have cache, and some new drives have
multiple stages of cache, like the Seagate Momentus XT. As long as the data
you want is in the disk cache, you can start the transfer to system memory
almost immediately and from there it can be transferred to processor cache
memory. If it is not in the hard disk cache, then you have to find it on
the disk, which takes many milliseconds before you can even start reading it
off. And, if the file is fragmented, you might have to seek out all the
pieces, incurring multiple waits. A millisecond does not sound like much at
1/1,000th of a second. But, compared to system memory, it is an eternity:
system memory can respond in nanoseconds, 1/1,000,000,000th of a second! Wait, are
you working on a file on the server? You may have to add the time for the
server to respond and transfer data across the wire (this can actually be
faster than local storage, but normally is not). Are you looking for
something on the Internet? It may take many seconds to get that
information.
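The differences in scale are easier to feel when rescaled to human terms. A Python sketch with rough, order-of-magnitude numbers (the exact figures vary widely by hardware):

```python
# Rough order-of-magnitude latencies in nanoseconds (illustrative
# figures; real values depend heavily on the hardware):
latencies_ns = {
    "L1 cache":             1,
    "main memory":        100,
    "disk seek":   10_000_000,   # ~10 milliseconds
    "Internet":   100_000_000,   # ~100 ms round trip
}

# Rescale so that an L1 cache hit takes one second; everything else
# becomes a startlingly long wait.
for name, ns in latencies_ns.items():
    scaled = ns / latencies_ns["L1 cache"]
    print(f"{name:12s} -> {scaled:>13,.0f} seconds on the rescaled clock")
```

On that scale, if a cache hit takes one second, a disk seek takes roughly four months and an Internet round trip over three years.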
Fighting Latency With Caching
So, you can see why cache is very important. You want to
keep as much information that might be needed as close to the processor as
possible so it has something to work with. Otherwise, it just twiddles its
thumbs waiting for something to do. Our information storage devices and
connections just cannot keep up. That is why improvements in information
storage and connections can mean a great deal. Processor cache, system
memory, and disk cache are much bigger and faster than they used to be. You
especially do not want to run low on system memory because your computer
will have to use your hard disk to simulate system memory and your computing
speed will plummet.
Nowhere is caching more important than when you are
working on the Internet. Think of it as the ability to queue up cars so you
can work on them at a body shop, even if the roads are bad. Your browser
has a cache. So does Java. So do all your media players, though there it
is called a buffer. In fact, one of the most important numbers involving
Internet data is the timeout value. The timeout value answers the question:
how long can you hold this value in your cache and still trust that it will
be current? Network engineers do a lot to lower your latency. Anything you
have cached on your system is best. You do not have to go over a wire to
ask someone else for it. Each hop through a network device or to a server
adds in some latency. Fewer hops mean better responsiveness. So, having it
on your machine is best. Having it on your local high bandwidth, low latency
LAN is still quite good, so caching information there is also a good
idea. If you cannot find it there, having it somewhere close on the
Internet is the next best thing. That is why content providers commonly
spread their information all over the Internet so that when you want it, it
will be close to you. Think of it as your local McDonald's that has the
same type of food (information) rather than having to drive to another
restaurant that is not a chain hundreds of miles away. Routers (the traffic
cops we talked about) are given instructions to route you via the best path
they can find. Load balancers divide an incoming load among servers so the
least burdened server can handle your request, something like the shortest
lane at the toll booth. Anything that can be done on your end or out on the
Internet to lower latency will make your experience a better one. No one
wants to have their shopping cart time out and have to start over or have
web pages take minutes to display.
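Caching is easy to demonstrate. Here is a Python sketch using a memoizing cache around a pretend network fetch (the URL and the 100 ms delay are stand-ins, not a real request):

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)
def fetch(url):
    """Stand-in for a slow request over the wire (hypothetical URL)."""
    time.sleep(0.1)            # pretend this is 100 ms of network latency
    return f"contents of {url}"

start = time.perf_counter()
fetch("http://www.example.com/page")   # first time: over the wire
first = time.perf_counter() - start

start = time.perf_counter()
fetch("http://www.example.com/page")   # second time: from the cache
second = time.perf_counter() - start

print(f"uncached: {first:.3f} s, cached: {second:.6f} s")
```

The second request never touches the "network" at all, which is exactly what your browser cache, Java cache, and media player buffers are doing for you.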
Congestion Adds to Latency
Some people think they are OK if they have low bandwidth
utilization, or the amount of bandwidth they use relative to the bandwidth
available. If, on average, you only use one lane and you have ten, that is
a 10% bandwidth utilization. Unfortunately, bandwidth utilization tells you
less than you might think. You think you have an open road. But, if you
measured the utilization over a 24 hour period and you are going home during
rush hour, that bandwidth may not seem like enough, and your drive home
might be a long one (high latency due to congestion). Data is like that,
too. You ask for something, and it is delivered right now, as fast as the
systems can deliver it. If you had infinite bandwidth, you might not see
any congestion. On high bandwidth LAN links, congestion is not normally an
issue and latency stays low, because you are less likely to be taking up all
of the bandwidth at any moment in time, though it still does happen. Slower links
out to the Internet do have congestion issues and congestion raises
latency. There are two ways to "solve" congestion issues. The easy way is
to build bigger or more roads (Internet links) and get traffic cops
(routers) that can handle more cars (data). The more elegant, but harder to
implement way, is to prioritize your data. In effect, you allow some cars
only on HOV lanes or you allow emergency vehicles to go first rather than
having to queue up.
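How sharply congestion raises latency can be sketched with the classic single-server queueing formula (an idealized textbook model, not a description of any particular router):

```python
def avg_time_in_system(arrival_rate, service_rate):
    """Average delay through a single queue (M/M/1 model): it
    explodes as utilization approaches 100%."""
    assert arrival_rate < service_rate, "the queue grows without bound"
    return 1.0 / (service_rate - arrival_rate)

# A traffic cop (router) that can handle 1000 cars (packets) per second:
for load in (100, 500, 900, 990):      # arrivals per second
    delay_ms = avg_time_in_system(load, 1000) * 1000
    print(f"{load / 10:.0f}% utilized -> {delay_ms:6.1f} ms average delay")
```

Going from 50% to 99% utilization multiplies the average delay fifty-fold in this model, which is why "average utilization looks fine" and "rush hour is miserable" can both be true.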
(1) Stop Telling Your Computer to do Too Much!
So, bandwidth is important, and latency is even more
important. We are done, right? Wrong. More important than either of these
factors is how you, your programs, your operating system, and the systems
you are accessing behave. Let us look at something that needs to be done in
a series of steps, like booting up your computer. Adding a solid state
drive which cuts data access times to the microsecond range should improve
boot time drastically, right? It does, but not as much as you might have
hoped for. The problem is that you are following a sequence. One step must
complete before the other begins. In addition, the code was written to give
enough time for the various pieces of code to load. This timing may be
based on a conservative estimate of how long it should take to get that
information off of a rotating disk, or it might simply be the same no matter
how fast your storage is. Programs and operating systems usually have
inefficient pieces of code. In addition to that, modern operating systems
and programs are written in a modular fashion. They use shared programs and
routines so that new programs do not have to be completely rewritten from
scratch. They may need to use a small piece of code out of a code library.
Unfortunately, they must load the entire library first and find that piece
of code before running it.
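The boot-time effect can be sketched in a couple of lines of Python (the numbers are invented; the point is the shape of the result):

```python
def boot_time(disk_seconds, fixed_seconds):
    """Boot = time spent reading from disk + delays that a faster
    drive cannot shrink (sequencing, built-in waits, initialization)."""
    return disk_seconds + fixed_seconds

# Invented numbers: a drive 10x faster only cuts boot time by 2.5x,
# because the fixed delays stay put.
hdd = boot_time(disk_seconds=40, fixed_seconds=20)    # 60 s total
ssd = boot_time(disk_seconds=4,  fixed_seconds=20)    # 24 s total
print(f"disk is 10x faster, but boot is only {hdd / ssd:.1f}x faster")
```

The sequential steps and built-in waits put a floor under the total, no matter how fast the storage gets.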
As part of the boot process, your PC needs to start the
various programs and services you want it to run. If you have a lot of
them, your boot time can be several minutes. Each program that you have
running all the time also chews up system memory that might be used for
doing work. So be very sparing with what you load at system startup time
and keep running at all times. Otherwise, your productivity will suffer.
Defragmentation Helps!
Remember how I mentioned how slow disk access is, and
that it is even worse if your files are in pieces? The more cluttered your
hard drive, the harder it is to find things. Remove what you do not need.
Use disk defragmentation to make your files contiguous, minimizing the
number of times the drive has to seek for a piece of a file. Fewer and more
contiguous files mean faster access.
Data Lost in Transmission Really Hurts!
When a program or your operating system asks for data
from storage, it has to ensure that what it asked for is completely
accurate. Data delivered from storage or over the Internet has extra bits
attached to it that confirm that it has not been corrupted in transmission
(think dinged car). Each piece of data you receive is only part of the
whole. If one piece got corrupted or did not appear, your program or
operating system has to realize it, and then ask for it again. Lossy
communication systems (packet loss on local networks or the Internet is not
uncommon) will bring performance to a standstill and often cause a program
to visibly slow or fail. Some programs do not deal with packet loss
well at all. For example, have you ever tried to go to a web site and not
been able to get there, but when you clicked again it worked? That was
probably because your browser needed to find out where the information you
wanted was on the Internet. It used a program called a DNS client to ask
what IP number belonged to www.iwantthis.com, but that number did not come
back in time or at all, so the browser timed out and did not show anything.
The next time you asked, it worked.
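That "click again and it works" behavior amounts to a retry. A Python sketch with a deliberately flaky stand-in resolver (the function names are illustrative, the hostname is from the example above, and the IP is from the documentation address range):

```python
def lookup_with_retry(resolve, name, attempts=2):
    """Retry a lookup the way a second click often succeeds after a
    timed-out first one. `resolve` is any function that returns an
    address or raises TimeoutError."""
    for i in range(attempts):
        try:
            return resolve(name)
        except TimeoutError:
            if i == attempts - 1:
                raise               # out of retries: give up for real

# A fake resolver that times out on the first call, for illustration:
calls = {"n": 0}
def flaky_resolve(name):
    calls["n"] += 1
    if calls["n"] == 1:
        raise TimeoutError("DNS reply did not arrive in time")
    return "203.0.113.7"            # an IP from the documentation range

print(lookup_with_retry(flaky_resolve, "www.iwantthis.com"))  # -> 203.0.113.7
```

Your second click does by hand what this loop does automatically.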
Program It Right!
Poor programming or use practices bring to mind one of my
favorite phrases, "There is no amount of networking or hardware improvements
that will fix bad programming." Put another way, if you tell your computer
to do something silly, it will do it.
Conclusion
So, when you go to troubleshoot your computer for slow
speed, look for the most likely causes: (1) you, your programs, your
operating system, and the systems you are accessing; (2) latency; (3)
bandwidth.
©2010 Tony Stirk, Iron Horse
tstirk@ih-online.com