Binary Numbers and Computer Speed

Posted by PITHOCRATES - April 2nd, 2014

Technology 101

Computers are Good at Arithmetic thanks to Binary Numbers

Let’s do a fun little experiment.  Get a piece of paper and a pen or a pencil.  And then using long division divide 4,851 by 34.  Time yourself.  See how long it takes to complete this.  If the result is not a whole number take it to at least three places past the decimal point.  Okay?  Ready……..start.

Chances are the older you are the faster you did this.  Because once upon a time you had to do long division in school.  In that ancient era before calculators.  Younger people may have struggled with this.  Because the result is not a whole number.  Few probably could do this in their head.  Most probably had a lot of scribbling on that piece of paper before they could get 3 places past the decimal point.  The answer to three places past the decimal point, by the way, is 142.676.  Did you get it right?  And, if so, how long did it take?

Probably tens of seconds.  Or minutes.  A computer, on the other hand, could crunch that out faster than you could punch the buttons on a calculator.  Because one thing computers are good at is arithmetic.  Thanks to binary numbers.  The language of all computers.  1s and 0s to most of us.  But two different states to a computer.  That make information the computer can understand and process.  Fast.

A Computer can look at Long Streams of 1s and 0s and make Perfect Sense out of Them

The numbers we use in everyday life are from the decimal numeral system.  Or base ten.  For example, the number ‘4851’ contains four digits.  Where each digit can be one of 10 values (0, 1, 2, 3…9).  And then the ‘base’ part comes in.  We say base ten because each digit is multiplied by 10 to the power of n.  Where n=0, 1, 2, 3….  So 4851 is the sum of (4 X 10^3) + (8 X 10^2) + (5 X 10^1) + (1 X 10^0).  Or (4 X 1000) + (8 X 100) + (5 X 10) + (1 X 1).  Or 4000 + 800 + 50 + 1.  Which adds up to 4851.
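The expansion above is easy to verify with a few lines of code (Python here, purely for illustration; the helper name is ours):

```python
# Expand a decimal numeral into its place values, then sum them back up.
def decimal_expansion(digits):
    # Reverse so position n corresponds to 10 to the power of n.
    terms = [d * 10 ** n for n, d in enumerate(reversed(digits))]
    return terms, sum(terms)

terms, total = decimal_expansion([4, 8, 5, 1])
print(terms)   # [1, 50, 800, 4000] -- place values, least significant first
print(total)   # 4851
```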

But the decimal numeral system isn’t the only numeral system.  You can do this with any base number.  Such as 16.  What we call hexadecimal.  Which uses 16 distinct values (0, 1, 2, 3…9, A, B, C, D, E, and F).  So 4851 is the sum of (1 X 16^3) + (2 X 16^2) + (15 X 16^1) + (3 X 16^0).  Or (1 X 4096) + (2 X 256) + (15 X 16) + (3 X 1).  Or 4096 + 512 + 240 + 3.  Which adds up to 4851.  Or 12F3 in hexadecimal.  Where F=15.  So ‘4851’ requires four positions in decimal.  And four positions in hexadecimal.  Interesting.  But not very useful.  As 12F3 isn’t a number we can do much with in long division.  Or even on a calculator.
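A quick sketch checking the hexadecimal arithmetic, using Python’s built-in base conversions:

```python
# Build 4851 from its hexadecimal digits times powers of 16.
value = (1 * 16**3) + (2 * 16**2) + (15 * 16**1) + (3 * 16**0)
print(value)            # 4851
print(hex(4851))        # 0x12f3 -- the built-in conversion agrees
print(int("12F3", 16))  # 4851, converting back the other way
```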

Let’s do this one more time.  And use 2 for the base.  What we call binary.  Which uses 2 distinct values (0 and 1).  So 4851 is the sum of (1 X 2^12) + (0 X 2^11) + (0 X 2^10) + (1 X 2^9) + (0 X 2^8) + (1 X 2^7) + (1 X 2^6) + (1 X 2^5) + (1 X 2^4) + (0 X 2^3) + (0 X 2^2) + (1 X 2^1) + (1 X 2^0).  Or (1 X 4096) + (0 X 2048) + (0 X 1024) + (1 X 512) + (0 X 256) + (1 X 128) + (1 X 64) + (1 X 32) + (1 X 16) + (0 X 8) + (0 X 4) + (1 X 2) + (1 X 1).  Or 4096 + 0 + 0 + 512 + 0 + 128 + 64 + 32 + 16 + 0 + 0 + 2 + 1.  Which adds up to 4851.  Or 1001011110011 in binary.  Which is gibberish to most humans.  And a little too cumbersome for long division.  Unless you’re a computer.  They love binary numbers.  And can look at long streams of these 1s and 0s and make perfect sense out of them.
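The same check for the binary expansion, again just an illustration in Python:

```python
# Sum the powers of two wherever a 1 appears in the bit string.
bits = "1001011110011"
value = sum(int(b) * 2**n for n, b in enumerate(reversed(bits)))
print(value)       # 4851
print(bin(4851))   # 0b1001011110011
```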

A Computer can divide two Numbers in a few One-Billionths of a Second

A computer doesn’t see 1s and 0s.  It sees two different states.  A high voltage and a low voltage.  An open switch and a closed switch.  An on and an off.  Because of this, machines that use binary numbers can be extremely simple.  Computers process bits of information.  Where each bit can be only one of two things (1 or 0, high or low, open or closed, on or off, etc.).  Greatly simplifying the electronic hardware that holds these bits.  If computers processed decimal numbers, however, just imagine the complexity that would require.

To work with decimal numbers a computer would need, say, 10 different voltage levels.  Requiring the ability to produce 10 discrete voltage levels.  And the ability to detect 10 different voltage levels.  Greatly increasing the circuitry for each digit.  Requiring far more power consumption.  And producing far more damaging heat that requires more cooling capacity.  As well as adding more circuitry that can break down.  So keeping computers simple makes them less costly and more reliable.  And if each bit requires less circuitry you can add a lot more bits when using binary numbers than you can when using decimal numbers.  Allowing bigger and more powerful number-crunching ability.

Computers load and process data in bytes.  Where a byte has 8 bits.  Which makes hexadecimal so useful.  If you have 2 bytes of data you can break it down into 4 groups of 4 bits.  Or nibbles.  Each nibble is a 4-bit binary number that can be easily converted into a single hexadecimal number.  In our example the binary number 0001 0010 1111 0011 easily converts to 12F3 where the first nibble (0001) converts to hexadecimal 1.  The second nibble (0010) converts to hexadecimal 2.  The third nibble (1111) converts to hexadecimal F.  And the fourth nibble (0011) converts to hexadecimal 3.  Making the man-machine interface a lot simpler.  And making our number crunching easier.
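The nibble-by-nibble conversion can be sketched like this (the helper name to_nibbles is ours, not a standard function):

```python
# Split a 16-bit value into four nibbles and render each as one hex digit.
def to_nibbles(value, width=16):
    bits = format(value, f"0{width}b")                  # zero-padded binary string
    return [bits[i:i + 4] for i in range(0, width, 4)]  # groups of 4 bits

nibbles = to_nibbles(4851)
print(nibbles)   # ['0001', '0010', '1111', '0011']
print("".join(format(int(n, 2), "X") for n in nibbles))  # 12F3
```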

The simplest binary arithmetic operation is addition.  And it happens virtually instantaneously at the bit level.  We call the electronics that make this happen logic gates.  A typical logic gate has two inputs.  Each input can be one of two states (high voltage or low voltage, etc.).  Each possible combination of inputs produces a unique output (high voltage or low voltage, etc.).  If you change one of the inputs the output can change.  Computers have vast arrays of these logic gates that can process many bytes of data at a time.  All you need is a ‘pulsing’ clock to sequentially apply these inputs.  With the outputs providing an input for the next logical operation on the next pulse of the clock.
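A rough software sketch of those gate arrays: a one-bit full adder built from AND, OR, and XOR, chained into a ripple-carry adder.  Hardware does this in parallel silicon, of course; this is only a model, and the function names are ours:

```python
# A one-bit full adder built from the basic gates (AND &, OR |, XOR ^).
def full_adder(a, b, carry_in):
    partial = a ^ b                              # XOR: partial sum
    total = partial ^ carry_in                   # XOR again: final sum bit
    carry_out = (a & b) | (partial & carry_in)   # carry generated either way
    return total, carry_out

# Chain 13 of them to add two 13-bit numbers, least significant bit first.
def ripple_add(x, y, width=13):
    carry, result = 0, 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(ripple_add(4851, 100))  # 4951
```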

The faster the clock speed the faster the computer can crunch numbers.  We once measured clock speeds in megahertz (1 megahertz is one million pulses per second).  Now the faster CPUs are in gigahertz (1 gigahertz is 1 billion pulses per second).  Because of this incredible speed a computer can divide two numbers to many places past the decimal point in a few one-billionths of a second.  And be correct.  While it takes us tens of seconds.  Or even minutes.  And our answer could very well be wrong.
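You can get a feel for this speed by timing the division from the opening exercise.  An interpreted language adds large overhead, so the figure below is far slower than the raw hardware divide, but it is still a tiny fraction of a second:

```python
import timeit

# Time many repetitions of the division from the opening exercise.
n = 1_000_000
seconds = timeit.timeit("4851 / 34", number=n)
print(f"{seconds / n * 1e9:.0f} ns per division (interpreter overhead included)")
print(4851 / 34)  # 142.6764705882353
```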



Morse Code, Telegraph, Binary System, Bit, Byte, Bitstream, Dialup Modems, Broadband, Cable Modem and Coaxial Cable

Posted by PITHOCRATES - August 8th, 2012

Technology 101

One of the First Improvements in Communication Speed was Morse Code sent on a Telegraph

The Battle of New Orleans (1815) was a great American victory over the British.  General Andrew Jackson with a force of about 4,000 repulsed a British force of some 11,000.  It was a huge American win.  The biggest in the war.  And a humiliating British defeat.  Now here’s an interesting side note about that battle.  The war was already over.  We had already signed a peace treaty with the British.  And were already repairing that special relationship between the United States and Britain.  So why was there even a Battle of New Orleans?  Because there was no Internet, television, radio or telegraph back then.  There was only ink and paper.  And foot, horse and boat.  Making communications slow.  Very, very slow.

The American Civil War, like the Crimean War, was a war where the technology was ahead of the tactics.  Four years of fighting with modern weapons using Napoleonic tactics killed over half a million Americans by 1865.  After General Grant flushed General Lee from the Petersburg defenses he chased him as Lee fled west.  With General Sheridan’s cavalry in hot pursuit.  Cutting in front of Lee’s army to bring on the Battle of Sayler’s Creek.  Where the Confederates suffered a crippling defeat.  General Sheridan telegraphed General Grant, “If the thing is pressed, I think that Lee will surrender.”  President Lincoln was monitoring the military wires in Washington.  When he read Sheridan’s message he quickly sent a wire to General Grant.  “Let the thing be pressed.”  Grant pressed the thing.  And Lee surrendered at Appomattox Court House.

In 50 years’ time communications went from taking weeks.  To taking as little as minutes.  The benefit of faster communications?  At the Battle of New Orleans approximately 2,792 people were killed, wounded or went missing.  In a battle fought after the war was over.  Only word hadn’t gotten to them yet.  So fast communications are a good thing.  And can prevent bad things from happening.  And one of the first improvements in communication speed was Morse code sent on a telegraph.  A wire between two places.  With a key switch and an electromechanical device at each end.  When an operator tapped the switch closed, an electrical current traveled down the wire to the electromechanical device at the other end, energizing it so that it clicked in step with the operator’s keying.  Thus they could send a series of ‘dots and dashes’ through this wire.  The operator encoded the message at one end by assigning a series of dots and/or dashes to each letter.  The operator at the other end then decoded these dots and dashes back into the original message.

Getting Outside Information into your Computer was a little like Getting Information over a Telegraph

Morse code is a binary system.  Just like the ‘bits’ in a computer system.  Where each bit is one of two voltage levels.  Represented by 1s and 0s.  Eight bits make a byte.  Like the telegraph operator, a man-machine interface encodes information into a series of bits.  The computer bus, registers and microprocessor ‘grab’ bytes of this bitstream at a time.  And then process these bits in parallel blocks of bytes.  Unlike the telegraph, where the encoded message went serially down the wire.  The telegraph greatly increased the speed of communications.  But a telegraph operator could only encode and send one letter of a word at a time.  So he couldn’t send many letters (or pulses) per second.  Just a few.  But when you encode this information into 8-bit chunks you can greatly increase the speed data moves inside a computer.  As computer speeds grew so did their bus size.  From 8 bit to 16 bit (2 bytes).  From 16 bit to 32 bit (4 bytes).  From 32 bit to 64 bit (8 bytes).  As computers processed more bytes of data at a time in parallel they could complete tasks faster.
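A small illustration of ‘grabbing’ the same byte stream in wider and wider chunks, the way buses grew from 8 to 64 bits:

```python
# Consume the same byte stream in 1-, 2-, 4- and 8-byte chunks.
stream = bytes(range(8))  # 8 bytes of sample data

for width in (1, 2, 4, 8):  # bus width in bytes
    chunks = [int.from_bytes(stream[i:i + width], "big")
              for i in range(0, len(stream), width)]
    # A wider bus needs fewer grabs to move the same data.
    print(f"{width * 8:2d}-bit bus grabs {len(chunks)} chunk(s): {chunks}")
```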

Of course, the people who were most interested in faster computers were gamers.  Who played games with a lot of video and sound information encoded in them.  The faster the computer could process this information the better the graphics and sound were.  Today computers are pretty darn fast.  They can run some of the most demanding programs from 3-D gaming to computer-aided design (CAD).  But then a new technology came out that made people interested in what was happening outside of their computer.  And how fast their computer was didn’t matter as much anymore.  Because getting that outside information into your computer was a little like getting information over a telegraph.  It came in serially.  Over a wire.  Through a modem that attached a computer to the Internet.  And the World Wide Web.  Where there was a whole lot of interesting stuff.  But to see it and hear it you had to get it inside your computer first.  And the weak link in all your web surfing was the speed of your modem.

A modem is a modulator-demodulator.  Hence the name.  And it worked similarly to the telegraph.  There was a wire between two locations.  Typically a telephone line.  At each end of this wire was a modem.  The wire terminated into each modem.  Each modem was connected to a computer.  One computer would feed a bitstream to its modem.  The modem would encode the 1s and 0s in that bitstream.  And modulate it onto a carrier frequency.  The modem would output this onto the telephone line.  Where it traveled to the other modem.  The other modem then demodulated the carrier frequency.  Decoded the 1s and 0s and recreated the bitstream.  And fed it into the other computer.  Where the computer grabbed bytes of the bitstream and processed them.
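A toy model of that modulate/demodulate round trip, mapping each bit to one of two tones.  The two frequencies are the ones the old Bell 103 standard used in one direction; real modems are vastly more sophisticated, so treat this as the idea only:

```python
# A toy frequency-shift-keying modem: a 1 becomes one tone, a 0 another.
FREQ_ZERO, FREQ_ONE = 1070, 1270  # Hz -- Bell 103 originate-side tones

def modulate(bits):
    # Encode each bit as a tone to send down the telephone line.
    return [FREQ_ONE if b == "1" else FREQ_ZERO for b in bits]

def demodulate(tones):
    # Recover the bitstream from the tones at the far end.
    return "".join("1" if f == FREQ_ONE else "0" for f in tones)

sent = "1001011110011"
line = modulate(sent)        # what travels over the phone wire
received = demodulate(line)  # recovered at the far end
print(received == sent)      # True
```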

The Coaxial Cable of Broadband could Carry a wider Range of Frequencies than the Twisted Pairs of Telephone Wire

The speed at which all of this happened depended on your modem.  Specifically, your modem.  The other modem you connected to was typically on a web server and was of the highest speed.  And on all of the time.  Unlike the early dialup modems we used in the Nineties when we first started surfing the web.  Back then surfing could be expensive as you often paid for that time as if you were on the telephone.  This was the other weak link in surfing.  Trying to keep that telephone call local.  Because that was what you paid for.  The use of the telephone line.  Once you got onto the Internet you could travel anywhere at no additional cost.  So you dialed in to an available local number.  Which sometimes could take awhile.  And when you finally did dial up on a local line but went inactive for a period of time it disconnected you.  Because others were looking for an available local phone line, too.

The first modem speeds many of us used at the beginning were 2400 bits per second (bps).  Which was a lot faster than the few bits per second of a telegraph operator.  And okay for sending email.  But it was painfully slow for graphics and sound.  And then the improvements in speed came.  And they came quickly.  4800 bps.  9600 bps.  14400 (14.4k) bps.  28800 (28.8k) bps.  33600 (33.6k) bps.  And then the last of the dialup modems.  56000 (56k) bps.  Which meant you could download up to 56,000 bits per second of 1s and 0s.  That’s 56,000 pieces of information coming out of that modem each second.  Now that was fast.  Still slower than what happened inside the computer with those wide parallel buses.  That chomped off huge bytes of data.  And processed them at rates in excess of a billion times a second.  But it was still the fastest thing on the block.  Until broadband arrived.

Today you can buy a broadband cable modem for less than $100 that can download at speeds in excess of 100,000,000 bits per second.  That’s over 100 million pieces of information each second.  It is only data rates like this that let you live stream a movie off the Internet.  Something that the 56k modem just wouldn’t do for you.  And it’s always on.  Costing you a flat fee no matter how long you spend surfing the web.  You turn on your computer and you are connected to the Internet.  What allowed those greater speeds?  The wire.  The coaxial cable of broadband could carry a wider range of frequencies than the twisted pairs of the telephone wire.  Providing a greater bandwidth.  Which could carry more encoded information between modems.  Allowing you to download music and videos quicker than it took a telegraph operator to send a message.
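A back-of-the-envelope comparison of how long a 5-megabyte song would take to arrive at those speeds (ignoring protocol overhead, which in reality slows things further):

```python
# How long a 5-megabyte file takes at various modem speeds.
FILE_BITS = 5 * 1024 * 1024 * 8  # 5 MB expressed in bits

for label, bps in [("2400 bps", 2400), ("56k dialup", 56_000),
                   ("100 Mbps broadband", 100_000_000)]:
    print(f"{label:>18}: {FILE_BITS / bps:,.1f} seconds")
```

At 2400 bps that file is hours away; at broadband speeds it arrives in well under a second.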



Boolean Algebra, Logic Gates, Flip-Flop, Bit, Byte, Transistor, Integrated Circuit, Microprocessor and Computer Programming

Posted by PITHOCRATES - February 1st, 2012

Technology 101

A Binary System is one where a Bit of Information can only have One of Two States 

Parents can be very logical when it comes to their children.  Children always want dessert.  But they don’t always clean their rooms or do their homework.  So some parents make dessert conditional.  For the children to have their dessert they must clean their rooms AND do their homework.  Both things are required to get dessert.  Or you could say this in another way.  If the children either don’t clean their rooms OR don’t do their homework they will forfeit their dessert.  Stated in this way they only need to do one of two things (not clean their room OR not do their homework) to forfeit their dessert. 
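The dessert rule translates directly into Boolean logic.  Here it is as a sketch in Python (the function names are ours); note the two forms always disagree, which is De Morgan’s law at work:

```python
# Dessert requires clean_room AND homework_done.
def gets_dessert(clean_room, homework_done):
    return clean_room and homework_done

# Equivalently, (NOT clean_room) OR (NOT homework_done) forfeits it.
def forfeits_dessert(clean_room, homework_done):
    return (not clean_room) or (not homework_done)

# Check all four input combinations: the two rules are exact opposites.
for room in (True, False):
    for hw in (True, False):
        assert gets_dessert(room, hw) != forfeits_dessert(room, hw)
        print(room, hw, gets_dessert(room, hw))
```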

This was an introduction to logic.  George Boole created a mathematical way to express this logic. We call it Boolean algebra.  But relax.  There will be no algebraic equations here.

In the above example things had only one of two states.  Room cleaned.  Room not cleaned.  Homework done.  Homework not done.  This is a binary system.  Where a bit of information can only have one of two states.  We gave these states names.  We could have used anything.  But in our digital age we chose to represent these two states with either a ‘1’ or a ‘0’.  A piece of information is either a ‘1’.  And if it’s not a ‘1’ it has to be a ‘0’.  In the above example a clean room and complete homework would both be 1s.  And a dirty room and incomplete homework would be 0s.  Where ‘1’ means a condition is ‘true’.  And a ‘0’ means the condition is ‘false’.

Miniaturization allowed us to place more Transistors onto an Integrated Circuit

Logic gates are electrical/electronic devices that process these bits of information to make a decision.  The above was an example of two logic gates.  Can you guess what we call them?  One was an AND gate.  The other was an OR gate.  Because one needed both conditions (the first AND the second) to be true to trigger a true output.  Children get dessert.  The other needed only one condition (the first OR the second) to be true to trigger a true output.  Children forfeit dessert. 

We made early gates with electromechanical relays and vacuum tubes.  Claude Shannon used Boolean algebra to optimize telephone routing switches made of relays.  But these were big and required big spaces, needed lots of wiring, consumed a lot of power and generated a lot of heat.  Especially as we combined more and more of these logic gates together to be able to make more complex decisions.  Think of what happens when you press a button to call an elevator (an input).  Doors close (an action).  When doors are closed (an input) car moves (an action).  Car slows down when near floor.  Car stops on floor.  When car stops doors open.  Etc.  If you were ever in an elevator control room you could hear a symphony of clicks and clacks from the relays as they processed new inputs and issued action commands to safely move people up and down a building.  Some Boolean number crunching, though, could often eliminate a lot of redundant gates while still making the same decisions based on the same input conditions. 

The physical size constraints of putting more and more relays or vacuum tubes together limited these decision-making machines, though.  But new technology solved that problem.  By exchanging relays and vacuum tubes for transistors.  Made from small amounts of semiconductor material.  Such as silicon.  As in Silicon Valley.  These transistors are very small and consume far less power.  Which allowed us to build larger and more complex logic arrays.  Built with latching flip-flops.  Such as the J-K flip-flop.  Logic gates wired together to store a single bit of information.  A ‘1’ or a ‘0’.  Eight of these devices in a row can hold 8 bits of information.  Or a byte.  When a clock was added to these flip-flops they would check the inputs and change their outputs (if necessary) with each pulse of the clock.  Miniaturization allowed us to place more and more of these transistors onto an integrated circuit.  A computer chip.  Which could hold a lot of bytes of information. 
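A toy software model of a clocked J-K flip-flop storing a single bit.  Real flip-flops are wired from gates; this only mimics the behavior table (J=K=1 toggles, J=K=0 holds):

```python
# A toy clocked J-K flip-flop: on each clock pulse the stored bit Q
# updates from inputs J and K.
class JKFlipFlop:
    def __init__(self):
        self.q = 0             # the stored bit

    def clock(self, j, k):
        if j and k:
            self.q ^= 1        # J=K=1: toggle
        elif j:
            self.q = 1         # set
        elif k:
            self.q = 0         # reset
        return self.q          # J=K=0: hold

ff = JKFlipFlop()
print(ff.clock(1, 0))  # 1 (set)
print(ff.clock(0, 0))  # 1 (hold)
print(ff.clock(1, 1))  # 0 (toggle)
```

Eight of these objects side by side would hold one byte, just as the text describes.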

To Program Computers we used Assembly Language and High-Level Programming Languages like FORTRAN

The marriage of latching flip-flops and a clock gave birth to the microprocessor.  A sequential digital logic device.  Where the microprocessor checks inputs in sequence and based on the instructions stored in the computer’s memory (those registers built from flip-flops encoded with bytes of binary instructions) executes output actions.  Like the elevator.  The microprocessor notes the inputs.  It then looks in its memory to see what those inputs mean.  And then executes the instructions for that set of inputs.  The bigger the registers and the faster the clock speed the faster this sequence.

Putting information into these registers can be tedious.  Especially if you’re programming in machine language.  Entering a ‘1’ or a ‘0’ for each bit in a byte.  To help humans program these machines we developed assembly language.  Where we wrote lines of program using words we could better understand.  Then used an assembler to convert that programming into the machine language the machine could understand.  Because the machine only looks at bytes of data full of 1s and 0s and compares them to a stored program for instructions to generate an output.  To improve on this we developed high-level programming languages.  Such as FORTRAN.  FORTRAN, short for formula translation, made more sense to humans and was therefore more powerful for people.  A compiler would then translate this human-friendly code into the machine language the computer could understand.
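To make the idea concrete, here is a made-up three-instruction machine.  The opcodes are invented purely for illustration and do not correspond to any real processor:

```python
# A made-up three-instruction machine (illustrative only, not a real ISA).
# Each instruction is one opcode byte plus one operand byte.
LOAD, ADD, HALT = 0x01, 0x02, 0xFF

def run(program):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = program[pc], program[pc + 1]
        if op == LOAD:
            acc = arg                    # load a value into the accumulator
        elif op == ADD:
            acc += arg                   # add to the accumulator
        elif op == HALT:
            return acc                   # stop and return the result
        pc += 2

# 'Load 40, add 2' written as the raw machine bytes a programmer once toggled in:
machine_code = bytes([LOAD, 40, ADD, 2, HALT, 0])
print(run(machine_code))  # 42
```

A high-level language lets you write something like `40 + 2` instead; the compiler produces bytes like these for you.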

Computing has come a long way from those electromechanical relays and vacuum tubes.  Where once you had to be an engineer or a computer scientist to program and operate a computer.  Through the high-tech revolution of the Eighties and Silicon Valley.  Where chip making changed our world and created an economic boom the likes of which few have ever seen.  To today where anyone can use a laptop computer or a smartphone to surf the Internet.  And they don’t have to understand any of the technology that makes it work.  Which is why people curse when their device doesn’t do what they want it to do.  It doesn’t help.  But it’s all they can do.  Curse.  Unlike an engineer or computer scientist.  Who don’t curse.  Much.

